
Sr. Big Data Developer Resume


Roseland, New Jersey

SUMMARY:

  • 13 years of IT experience in Software Development for various industry domains that include Financial/Investment Banking, HR and Payroll, Health Insurance and Telecom.
  • Cloudera Certified Hadoop Developer (CCD-410) with proficiency in developing and implementing Hadoop ecosystem components to work with Big Data.
  • Certified Java Programmer & Web Component Developer with hands-on experience implementing object-oriented programming concepts and methodologies.
  • Hands-on experience in developing, implementing and configuring the Hadoop ecosystem: CDH 5, HDFS, Spark batch, Spark Streaming, MapReduce, YARN, AVRO, Hive, Pig, HBase, Oozie, Sqoop, Flume.
  • Expertise in designing and developing Spark batch and Spark Streaming applications in Java to process and analyze large volumes of data (a minimal batch sketch follows this list).
  • Experience in configuring Kafka brokers, topics, partitions and replication factor on multi-node clusters.
  • Designed and developed MapReduce programs, Oozie workflows/coordinator jobs, Hive queries, and Pig scripts to preprocess and analyze large data sets.
  • Experience using Sqoop to import and export data sets between RDBMS/mainframe systems and HDFS.
  • Expertise in Hadoop architecture and daemons such as the JobTracker, TaskTrackers, YARN ResourceManager, ApplicationMaster, and the HDFS NameNode and DataNodes.
  • Experience working with Hadoop in standalone, pseudo-distributed and fully distributed modes.
  • Extensive experience in designing and developing enterprise applications for JEE platform using Java 1.5, 1.6, J2EE, Portlets, Spring, Hibernate, Struts, JSF, EJB, JavaScript, JSON, Angular JS, XML, Web Services, JAXP, JAXB.
  • Excellent experience in ETL analysis, designing, developing, testing and implementing the ETL processes from Stellent Content Manager and IBM Content Manager.
  • Proficient in data modeling concepts and expertise in writing complex SQL queries and stored procedures for major relational databases like DB2, Oracle and MySQL.
  • Designed and developed web-based enterprise applications on multi-clustered environments like IBM WebSphere 7.x & WebLogic application servers and Tomcat web servers.
  • Contributed immensely to the design and development of MVC and ORM framework based applications using JSF, Spring MVC, Struts, Hibernate and iBatis.
  • Excellent work experience in SOA development using SOAP and RESTful Web Services.
  • Implemented J2EE applications on the UNIX platform and hands-on experience creating shell scripts on AIX/UNIX for cron job setup and execution.
  • Adopted J2EE best practices by implementing GoF and J2EE design patterns such as MVC, Service Locator, Business Delegate, Front Controller, Session Façade, DAO, DTO and Value Object.
  • Expertise in using multiple IDEs like WID 6.x, RAD 7.x and Eclipse, and proficient in configuration tools like Microsoft VSS, CVS, Rational ClearCase, SVN (Subversion) and Harvest.
  • Worked in software development methodologies like Agile/SCRUM and Waterfall.
  • A passionate software professional with the ability to work under pressure, manage multiple projects with changing priorities and tight deadlines, and work either independently or as part of a team.
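A minimal sketch of the kind of Spark batch job referenced above (Java API, Spark 1.6 era); the input path, pipe-delimited layout, and key column are illustrative assumptions, and Java 8 lambda syntax is used for brevity:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    // Minimal Spark batch sketch: counts records per key from a delimited HDFS file.
    public class BatchCountJob {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("BatchCountJob");
            JavaSparkContext sc = new JavaSparkContext(conf);

            JavaRDD<String> lines = sc.textFile("hdfs:///data/input/records.txt"); // assumed path
            JavaPairRDD<String, Long> countsByKey = lines
                    .map(line -> line.split("\\|"))                   // assumed pipe-delimited records
                    .mapToPair(fields -> new Tuple2<>(fields[0], 1L)) // key assumed in the first column
                    .reduceByKey((a, b) -> a + b);

            countsByKey.saveAsTextFile("hdfs:///data/output/counts"); // assumed output path
            sc.stop();
        }
    }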

TECHNICAL SKILLS:

Big Data Ecosystems: Hadoop, HDFS, Spark Batch, Spark Streaming, MapReduce, YARN, Hive, Pig, Sqoop, Oozie, HBase, Avro, Flume

Java Technologies & Framework: Java/J2SE 1.6, 1.7, J2EE, Servlets/JSP, Java Server Faces (JSF), Spring 2.x, 3.0, Hibernate, iBatis, Struts 1.x, 2.0, Log4j, JUnit, XStream 1.2

Web Technologies: JSR 168 Portlets, AJAX, JavaScript, HTML, XML, XSLT, jQuery, DOJO

Messaging Brokers: Kafka 1.0, IBM Message Broker v7, Message Queue v7

Web Services: SOAP/Big Web Services, JAX-WS, JAX-RPC, WSDL, UDDI, XML, XSD, XSL

Languages: Java/JSE 1.6, 1.7, SQL, PL/SQL

IDE Tools: Eclipse 3.1, RAD 7.4, WID 6.x, RSA 7.0

Databases: NoSQL (HBase), Oracle 11, IBM DB2 and MySQL

Design Methodologies: UML, SCRUM

Version Control Tools: CVS, Clear Case, Subversion (SVN), CCC/Harvest

Operating Systems: Windows XP/2000/NT/98, UNIX, AIX

O/R Mapping: Hibernate 3.0, JPA, iBatis

PROFESSIONAL EXPERIENCE:

Confidential, Roseland, New Jersey

Sr. Big Data Developer

Responsibilities:

  • Designed and implemented Confidential's next-generation real-time data processing framework using Hadoop ecosystem tools such as Spark Streaming, GoldenGate, Kafka, Java and JPA.
  • Designed and developed a Spark Streaming framework to stream millions of messages from GoldenGate through the Kafka servers to the target database.
  • Worked closely with the DevOps team to configure Kafka brokers, topics, partitions and replication factors for applications such as WFN, ETIME and VANTAGE.
  • Utilized Spark Streaming to extract and process data by applying transformations and actions on RDDs.
  • Developed Java-based Joblets responsible for handling the data transformation of the Kafka messages before loading them into the target database.
  • Responsible for creating and developing the source-to-target data source mappings, creating JPA entities and loading the data.
  • Designed the architecture of the configuration-based ETL processing from various data sources.
  • Developed an XML-based configuration framework to process the data using XML, JAXB, Java and Spark Streaming.
  • Designed and developed a monitoring tool and scheduled the respective jobs to validate the data streaming.
  • Optimized data processing performance and fine-tuned the application for accurate ETL processing, achieving 100% matching results between source and target.
  • Coordinated with the data team to integrate stored procedures with the Spark Streaming data processing.
  • Organized meetings with business owners to gather the requirements for the ETL data processing.
  • Provided technical support and consultation for Java application and infrastructure questions.
  • Created Excel-based test cases to unit test the data transformation and loading process.
  • Pulled streaming data from the Apache Kafka brokers into Spark Streaming consumers (a minimal consumer sketch follows this list).
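A minimal sketch of such a Spark 1.6 Streaming consumer reading a Kafka topic through the direct-stream API; the broker list, topic name, batch interval, and the final output step are illustrative assumptions (the real framework loaded the transformed messages into the target database via JPA):

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    import kafka.serializer.StringDecoder;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaPairInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka.KafkaUtils;

    // Minimal Spark Streaming consumer sketch for a Kafka topic (direct-stream API).
    public class KafkaStreamSketch {
        public static void main(String[] args) throws InterruptedException {
            SparkConf conf = new SparkConf().setAppName("KafkaStreamSketch");
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10)); // assumed batch interval

            Map<String, String> kafkaParams = new HashMap<>();
            kafkaParams.put("metadata.broker.list", "broker1:9092,broker2:9092"); // assumed brokers

            Set<String> topics = new HashSet<>(Arrays.asList("wfn-events")); // assumed topic

            JavaPairInputDStream<String, String> messages = KafkaUtils.createDirectStream(
                    jssc, String.class, String.class,
                    StringDecoder.class, StringDecoder.class,
                    kafkaParams, topics);

            // Basic transformation of each micro-batch; print() is a stand-in for the JPA load step.
            JavaDStream<String> payloads = messages
                    .map(record -> record._2())
                    .filter(payload -> !payload.isEmpty());
            payloads.print();

            jssc.start();
            jssc.awaitTermination();
        }
    }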

Environment: Spark 1.6.2, Spark Streaming, Kafka 1.0, Java/J2EE 1.7, JPA (EclipseLink), Oracle GoldenGate 11, Oracle 11, Kafka Tool, Scala 2.11.x, JUnit, Agile, Confluence, GitHub, Gradle, Rally

Confidential - New York City, NY

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data analytic solutions using Hadoop.
  • Created Hive scripts to store data and performed analysis on data using Hive queries.
  • Created Hive Internal and External tables to load data from DB2 database.
  • Implemented Partitions and Buckets in Hive for efficient data access.
  • Developed MapReduce jobs to process the master Avro data set as per the BI requirement.
  • Implemented Avro serialization to process the raw data for generating the master Avro file.
  • Wrote Pig Latin scripts to filter and transform the raw data for data analytics.
  • Implemented side-data distribution using the distributed cache mechanism in MapReduce jobs.
  • Designed and developed custom filters, Hadoop writable types and counters as part of the MapReduce jobs.
  • Designed and implemented Oozie workflow and coordinator jobs to automate different data analytics.
  • Wrote shell scripts to execute file ingestion and Oozie coordinator jobs.
  • Responsible for Sqoop import of historical data from IBM AS400 DB2 database to HDFS.
  • Exported the analyzed HDFS data to SQL Server Database using Sqoop for business reports.
  • Created MapReduce custom partitioners and in-mapper combiners for performance tuning (a minimal partitioner sketch follows this list).
  • Developed MapReduce jobs for analyzing application logs, IVR logs and historical data.
  • Performed defect analysis and fixes in all environments, including production.
  • Trained and guided the UPS (Database) team on Hadoop framework, HDFS and MapReduce concepts.
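A minimal sketch of the custom-partitioner and distributed-cache pieces mentioned above; the class names, file paths, log layout, and the region-prefix routing rule are illustrative assumptions rather than the actual job:

    import java.io.IOException;
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Partitioner;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Minimal MapReduce sketch: counts per "region|status" key, with a custom partitioner
    // keeping each region on one reducer and a side-data file shipped via the distributed cache.
    public class LogAnalysisDriver {

        public static class LogMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            @Override
            protected void map(LongWritable offset, Text line, Context context)
                    throws IOException, InterruptedException {
                String[] fields = line.toString().split("\\|"); // assumed layout: region|callId|status
                context.write(new Text(fields[0] + "|" + fields[2]), ONE);
            }
        }

        public static class LogReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> counts, Context context)
                    throws IOException, InterruptedException {
                int total = 0;
                for (IntWritable c : counts) {
                    total += c.get();
                }
                context.write(key, new IntWritable(total));
            }
        }

        // Routes all keys with the same region prefix to the same reduce partition.
        public static class RegionPartitioner extends Partitioner<Text, IntWritable> {
            @Override
            public int getPartition(Text key, IntWritable value, int numPartitions) {
                String region = key.toString().split("\\|")[0];
                return (region.hashCode() & Integer.MAX_VALUE) % numPartitions;
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "log-analysis");
            job.setJarByClass(LogAnalysisDriver.class);
            job.addCacheFile(new URI("hdfs:///reference/ivr_codes.txt")); // assumed side-data lookup file
            job.setMapperClass(LogMapper.class);
            job.setReducerClass(LogReducer.class);
            job.setPartitionerClass(RegionPartitioner.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path("/data/ivr/logs"));      // assumed input
            FileOutputFormat.setOutputPath(job, new Path("/data/ivr/output"));  // assumed output
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }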

Environment: Java, Hadoop CDH4, HDFS, MapReduce, Avro, Oozie, Sqoop, Hive, IBM AS400 DB2, SQL Server, UNIX, Shell scripting

Confidential, Parsippany, NJ

Hadoop Developer

Responsibilities:

  • Used Flume to collect data sets from various sources like Twitter, Wikipedia, etc.
  • Worked with Big Data Analysts and architects in troubleshooting issues with Sqoop, Hive, Pig, and Flume.
  • Involved in importing data to HDFS from DB2 and Oracle database using Apache Sqoop.
  • Wrote Pig scripts to filter the raw data in preparation for analysis.
  • Developed MapReduce jobs to preprocess and analyze the data sets.
  • Designed proof of concept on Hadoop MapReduce, Hive and Pig; demonstrated the same to the Confidential team.
  • Worked on the proof-of-concept for Apache Hadoop framework initiation on Amazon Web Services (AWS).
  • Created Hive internal and external tables to load data from DB2 database.
  • Created and executed Hive queries for data analytics per the business requirements (a minimal JDBC sketch follows this list).
  • Used Oozie job scheduler to automate the job flows.
  • Performed defect analysis and fixes in all environments (Dev and Prod).
  • Served as the primary technical liaison with the vendors and facilitated support and troubleshooting.
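One way to run such an analytic Hive query programmatically from Java is over JDBC against HiveServer2, sketched below; the server address, credentials, and the customer_events table are illustrative assumptions (the actual queries were driven by the business requirements):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Minimal sketch: executes an analytic Hive query through the HiveServer2 JDBC driver.
    public class HiveQuerySketch {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hiveserver.example.com:10000/default", "hive", ""); // assumed server
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(
                    "SELECT region, COUNT(*) AS cnt FROM customer_events GROUP BY region"); // assumed table
            while (rs.next()) {
                System.out.println(rs.getString("region") + "\t" + rs.getLong("cnt"));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }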

Environment: Java, AWS, MapReduce, Hive, Pig, Sqoop, DB2, Cloudera Manager, Flume, Oozie.

Confidential, Parsippany, NJ

Java Lead Developer

Responsibilities:

  • Led a team of 4 members while handling the following responsibilities:
  • Developed bottom-up SOAP web services and exposed EJB session beans as web services to Confidential for Confidential check processing.
  • Designed and developed the Spring Batch module to process the Confidential checks and transfer the check metadata to the Confidential.
  • Performed a POC and successfully created wrapper implementations to expose the Struts action classes as JAX-WS SOAP web services (Axis WS).
  • Designed and developed ETL multi-threading process to extract commercial documents from IBM Content Management server and transferred (SFTP) the same to TYCO team.
  • Involved in setting up the SOAP web services security with VeriSign to configure the SSL, certificate, public-key, private-key (key/trust store) using IBM iKeyman tool.
  • Single point of contact for all the Confidential - Confidential check processing related issues.
  • Created the O/R mapping for a complex table structure with composite primary key fields.
  • Created end-to-end O/R mapping and DAO implementations to access and retrieve data using named and native queries in Hibernate JPA.
  • Implemented the Hibernate O/R mapping for the composite primary key using the @EmbeddedId and @Embeddable annotations (a minimal sketch follows this list).
  • Implemented stored procedure invocation in Hibernate JPA.
  • Implemented Spring Core inversion of control to inject the client objects needed to interact with the content management server and the service layer.
  • Implemented Spring AOP for capturing the performance measurements in the application.
  • Involved in the DB2-to-Oracle migration project; made the required changes to SQL queries that were DB2-specific and incompatible with Oracle.
  • Created UNIX shell scripts to invoke the services code for generating the reports.
  • For Document Management application: Implemented MVC architecture by designing and developing JSR 168/286 portlets in combination with Java Server Faces (JSF).
  • Designed and developed the GUI components using JSR 168 Portlets, JSF, JSP and JavaScript in DM application.
  • Developed multiple portlet modes such as VIEW, EDIT, etc. as per the business requirements.
  • Implemented inter portlet communication (IPC) to exchange the properties between multiple portlets.
  • Developed an individual module to access online Confidential timely reports using the DOJO EnhancedGrid with Struts 2 returning JSON data.
  • Developed client programs to interact with DB2 Content Management system from portal.
  • Worked closely with admin team to configure portal server with LDAP and external security manager.
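A minimal sketch of the @Embeddable/@EmbeddedId mapping style described above; the entity, table, and column names are illustrative assumptions, not the actual check-processing schema:

    import java.io.Serializable;
    import java.math.BigDecimal;
    import javax.persistence.Column;
    import javax.persistence.Embeddable;
    import javax.persistence.EmbeddedId;
    import javax.persistence.Entity;
    import javax.persistence.Table;

    // Composite primary key type; JPA expects Serializable plus equals/hashCode.
    @Embeddable
    class CheckId implements Serializable {
        @Column(name = "BATCH_NO")
        private long batchNumber;

        @Column(name = "CHECK_NO")
        private long checkNumber;

        public CheckId() { }

        @Override
        public boolean equals(Object other) {
            if (!(other instanceof CheckId)) return false;
            CheckId that = (CheckId) other;
            return batchNumber == that.batchNumber && checkNumber == that.checkNumber;
        }

        @Override
        public int hashCode() {
            return 31 * Long.valueOf(batchNumber).hashCode() + Long.valueOf(checkNumber).hashCode();
        }
    }

    // Entity mapped with the embedded composite key.
    @Entity
    @Table(name = "CHECK_METADATA")
    public class CheckMetadata {
        @EmbeddedId
        private CheckId id;

        @Column(name = "AMOUNT")
        private BigDecimal amount;

        protected CheckMetadata() { }

        public CheckId getId() { return id; }
        public BigDecimal getAmount() { return amount; }
    }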

Environment: Java 1.5, J2EE, JSR168 Portlets, JSF, JQuery, WebSphere Portal 7.0 and Process Server, DB2 migrated to Oracle, Spring, Hibernate, XML, Web Services; Multiscreen Application: Struts 2, REST web services, DOJO, JSON, Oracle.

Confidential, South Lake, TX

Senior Programmer Analyst

Responsibilities:

  • Responsible for managing and leading the offshore team of 5 members.
  • Handled daily status calls with FIC, India; delegated new tasks and guided the team in solving critical issues.
  • Designed and developed the pagination scenario in the application.
  • Implemented JUnit test cases using JMock, EasyMock and Spring mock objects to mock the service-layer objects (a minimal sketch follows this list).
  • Responsible for configuration of the data source and JNDI in the WebSphere server.
  • Developed web services to interface with the business layer and integrated the presentation layer with the web services.
  • Developed the business layer interfacing with Oracle as the database; the iBatis framework was used for object-relational mapping and persistence.
  • Designed and developed portlets using the JSR 168 API and JavaServer Faces (JSF) to interface customers with the Auto Funds Management system.
  • Implemented Exception handling mechanism in the persistence layer.
  • Deployed the applications in development, QA and staging environments.
  • Troubleshot production problems and analyzed/gathered application logs.
  • Performed production implementation and rollout.
  • Managed activities per schedule and resolved issues in a timely manner.
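A minimal sketch of the mocking approach described above, written against JUnit 4/EasyMock syntax for brevity; FundsService and AccountController are hypothetical stand-ins for the actual service-layer and presentation classes:

    import static org.easymock.EasyMock.createMock;
    import static org.easymock.EasyMock.expect;
    import static org.easymock.EasyMock.replay;
    import static org.easymock.EasyMock.verify;
    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class AccountControllerTest {

        // Hypothetical service-layer interface to be mocked.
        interface FundsService {
            double availableBalance(String accountId);
        }

        // Hypothetical class under test that depends on the service layer.
        static class AccountController {
            private final FundsService fundsService;
            AccountController(FundsService fundsService) { this.fundsService = fundsService; }
            double displayBalance(String accountId) {
                return fundsService.availableBalance(accountId);
            }
        }

        @Test
        public void displayBalanceDelegatesToServiceLayer() {
            FundsService service = createMock(FundsService.class);
            expect(service.availableBalance("ACCT-1")).andReturn(125.50);
            replay(service);

            AccountController controller = new AccountController(service);
            assertEquals(125.50, controller.displayBalance("ACCT-1"), 0.0001);

            verify(service);
        }
    }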

Environment: Java 1.4, J2EE, JSR168 Portlets, JSF, WebSphere Portal, iBatis, Spring, Oracle, XML, Web Services, SOAP UI.

Confidential

Associate Technology L2

Responsibilities:

  • Worked closely with the architecture team to analyze requirements and evaluate technical feasibility for developing the application.
  • Involved in the implementation and support for the CMA Web and Migration modules.
  • Implemented the features of the Spring Core module.
  • Used Object/Relational mapping tool Hibernate to achieve object persistency.
  • Involved in configuring Hibernate to access and retrieve data from the database.
  • Responsible for creating complex SQL queries on the SQL Developer tool for the analysis of the issues during the data migration.
  • Developed Java programs as part of the ETL process from Stellent Content Management server.
  • Automated the generation of XML files from the Java objects representing the underlying database schemas using J2SE 1.4 I/O streams and the XStream 1.2 framework (a minimal sketch follows this list).
  • Understood the requirements, developed enhancements and fixed bugs.
  • In the CMA Web module, developed Servlets as the controller gateway and implemented other helper classes to limit the business logic in the Servlets and to interact with the database.
  • Responsible to develop the presentation layer using JSP, JSTL, and HTML.
  • Analyzed and fixed the issues raised during the UAT (User Acceptance test).
  • Responsible for solving the defects raised by business users in Production
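A minimal sketch of the XStream-based XML generation described above; the Account class and output file name are illustrative assumptions rather than the actual schema objects:

    import java.io.FileWriter;
    import java.io.Writer;

    import com.thoughtworks.xstream.XStream;

    // Minimal sketch: serializes a schema-backed Java object to an XML file with XStream.
    public class XmlExportSketch {

        // Hypothetical object representing one row of an underlying schema.
        static class Account {
            private String accountId;
            private String status;
            Account(String accountId, String status) {
                this.accountId = accountId;
                this.status = status;
            }
        }

        public static void main(String[] args) throws Exception {
            XStream xstream = new XStream();
            xstream.alias("account", Account.class); // use a friendly element name in the XML

            Account account = new Account("A-1001", "MIGRATED");
            Writer out = new FileWriter("account_A-1001.xml"); // assumed output file
            out.write(xstream.toXML(account));
            out.close();
        }
    }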

Environment: Java 1.4, J2EE, Hibernate, Oracle 8i, WebLogic, XML, XStream 1.2, Subversion, SQL Developer, Stellent Content Management Server.

Confidential, New York, NY

Java Consultant

Responsibilities:

  • Gathered requirements and analyzed the existing system.
  • Used Object/Relational mapping tool Hibernate to achieve object persistency.
  • Developed and used Hibernate Criteria objects to implement dynamic searches (a minimal sketch follows this list).
  • Involved in overseas client interaction in the ‘Centra’ Meeting to discuss the status of the module, clarify any queries regarding the functionality, bug fixes etc.
  • Developed application business logic as per the business rules in the specification and implemented DAO to interact with the database.
  • Analyzed defects and provided possible means to prevent them in the future.
  • Responsible for weekly builds and release notes, and for sending them to release management for the production team.
  • Interacted with Business Analyst for requirements gathering.
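A minimal sketch of a dynamic search built with the Hibernate Criteria API; the Trade entity, its properties, and the filters are illustrative assumptions, with the Hibernate mapping assumed to exist elsewhere:

    import java.util.Date;
    import java.util.List;

    import org.hibernate.Criteria;
    import org.hibernate.Session;
    import org.hibernate.criterion.Order;
    import org.hibernate.criterion.Restrictions;

    public class TradeSearchDao {

        // Hypothetical mapped entity; the Hibernate mapping is assumed to be defined elsewhere.
        public static class Trade {
            private Long id;
            private String ticker;
            private Double amount;
            private Date tradeDate;
            // getters/setters omitted for brevity
        }

        // Builds the query dynamically: only the filters the caller supplied are added.
        @SuppressWarnings("unchecked")
        public List<Trade> findTrades(Session session, String ticker, Double minAmount) {
            Criteria criteria = session.createCriteria(Trade.class);
            if (ticker != null) {
                criteria.add(Restrictions.eq("ticker", ticker));
            }
            if (minAmount != null) {
                criteria.add(Restrictions.ge("amount", minAmount));
            }
            return criteria.addOrder(Order.desc("tradeDate")).list();
        }
    }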

Environment: Java, Struts, Oracle, Hibernate, Rational ClearCase, WebLogic Application Server

Confidential

Java Developer

Responsibilities:

  • Involved in requirement gathering and analysis to set up the FileNet environment at the client site.
  • Designed and developed the User Interface using JSP and developed various Action classes and Form bean classes using Struts framework.
  • Validated the application input on the client side and in the ActionForm classes using the Struts validation framework (a minimal sketch follows this list).
  • Implemented client-side validations using JavaScript and the Struts Validation Framework.
  • Developed the controller components (Action classes) in Struts.
  • Developed the business logic as per the business rules in the specification.
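A minimal sketch of a server-side check in an ActionForm validate override; the form class, field, and resource-bundle key are illustrative assumptions, and the declarative validation.xml rules mentioned above would express the same constraint in XML:

    import javax.servlet.http.HttpServletRequest;

    import org.apache.struts.action.ActionErrors;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionMapping;
    import org.apache.struts.action.ActionMessage;

    // Hypothetical Struts 1 form bean validating a required field on the server side.
    public class DocumentSearchForm extends ActionForm {

        private String documentId;

        public String getDocumentId() { return documentId; }
        public void setDocumentId(String documentId) { this.documentId = documentId; }

        public ActionErrors validate(ActionMapping mapping, HttpServletRequest request) {
            ActionErrors errors = new ActionErrors();
            // mirrors the client-side JavaScript check for a non-empty document id
            if (documentId == null || documentId.trim().length() == 0) {
                errors.add("documentId", new ActionMessage("errors.documentId.required"));
            }
            return errors;
        }
    }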

Environment: Java 2, Struts, Servlets, JSP, JDBC, Oracle, WebLogic 6.x, Eclipse, Microsoft VSS.
