Hadoop Developer Resume
SUMMARY
- Good working knowledge of Hadoop cluster architecture.
- Good working experience with Sqoop and SFTP for ingesting data from relational and non-relational data sources into the data lake.
- Good working experience creating transformations to format data before placing it in the data lake.
- Experience in creating and setting up real-time data pipelines using Kafka to stream data generated by real-time APIs into the data lake and a NoSQL DB for debugging the APIs.
- Good at unit testing (MRUnit, JUnit, Mockito) and system testing.
- Good at implementing algorithms as the problem context requires.
- Good at developing highly scalable Spark applications using Java.
- Good at programming with Java 8 features such as lambda expressions and streams.
- Good experience with the Agile model and the use of Rally.
- Experience in managing and reviewing Hadoop log files.
- Experience in setting up standards and processes for Hadoop-based application design and implementation.
- Experience in importing and exporting data between HDFS and local file systems.
- Good working knowledge of Hadoop 1.0 and 2.0.
- Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
- Good working knowledge of Hive, Oozie, HBase, and MapR-DB tables.
- Good working knowledge on Spring MVC and Spring Batch.
- Good at REST-based web services development.
- Good working knowledge of APIGEE development for REST services.
- Good exposure to Apache Hadoop MapReduce programming using Java.
- Good at batch-job performance improvements.
- Good at using IDEs such as Eclipse and NetBeans for developing and debugging Java applications.
- Expert in implementing Java/J2EE technologies across all project phases: requirement gathering, analysis, design, implementation, unit testing, and deployment.
- Expertise in designing and developing enterprise applications for the J2EE platform using MVC, JSP, Servlets, JDBC, Web Services, and Hibernate.
- Experience in creating user interfaces using JSP, HTML, XML, and JavaScript.
- Experience in using J2EE design patterns such as MVC, DAO, Front Controller, and Factory to reuse the most efficient and effective strategies for new development.
- Expertise in implementing the Spring framework for dependency injection and DAO-pattern support, integrated with Hibernate.
- Extensive experience using loggers such as Log4j to create logs of different categories and write them to files.
- Good at using code-coverage tools such as SonarQube and Cobertura.
- Extensive experience performing test reviews and peer reviews to maintain coding standards.
- Experience in SQL development.
- Good knowledge of the SDLC.
- Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.
- Basic knowledge of Informatica; participated in the development of small-scale mappings.
- Good working knowledge of mainframe technologies, including working experience with COBOL, JCL, VSAM, IMS DB/DC, DB2, and Xpediter.
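The Java 8 lambda and stream usage noted above can be sketched briefly; the feed records and the "source|status" field layout here are hypothetical:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamDemo {
    public static void main(String[] args) {
        // Hypothetical ingestion-feed records in "source|status" form
        List<String> records = Arrays.asList(
                "sqoop|OK", "sftp|FAIL", "kafka|OK", "sqoop|OK");

        // Lambda + stream pipeline: keep successful records and
        // count them per source system.
        Map<String, Long> okPerSource = records.stream()
                .filter(r -> r.endsWith("|OK"))
                .map(r -> r.split("\\|")[0])
                .collect(Collectors.groupingBy(s -> s, Collectors.counting()));

        System.out.println(okPerSource);
    }
}
```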
TECHNICAL SKILLS
Big Data Ecosystem: HDFS, Hadoop MapReduce, Pig, Spark with Java 1.8 and Scala, Kafka, Sqoop, SFTP
Databases: DB2, MS SQL Server, MySQL, HBase, MapR-DB
Programming Languages: Java, COBOL, Scala
Tools: Oozie, MRUnit, JUnit, SonarQube, Cobertura, Mockito, PowerMock, APIGEE, SQLyog, Xpediter, Endevor
ETL: Informatica
Software Products: PuTTY, FileZilla, WinSCP, Eclipse, MS Project 2003, JMeter, Git Extensions, HP Quality Center
Version Control Tools: SVN, Git.
Build Tools: Jenkins, Maven.
PROFESSIONAL EXPERIENCE
Hadoop Developer
Confidential
Responsibilities:
- Participating in sprint planning meetings to understand the user stories.
- Developing KNN and lightning-KNN models using Java.
- Breaking down user stories into tasks.
- Preparing estimates for each task listed under all the user stories for a sprint.
- Promptly updating task status in Rally and attending stand-up calls every day.
- Implementing algorithms such as ternary search and Jaro-Winkler.
- Developing the MapReduce programs in Java.
- Writing Hive queries to refine and filter the data from source systems.
- Optimizing Hive queries for faster run times.
- Writing shell scripts to run the Hive queries and to handle pre- and post-processing in the use-case flow.
- Unit testing MapReduce jobs with MRUnit and Hive queries with HiveRunner.
- Coding Spring Batch steps to run the MapReduce and Hive steps in sequence.
- Coding multithreaded applications to run Hive queries.
- Supporting the QA team to complete system testing.
- Debugging and identifying issues reported by QA with the Hadoop jobs in test environments.
- Copying sample production data generated over a limited time period to test environments.
- Identifying areas with scope for performance improvement and implementing the changes.
- Developing REST services to serve real-time calls against the big-data tables.
- Developing the data layer using HBase and MapR-DB tables.
- Writing unit test cases for the REST services using JUnit and Mockito.
- Implementing a performance-testing process for the REST services using JMeter.
- Providing post-implementation, enhancement, and maintenance support to the client for the application.
- Facilitated functional and technical knowledge transfer sessions.
- Managing and reviewing Hadoop log files.
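The ternary-search style of algorithm work listed above can be sketched as follows; the unimodal data is illustrative:

```java
public class TernarySearchDemo {
    // Returns the index of the maximum of a strictly unimodal array
    // (values increase, then decrease) using ternary search.
    static int peakIndex(int[] a) {
        int lo = 0, hi = a.length - 1;
        while (hi - lo > 2) {
            int m1 = lo + (hi - lo) / 3;
            int m2 = hi - (hi - lo) / 3;
            if (a[m1] < a[m2]) {
                lo = m1 + 1;   // the peak cannot be at or left of m1
            } else {
                hi = m2 - 1;   // the peak cannot be at or right of m2
            }
        }
        int best = lo;         // at most three candidates remain
        for (int i = lo + 1; i <= hi; i++) {
            if (a[i] > a[best]) best = i;
        }
        return best;
    }

    public static void main(String[] args) {
        int[] data = {1, 3, 8, 12, 9, 5, 2};
        System.out.println(peakIndex(data)); // prints 3 (value 12)
    }
}
```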
Environment: Languages: Java 1.6, Java 1.7, JavaScript; Web technologies: REST API, JSON, XML, APIGEE
Operating System: Windows family, Unix
Databases: Oracle, MySQL, HBase, MapR-DB, Hive, HDFS
Other Tools: Service Manager, FTP, Spring Batch, Git, SVN, Maven, SoapUI, Postman, Eclipse, Shell Script, Cron tab, MapR distribution, Oozie, Log4j, MRUnit, JUnit, Mockito, Tomcat Server 7.x, Kafka, Sqoop, SFTP
Hadoop Developer
Confidential
Responsibilities:
- Identify and evaluate new technologies for implementation
- Working closely with the business team to define data flows into the newly constructed enterprise data warehouse.
- Interacting with all upstream partners to understand each one's data layout in order to design the enterprise standard schema.
- Involved in hardware configuration decisions for the new enterprise data warehouse.
- Working on proofs of concept prior to the actual implementation/design of the data warehouse.
- Preparing high-level and low-level design documents for the software application implemented.
- Performing code optimization during performance tuning to improve application usability.
- Reviewing technical documentation to ensure technical accuracy, compliance, and completeness and to mitigate risk.
- Performing software modeling and simulation.
- Presenting ideas and deliverables during walkthroughs.
Environment: Languages: Java 1.6
Operating System: Windows family, Unix
Databases: MySQL, HBase, Hive, HDFS
Other Tools: HP Service Manager, FTP, SVN, Maven, REST Client, Eclipse, Shell Script, Cron tab, Hortonworks distribution, Oozie, Log4j, Attunity, SQLyog, MRUnit, JUnit
Hadoop Developer
Confidential
Responsibilities:
- Understanding the business requirements by participating in the business-requirements review meetings conducted by SMEs and system architects.
- Preparing an understanding document, including the design plan for the set of Hadoop jobs required to fulfill each business requirement, by reviewing both the business requirement documents and the functional requirement documents.
- Preparing estimates for the deliverables based on the understanding document.
- Participating in the design phase to prepare low-level design documents by working closely with the business team.
- Reviewing low-level design documents with the SMEs and system architects to make sure everyone is on the same page before the development phase starts.
- Developing the MapReduce programs in Java and the Hive queries as specified in the technical spec documents.
- Preparing test data for the set of Hadoop jobs by copying sample production data to the local file system and copying the same data into HDFS in the unit region.
- Completing unit testing for the new Hadoop jobs on the standalone cluster designated for the unit region, using MRUnit.
- Supporting the QA team to complete system testing when they encounter roadblocks.
- Debugging and identifying issues reported by QA with the Hadoop jobs by configuring them against the local file system.
- Providing post-implementation, enhancement, and maintenance support to the client for the application.
- Set up and benchmarked Hadoop clusters for internal use.
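The MapReduce development described above follows the classic map/emit and reduce/aggregate pattern. A plain-Java sketch of word counting, with Hadoop's Mapper/Reducer API omitted so the example stays self-contained:

```java
import java.util.HashMap;
import java.util.Map;

public class WordCountSketch {
    // Map phase: emit (word, 1) pairs; reduce phase: sum the 1s per key.
    static Map<String, Integer> wordCount(String[] lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {                       // one map() call per input line
            for (String word : line.toLowerCase().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum);  // the per-key reduction
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] lines = {"hadoop hive hadoop", "hive mapreduce"};
        System.out.println(wordCount(lines));
    }
}
```

In a real Hadoop job the two loops are split across Mapper and Reducer classes and the framework handles the shuffle between them; the sketch only shows the data flow.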
Environment: Languages: Java 1.6, JavaScript
Operating System: Windows family, Unix
Databases: Hive, HDFS
Other Tools: HP Service Manager, FTP, Oozie, SVN, Maven, Eclipse, Shell Script, Cron tab, Apache distribution, Log4j, MRUnit, JUnit
Java Developer
Confidential
Responsibilities:
- Preparation of estimates for the tasks assigned.
- Preparation of quality documents like Program specification documents, Test Plan and Test Cases by analyzing the ADS (Application Design Specifications) provided by the clients.
- Coding programs in Java per coding standards after the program specification document is approved by the designer.
- Coding the application with the Spring MVC framework.
- Performing unit testing using JUnit and Mockito for the new code developed.
- Debugging the code to identify the root cause of issues reported by the QA team.
- Supporting QA team to setup data.
- Participated in the peer reviews of the code developed by the other developers.
- Providing post-implementation, enhancement, and maintenance support to the client for the application.
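The unit-testing approach above isolates the class under test from its collaborators by stubbing them. A self-contained sketch using a hand-rolled stub (RateDao and PriceService are hypothetical names; in the projects above this stubbing was done with JUnit and Mockito):

```java
public class UnitTestSketch {
    interface RateDao {                 // hypothetical collaborator (would hit DB2)
        double rateFor(String region);
    }

    static class PriceService {         // hypothetical class under test
        private final RateDao dao;
        PriceService(RateDao dao) { this.dao = dao; }
        double price(String region, double base) {
            return base * dao.rateFor(region);
        }
    }

    public static void main(String[] args) {
        RateDao stub = region -> 2.0;   // stubbed DAO: no database needed
        PriceService service = new PriceService(stub);
        double p = service.price("EU", 100.0);
        System.out.println(p);          // prints 200.0
    }
}
```

With Mockito the stub line would instead be a mock whose behavior is configured per test, but the isolation idea is the same.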
Environment: Languages: Java, JavaScript; Web technologies: Servlets, JSP, Hibernate, Spring MVC, HTML
Operating System: Windows family, Unix
Data bases: DB2
Other Tools: HP Service Manager, FTP, Git, SVN, Maven, REST Client, Eclipse, NetBeans, Shell Script, JUnit, Tomcat Server.
Java Developer
Confidential
Responsibilities:
- Involved in requirement gathering and GUI design framework.
- Involved in the team discussions regarding the modeling, architectural and performance issues.
- Using the UML methodology, developed Use Case Diagrams, Class Diagrams and Sequence Diagrams to represent the dynamic view of the system developed in Visual Paradigm.
- Coding in accordance with the functional requirements.
- Used JDBC for data retrieval from the database for various inquiries.
- Analyzed defects, fixed problems, and participated in peer reviews of test results.
- Involved in writing JUnit Test Cases.
- Used Hibernate for database access and transaction management.
- Used Hibernate as the ORM tool for object-relational mappings, configuring the hibernate.cfg.xml and hibernate.hbm.xml files and the connection pool.
- Used the Hibernate persistence strategy for database operations, with Hibernate as the data-abstraction layer for interacting with the database.
- Used JavaScript for client side validation.
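The hibernate.cfg.xml configuration mentioned above generally follows this shape; a trimmed sketch in which the driver, URL, credentials, and mapping file are all placeholders:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
    "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
    "http://hibernate.org/dtd/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
  <session-factory>
    <!-- Placeholder connection settings -->
    <property name="hibernate.connection.driver_class">com.ibm.db2.jcc.DB2Driver</property>
    <property name="hibernate.connection.url">jdbc:db2://localhost:50000/SAMPLE</property>
    <property name="hibernate.connection.username">user</property>
    <property name="hibernate.connection.password">secret</property>
    <!-- Built-in connection pool size -->
    <property name="hibernate.connection.pool_size">10</property>
    <property name="hibernate.dialect">org.hibernate.dialect.DB2Dialect</property>
    <!-- Entity mappings are declared in *.hbm.xml files -->
    <mapping resource="com/example/Order.hbm.xml"/>
  </session-factory>
</hibernate-configuration>
```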
Environment: Languages: Java, JavaScript; Web technologies: Servlets, JSP, Hibernate, Spring MVC, HTML
Operating System: Windows family, Unix
Data bases: DB2
Other Tools: HP Service Manager, FTP, Git, SVN, Maven, Eclipse, NetBeans, Shell Script, JUnit, Tomcat Server
Mainframe Developer
Confidential
Responsibilities:
- Preparing the estimates for the deliverables based on the Application specification document from the designer.
- Managing the offshore team and ensuring they clearly understand the design before actual development starts.
- Acting as the point of contact for the offshore team, gathering all issues by EOD, and setting up discussions with the business team to get them resolved at the earliest.
- Preparing the technical specs from the low-level design documents by understanding the inputs and outputs of the mainframe jobs.
- Developing the COBOL programs as specified in the technical spec documents.
- Getting confirmation from the designer when showstoppers are identified during construction, and updating the clarifications in the documents prepared during earlier phases of the SDLC.
- Preparing test data by copying sample production data to the test environments for both unit testing and QA testing.
- Debugging and identifying issues reported by QA with the business logic.
- Providing post-implementation, enhancement and maintenance support to client for application
- Facilitated technical knowledge transfer sessions.
- Analyzing business-critical issues reported by business and QA on time to meet delivery timelines.
Environment: COBOL, JCL, VSAM, IMS DB/DC, DB2, Xpediter, Endevor.