Hadoop Developer/Java Developer Resume
Dublin, OH
SUMMARY
- 9 years of IT industry experience in consulting organizations, working across industry sectors including Insurance and Health Care, primarily in the Big Data ecosystem (4.5 years) and Mainframe technologies (4.5 years).
- Excellent understanding/knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm. Hands-on experience installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, R, HDFS, HBase, Hive, Sqoop, Pig, MarkLogic, ZooKeeper, Flume, RabbitMQ, Gradle, and Protobuf.
- Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
- Good knowledge of Hadoop cluster architecture and cluster monitoring.
- Used the Lambda Architecture with Apache Storm and Spark, integrated with Hadoop, for data analysis.
- Experience in managing and reviewing Hadoop log files.
- Excellent understanding and knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra, and of Solr/Lucene.
- Involved in setting up standards and processes for Hadoop-based application design and implementation.
- Involved in all phases of the Software Development Life Cycle (SDLC) and worked on all activities related to the development, implementation, administration, and support of ETL processes for large-scale data warehouses.
- Experience working in all phases of the SDLC; experienced in both Waterfall and Agile methodologies.
- Experience in ETL methodology for supporting Data Extraction, Transformations and Loading process.
- Experience in importing and exporting data using Sqoop between HDFS and relational database systems.
- Experience in Object-Oriented Analysis and Design (OOAD) and development of software using UML methodology; good knowledge of J2EE design patterns and Core Java design patterns.
- Experience in managing Hadoop clusters using Cloudera Manager tool.
- Extensive experience working with Oracle, DB2, SQL Server, and MySQL databases.
- Well experienced in development and maintenance projects using COBOL II, JCL, VSAM/DB2, and CICS.
- Profound knowledge of and experience with databases such as DB2, IMS DB, CA-DATACOM, and SQL Server 2008 R2.
- Well equipped to work with DB2 stored procedures, CHANGEMAN/ENDEVOR, Xpediter, SPUFI, QMF, DATAQUERY, Abend-Aid, File-AID & File Manager, Microsoft Visio utilities, HP Quality Center, and Microsoft TFS.
- Sound experience in automation testing for web applications using Selenium WebDriver with C# and NUnit, as well as Selenium RC with Java in Eclipse.
- Developed REXX tools to automate repetitive, labor-intensive project activities, reducing manual effort in keying data and validating results in large production files.
- Experience successfully delivering complex development, enhancement, and production support projects and programs with geographically distributed teams; delivered multiple concurrent projects with budgets of up to $6+ million.
- Ability to work in high-pressure environments, delivering to and managing stakeholder expectations.
- Application of structured methods to project scoping and planning, and to monitoring and controlling cross-track dependencies, risks, issues, schedules, and deliverables.
- Experience working with multiple vendors and managing delivery in matrix organization structures, working with development, testing, IT infrastructure support, and operations support teams.
- Extensive experience maintaining applications in business domains including eligibility enrollment and claims processing in health care (Medicare, Medicaid, US healthcare) and loan processing in banking and financial services.
TECHNICAL SKILLS
Technology: Hadoop Ecosystem / Mainframes / Open Systems / Databases
Operating Systems: OS/390, z/OS, VAX/VMS, Windows Vista/XP/NT/2000, Linux (Ubuntu, CentOS), UNIX
DBMS/Databases: DB2, IMS DB, CA-DATACOM, VSAM, MS-SQL Server
Programming Languages: COBOL, JCL, REXX, EASYTRIEVE, SAS, C, C++, Core Java, XML, .NET MVC, R
Transaction Management: CICS 3.0, IMS DC
Methodologies: Agile, Waterfall
Big Data Ecosystem: HDFS, HBase, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Flume, Oozie, Cassandra, Datameer, Pentaho
Middleware: MQ SERIES
Configuration Management Tools: Visual SourceSafe, PVCS, Team Foundation Server (TFS)
Mainframe Configuration Management: ENDEVOR 3.9 & 4.0, CHANGEMAN, Visual SourceSafe
Debugging: Xpediter (Batch & Online)
Online Replication: IBM Queue Replication
Spooling: SDSF
DB2 & File Management: File-AID, File-AID for DB2, QMF, SPUFI
DB2 DBA Tools: OMEGAMON, Strobe
MQ related: Appwatch for MQ Series, Tivoli MQ Browser
Scheduling: Control-M, CA-7/11
Archiving tools: RMDS, SAR
Project Management Software: MS Project, Clarity, Planview, IBM Rational ClearQuest 7.1.1, IAS Work Tracker
CASE Tools/ Designing Tools: MS Visio, MEGA (6.0)
Testing Tools: QC, QTP, Selenium RC, NUnit, Eclipse, SoapUI, Fiddler
Project Estimation: Complexity Point, Function Point
PROFESSIONAL EXPERIENCE
Confidential, Dublin, OH
Hadoop Developer/Java Developer
Responsibilities:
- Involved in the end-to-end process of Hadoop cluster installation, configuration, and monitoring.
- Responsible for building scalable distributed data solutions using Hadoop
- Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster
- Setup and benchmarked Hadoop/HBase clusters for internal use
- Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
- Developed simple to complex MapReduce jobs using Hive and Pig.
- Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms (see the driver sketch after this list).
- Handled importing data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop; also worked with Cassandra and MongoDB.
- Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
- Implemented business logic in Hadoop by writing custom UDFs in Java and used various UDFs from Piggybank and other sources (see the Pig UDF sketch after this list).
- Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Provided cluster coordination services through ZooKeeper.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team
- Accessed MarkLogic text, geospatial, value, and document-structure indexes to send only the most relevant data to Hadoop for processing.
- Sent Hadoop reduce results to multiple MarkLogic forests in parallel.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
- MapReduce programming on the Hadoop Distributed File System (HDFS).
- Developed R scripts for statistical models on data, translating the derived models into graphs and visualizations and performing other data science functions.
- Developed Pig scripts to pull data from HDFS.
- Developed Java APIs for invocation in Pig scripts to solve complex problems.
- Wrote shell, Perl, and Python scripts to automate and provide control flow to Pig scripts.
- Gained a complete understanding of the pre-existing product and its functionality.
- Loaded unstructured data (JSON and text files) into Hadoop for processing.
- Loaded data into Apache Solr to support search.
- Worked on data-loading techniques using Sqoop and Hive.
- Worked on serialization and compression formats such as Avro.
- Used Oozie workflows for exporting HBase data.
- Developed data pipelines using Pig and Hive from Teradata and Netezza data sources; these pipelines used customized UDFs to extend the ETL functionality.
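A minimal sketch of a MapReduce job driver with compression enabled, as referenced in the bullets above. The token-count mapper and reducer stand in for the actual business jobs, and the codec choice and `mapred.*` property name are CDH3-era assumptions:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CompressedCountJob {

    // Emits (token, 1) for each whitespace-separated token in the input line.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    ctx.write(word, ONE);
                }
            }
        }
    }

    // Sums the counts emitted for each token.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Compress intermediate map output to cut shuffle I/O (CDH3-era property name).
        conf.setBoolean("mapred.compress.map.output", true);

        Job job = new Job(conf, "compressed token count");
        job.setJarByClass(CompressedCountJob.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Compress the final job output files as well.
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, GzipCodec.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```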
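A minimal sketch of a custom Pig UDF in Java, as referenced above; the class name and the normalization logic are hypothetical:

```java
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Hypothetical UDF: normalizes a free-text field by trimming whitespace and upper-casing it.
public class NormalizeText extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;  // propagate nulls rather than failing the task
        }
        return input.get(0).toString().trim().toUpperCase();
    }
}
```

In a Pig script, the jar containing the UDF would be registered with REGISTER and the function invoked by its class name inside a FOREACH ... GENERATE clause, alongside Piggybank functions loaded the same way.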
Confidential, Orange, CA
Java Developer/ Hadoop Developer
Responsibilities:
- MapReduce programming on the Hadoop Distributed File System (HDFS).
- Developed Pig scripts to pull data from HDFS.
- Developed Java APIs for invocation in Pig scripts to solve complex problems.
- Wrote shell, Perl, and Python scripts to automate and provide control flow to Pig scripts.
- Gained a complete understanding of the pre-existing product and its functionality.
- Loaded unstructured data (JSON and text files) into Hadoop for processing.
- Worked on data-loading techniques using Sqoop and Hive.
- Worked on serialization and compression formats such as Avro.
- Used Oozie workflows for exporting HBase data.
- Analyzed requirements.
- Prepared detailed designs for development.
- Documented the functional and technical specifications.
- Maintained project quality per the standards.
- Synchronized configuration files across the Hadoop cluster nodes.
- Moved the system log files out of the Hadoop directory so that all log files stay in one place even after the installation directory changes due to an upgrade.
- Customized the SSH settings on the master node.
- Configured MapReduce properties to ensure local temporary storage uses large disk partitions.
- Loaded data into Apache Solr to support search (see the SolrJ sketch after this list).
- Created user accounts and granted users access to the Hadoop cluster.
- Implemented secure authentication for the Hadoop cluster using the Kerberos protocol (see the login sketch after this list).
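A minimal SolrJ indexing sketch for the Solr loading described above; the core URL, document id, and field names are illustrative, and HttpSolrServer is the SolrJ 4.x-era client class:

```java
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class SolrLoader {
    public static void main(String[] args) throws Exception {
        // Core URL is a placeholder for the actual Solr deployment.
        HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr/collection1");

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "doc-1");
        doc.addField("title", "Sample record loaded for search");
        doc.addField("body", "Free-text content extracted from files in HDFS.");

        solr.add(doc);   // queue the document for indexing
        solr.commit();   // make it visible to searches
        solr.shutdown();
    }
}
```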
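A minimal sketch of a Kerberos-authenticated HDFS client using keytab-based login; the principal, keytab path, and HDFS path are placeholders:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureHdfsClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Tell the Hadoop client to use Kerberos instead of simple authentication.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Log in from a keytab; principal and path are illustrative.
        UserGroupInformation.loginUserFromKeytab(
                "etluser@EXAMPLE.COM", "/etc/security/keytabs/etluser.keytab");

        // Subsequent HDFS calls are authenticated with the Kerberos credentials.
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.exists(new Path("/user/etluser")));
    }
}
```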
Confidential, New York, NY
Java Developer
Responsibilities:
- Analysis and understanding of business requirements.
- Developed the application using Spring MVC, JSP, JSTL, and AJAX in the presentation layer; the business layer is built with Spring and the persistence layer uses Hibernate 3.0.
- Developed views and controllers for the client and manager modules using Spring MVC 3.0 and Spring Core 3.0 (see the controller sketch after this list).
- Implemented business logic using Spring Core 3.0 and Hibernate 3.0.
- Performed data operations using Spring ORM wired with Hibernate, and implemented HibernateTemplate and the Criteria API for querying the database.
- Developed an exception-handling framework and used Log4j for logging.
- Developed SOAP web services using XML messages, including services for payment transaction and payment release (see the JAX-WS sketch after this list).
- Developed RESTful web services.
- Created the WSDL and the SOAP envelope.
- Developed and modified database objects as per the requirements.
- Involved in unit and integration testing, bug fixing, acceptance testing with test cases, and code reviews.
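A minimal sketch of the Spring MVC 3.0 / Hibernate 3.0 pattern described above: an annotated controller querying through HibernateTemplate and the Criteria API. The Client entity, URL, and view name are illustrative assumptions:

```java
import java.util.List;

import org.hibernate.criterion.DetachedCriteria;
import org.hibernate.criterion.Restrictions;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.orm.hibernate3.HibernateTemplate;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
public class ClientController {

    // Wired by Spring; assumes a HibernateTemplate bean configured with the session factory.
    @Autowired
    private HibernateTemplate hibernateTemplate;

    // Renders the detail view for one client; "clientDetail" resolves to a JSP view.
    @RequestMapping("/clients/{name}")
    public String show(@PathVariable("name") String name, Model model) {
        DetachedCriteria criteria = DetachedCriteria.forClass(Client.class)
                .add(Restrictions.eq("name", name));
        List<?> matches = hibernateTemplate.findByCriteria(criteria);
        model.addAttribute("client", matches.isEmpty() ? null : matches.get(0));
        return "clientDetail";
    }
}

// Hypothetical mapped Hibernate entity, shown only so the sketch is self-contained.
class Client {
    private Long id;
    private String name;
    public Long getId() { return id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```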
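A minimal JAX-WS sketch of a SOAP payment-release service in the spirit of the bullets above; the operation name, parameter, and endpoint URL are hypothetical:

```java
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical payment-release service; JAX-WS generates the WSDL from the annotations.
@WebService
public class PaymentReleaseService {

    @WebMethod
    public String releasePayment(@WebParam(name = "paymentId") String paymentId) {
        // Real logic would validate the payment and trigger the release workflow.
        return "RELEASED:" + paymentId;
    }

    public static void main(String[] args) {
        // Publishes the endpoint with its WSDL available at ?wsdl for quick testing.
        Endpoint.publish("http://localhost:8080/payments", new PaymentReleaseService());
    }
}
```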
Confidential, Seattle, WA
Team Lead & System Analyst
Responsibilities:
- Understood the impact of changes to the system, i.e., performed impact analysis.
- Performed analysis and coding, and prepared test conditions, test cases, and test data for system, regression, and integration testing.
- Worked and interacted with the onsite team (Seattle) as well as the offshore support and development teams.
- Prepared technical specification documents.
- Coded new programs and enhanced existing programs.
- Prepared and executed unit test cases.
- Performed functional and technical reviews.
- Delegated work to the team and monitored workload.
- Supported the testing team for system, integration, and UAT testing.
- Assured quality in the deliverables.
- Conducted KT sessions for new joiners on system functionality and on the processes to be followed for the project.
- Analyzed requirement documents and converted them into technical specifications.
- Designed the web services per the technical specifications.
- Used Core Java to implement the business functionality.
- Used JAX-WS web services to expose the functionality to the presentation layer.
- Used Hibernate and wrote various SQL queries to retrieve data from the database.
- Used the Spring Framework to inject DAO and bean objects.
- Used Oracle SQL Developer and Toad for database querying.
- Used JMS for sending messages (see the JMS sketch after this list).
- Wrote JUnit test cases at each layer to test the functionality (see the sample test after this list).
- Enhanced existing functionality to improve performance and fixed bugs.
- Understood business requirements and identified the relevant functional and technical requirements.
- Understood the framework and system flow.
- Developed prototypes and provided functionality.
- Documented the functional and technical specifications.
- Involved in developing various screens using JSP and the BSF framework.
- Developed JSPs and Servlets to carry user inputs and access the corresponding EJBs.
- Handled the complete design and implementation of the EJBs (middle tier), developing session beans.
- Involved in developing unit test cases and testing the application.
- Used Log4j with external configuration files and for debugging.
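A minimal JMS 1.1 sketch for sending a message, as referenced above; the JNDI names and payload are illustrative and depend on the application server configuration:

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class OrderMessageSender {
    public static void main(String[] args) throws Exception {
        // JNDI names are placeholders for the server-configured resources.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Destination queue = (Destination) ctx.lookup("jms/OrderQueue");

        Connection connection = factory.createConnection();
        try {
            // Non-transacted session with automatic acknowledgement.
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            TextMessage message = session.createTextMessage("order-created:12345");
            producer.send(message);
        } finally {
            connection.close();  // closes the session and producer as well
        }
    }
}
```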
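A minimal JUnit 4 sketch of the kind of layer-level test referenced above; the class under test is a stand-in for real service logic:

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;

import org.junit.Test;

public class NameNormalizerTest {

    // Minimal class under test, standing in for real service-layer logic.
    static class NameNormalizer {
        String normalize(String raw) {
            return raw == null ? null : raw.trim().toUpperCase();
        }
    }

    @Test
    public void trimsAndUpperCasesInput() {
        assertEquals("ACME CORP", new NameNormalizer().normalize("  acme corp "));
    }

    @Test
    public void passesNullThrough() {
        assertNull(new NameNormalizer().normalize(null));
    }
}
```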
Confidential
Developer
Responsibilities:
- Understood the business and functional requirements and the technology of the client system.
- Performed analysis and prepared test conditions, test cases, and test data for system, regression, and integration testing, as well as deployment checkout.
- Worked and interacted with the onsite team as well as the offshore support and development teams.
- Prepared and executed unit test cases.
- Supported the testing team for system testing.
- Deployed code to production and supported post-deployment issues.
- Assured quality in the deliverables.
- Involved in analysis for code changes, copybook changes, and JCL and control card changes.
- Coded new programs using COBOL/CICS/VSAM and COBOL/CICS/DB2.
- Involved in gathering system requirements for different internal projects in the system.
- Worked on developing JSP pages.
- Implemented the Struts framework.
- Developed business logic using Java/J2EE.
- Modified stored procedures in the Oracle database.
- Involved in design discussions, understanding business requirements, and identifying the relevant functional and technical requirements.
- Developed UML use cases and class diagrams.
- Understood the framework and system flow.
- Developed prototypes and demonstrated functionality using mock-up screens.
- Documented the functional and technical specifications.
- Developed JSPs based on the requirements.
- Coded Struts Action classes to process requests forwarded by the Struts ActionServlet (see the Action sketch after this list).
- Developed action mappings in struts-config.xml.
- Participated in ICD discussions and updated WSDLs and XSDs according to the requirements.
- Used JAX-WS for developing web services and clients.
- Wrote web service clients and servers to achieve the required functionality.
- Involved in coding new APIs for the CSSOP interface per the PCRs.
- Coded various classes for business logic implementation.
- Developed and tested code to migrate existing customers to the RIV program.
- Enhanced the existing system with new requirements and fixed bugs.
- Used various PL/SQL statements to store and retrieve data from the database.
- Used SQL*Loader to upload data into the Oracle database.
- Developed shell scripts to retrieve data and generate reports, and automated sending these to the management team for daily statistics.
- Involved in developing unit test cases and testing the application.
- Used Log4j with external configuration files and for debugging.
- Supported the testing team and resolved defects raised at various stages until production.
- Involved in the complete life cycle of the project, from requirements through production support.
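A minimal Struts 1.x Action sketch matching the bullets above; the class name, request parameter, and forward name are illustrative. The corresponding <action> entry in struts-config.xml would map a request path to this class and define the "success" forward:

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Hypothetical Struts 1.x action; the "success" forward must match struts-config.xml.
public class CustomerLookupAction extends Action {

    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) {
        String customerId = request.getParameter("customerId");
        // Real logic would call the business layer; here we just stash the id for the JSP.
        request.setAttribute("customerId", customerId);
        return mapping.findForward("success");
    }
}
```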