Hadoop Developer Resume
Irving, TX
SUMMARY:
- Over 7 years of experience in Java and Big Data technologies across the complete software development life cycle (SDLC) of large n-tier, web-based distributed systems.
- Experienced with the entire SDLC of applications: requirements gathering, analysis, conceptual and detailed design, development, verification, and testing.
- Experience in designing, configuring and managing network and storage infrastructure.
- Experience in analyzing and recommending the Big Data solutions.
- Good Knowledge on various Hadoop Distributions like Cloudera, Apache
- Experience in installation, configuration, and management of Hadoop clusters.
- Experience in managing Hadoop infrastructure with Cloudera Manager.
- Experience in understanding and managing Hadoop log files.
- Experience in providing security for Hadoop Cluster with Kerberos.
- Experience in analyzing data in HDFS through Hive, Pig, and DMX-h.
- Experience in building bi-directional data pipelines between HDFS and relational databases with Sqoop.
- Experience in setting up the High-Availability Hadoop Clusters.
- Flexible with Unix/Linux and Windows Environments.
- Experience in working with operating systems such as CentOS 5/6, Ubuntu 10/11, and Sun Solaris.
- Experience in developing shell scripts for system management and Hadoop applications.
- Experience in working with various RDBMSs, including Oracle, PostgreSQL, MySQL, and Teradata.
- Expertise in implementation of MVC using frameworks such as Struts and Spring.
- Motivated team player able to work under minimal supervision, with excellent communication, interpersonal, analytical, and problem-solving skills.
PROFESSIONAL EXPERIENCE:
Confidential, Irving, TX
Hadoop Developer
Responsibilities:
- Analyzed the requirements.
- Prepared effort estimates for activities based on the requirements and design documents.
- Analyzed the business requirements and system specifications to understand the application.
- Participated in brainstorming sessions on finalizing the data ingestion requirements and design.
- Configured Hadoop components including Hive, Pig, HBase, Spark, Sqoop, Oozie, and Hue in the client environment.
- Designed and developed a process to do incremental import of raw data from DB2 into Hive tables using Sqoop.
- Managed data coming from different sources; involved in HDFS maintenance and loading of structured and unstructured data.
- Performed data ingestion activities using Solix.
- Converted Hive/SQL queries into Spark transformations using Spark RDDs.
- Performed various levels of data curation (L1-L5) within the EAP.
- Developed dashboards using Arcadia Data.
- Collected metrics from various tools, including RevoR, Datameer, Platfora, YARN, and Spark.
- Performed data cleaning using Paxata and placed the final data in Hive tables.
- Involved in creating POCs to ingest the data using Spark and HDFS.
- Monitored the cluster using Cloudera Manager.
- Involved in unit testing.
Environment: Hadoop, Hive, Spark SQL, Arcadia Data, JIRA, MySQL Workbench, Autosys, Paxata, Hue, Platfora, Unix, Sqoop, Solix.
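The incremental Sqoop import from DB2 into Hive described above can be sketched roughly as follows. This is a dry-run sketch: the JDBC URL, table, username, and check column are hypothetical placeholders, not the actual client values, so the script composes and echoes the command rather than executing it.

```shell
#!/bin/sh
# Rough sketch of an incremental Sqoop import from DB2 into Hive.
# All endpoints and identifiers below are hypothetical placeholders.
JDBC_URL="jdbc:db2://db2host:50000/SALESDB"   # hypothetical DB2 host/database
TABLE="ORDERS"                                # hypothetical source table
CHECK_COL="ORDER_ID"                          # monotonically increasing key column
LAST_VALUE="${1:-0}"                          # high-water mark from the previous run

# Compose the command and echo it (dry run) instead of executing,
# since the real cluster endpoints are site-specific.
CMD="sqoop import \
  --connect $JDBC_URL --username etl_user -P \
  --table $TABLE \
  --hive-import --hive-table staging.orders \
  --incremental append \
  --check-column $CHECK_COL \
  --last-value $LAST_VALUE"

echo "$CMD"
```

Each scheduled run would pass the last imported key as the `--last-value` high-water mark so only new rows are pulled.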
Confidential, Charlotte, NC
Hadoop Performance Engineer
Responsibilities:
- Gathered requirements from end users for the data required for analysis.
- Responsible for Hadoop design, development, testing, and performance for the HaaS applications.
- Prepared high-level and low-level designs from the requirements.
- Analyzed the availability of data from various upstream applications.
- Tuned the performance of Hadoop MapReduce jobs.
- Extracted source files from various application databases and transformed them per business requirements.
- Developed UNIX shell scripts to capture various logs as part of performance metrics for YARN and Spark.
- Built Hive tables on top of HDFS for user querying.
- Scheduled jobs through Autosys.
- Performed code migrations.
- Developed an automated performance-metrics tool.
- Executed queries using Beeline.
- Involved in the Hadoop MR1 to YARN migration.
- Developed Hive scripts for data transformation and aggregation.
- Provisioned data with DMX-h, using MapReduce in the DMExpress tool.
- Handled small part files and zero-byte files.
- Developed Sqoop commands for exporting and importing data in HDFS.
- Developed Oozie workflows for launching jobs.
- Extracted data from Teradata tables and imported it into HDFS using Sqoop.
- Assisted application teams with issues through Nexus requests.
Environment: Hadoop 2.0, Hive, Beeline, Kafka, Spark 1.6, Impala, Hue, DMX-h, Autosys, Teradata.
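The small-part-file and zero-byte-file handling mentioned above follows a simple pattern: drop the empty part files that reducers leave behind, then merge the remaining small parts into one file. A minimal local sketch of that logic (using ordinary files; on the cluster the same steps would run against `hdfs dfs -rm` and `hdfs dfs -getmerge`):

```shell
#!/bin/sh
# Local sketch of the small-file / zero-byte part-file cleanup described above.
# On the cluster the same pattern runs against hdfs dfs -ls/-rm/-getmerge;
# ordinary files are used here so the logic is visible without a cluster.
STAGE_DIR=$(mktemp -d)
printf 'row1\nrow2\n' > "$STAGE_DIR/part-00000"
: > "$STAGE_DIR/part-00001"                    # zero-byte file from an empty reducer
printf 'row3\n' > "$STAGE_DIR/part-00002"

# 1. Drop zero-byte part files.
find "$STAGE_DIR" -name 'part-*' -size 0c -exec rm -f {} +

# 2. Merge the remaining small parts into one file
#    (the HDFS equivalent is hdfs dfs -getmerge).
cat "$STAGE_DIR"/part-* > "$STAGE_DIR/merged.txt"
```

Consolidating small files this way also reduces NameNode metadata pressure, which is the usual motivation for the cleanup.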
Confidential, San Jose,CA
Hadoop Developer/Administrator
Responsibilities:
- Worked with the Subversion version-control tool on Linux.
- Developed functional test cases and test scripts for Hadoop environment validation after the upgrade.
- Worked with ETL tools such as Hive and Pig.
- Wrote Java MapReduce programs and Hive/Pig UDFs.
- Developed Oozie workflows for managing various Hadoop ETL tools.
- Imported and exported data between relational databases and HDFS using Sqoop.
- Worked with Impala for ad-hoc querying on Hadoop systems.
- Worked with open-source databases such as MySQL and PostgreSQL.
- Installed and configured the Linux audit tool for various auditing purposes.
- Developed an effective solution for output and error logging for Linux jobs.
- Documented project designs and test plans for various projects landing on the Hadoop platform.
- Developed a real-time CPU and memory monitoring web interface to overcome Cloudera Manager CPU and memory charts bug.
Environment: Hadoop 1.0, Sqoop, MySQL, Oozie, Java, MapReduce, Impala, Pig, UNIX.
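The output/error-logging solution for Linux jobs mentioned above can be sketched as a small wrapper that splits stdout and stderr into separate logs and records the exit code. The log directory, job name, and sample command are illustrative; real jobs would log under a fixed path agreed with the scheduler:

```shell
#!/bin/sh
# Sketch of the output/error logging wrapper for Linux jobs described above.
# Log directory and job command are illustrative placeholders.
LOG_DIR=$(mktemp -d)
JOB_NAME="sample_job"
STAMP=$(date +%Y%m%d_%H%M%S)
OUT_LOG="$LOG_DIR/${JOB_NAME}_${STAMP}.out"
ERR_LOG="$LOG_DIR/${JOB_NAME}_${STAMP}.err"

run_logged() {
    # Run the job, splitting stdout and stderr into separate logs and
    # recording the exit code so the scheduler can act on failures.
    "$@" >>"$OUT_LOG" 2>>"$ERR_LOG"
    rc=$?
    echo "$(date '+%F %T') job=$JOB_NAME rc=$rc" >>"$OUT_LOG"
    return $rc
}

run_logged sh -c 'echo normal output; echo an error >&2'
```

Keeping stderr in its own file makes failed runs easy to triage without scanning the full stdout log.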
Confidential, Tallahassee, FL
Java Developer
Responsibilities:
- Worked on the design and delivery of the project.
- Used Agile methodology and pair programming to develop web applications.
- Used UML diagrams in the design of the project.
- Built scripts via Maven.
- Worked on data migration from Microsoft Access to Oracle using ODBC drivers.
- Involved in code review meetings.
- Interacted frequently with the client to gather requirements and implement them during development.
- Worked with scripting libraries such as jQuery.
- Deployed the application on the Oracle WebLogic application server.
- Used Oracle SQL Developer to create tables, views, triggers, and partitions.
Environment: jQuery 2.0, UML diagrams, Maven 2.0, Oracle SQL Developer, Oracle WebLogic Server.
Confidential, Tallahassee, FL
Java Developer
Responsibilities:
- Involved in writing technical design documents from functional design documents.
- Worked on the design, development, and delivery of the project.
- Primarily involved in developing screen designs using the Struts framework, JSPs, and HTML.
- Used Struts in the presentation tier and Spring Core/IoC to wire/inject object dependencies.
- Used the Hibernate framework to persist data to the Oracle database.
- Performed validations using the Struts validation framework and client-side validations using JavaScript.
- Actively involved in code review and was responsible for unit integration/bug fixes.
- Involved in studying the user requirements.
- Involved in installation and configuration of WebSphere 8 Application Server.
- Designed and reviewed test cases against the functional scenarios and incorporated review comments from the business.
- Validated backend functional flows by retrieving results through SQL Developer.
- Involved in defect tracking and defect management using IBM Rational ClearQuest.
- Involved in defect triage calls with developers, business analysts, and the testing team to discuss the defect summary.
- Coordinated with the business team and supported UAT testing.
- Conducted formal and informal testing with the CMS Federal HUB to verify the services were working correctly.
Confidential, Salt Lake City, UT
Java/J2EE Developer
Responsibilities:
- Worked on the design, development, and delivery of the project.
- Worked with JSP and jQuery.
- Developed the front-end application using jQuery, CSS, and JavaScript.
- Used ANT to compile and deploy the files.
- Worked on Spring MVC to create mappings and store data into the database.
- Handled source code management using Git.
- Worked on creating the database tables.
- Deployed the application on WebSphere Application Server.
- Involved in code review documents and bug fixing for improving the performance.
Environment: Java, JavaScript, jQuery, CSS 3.0, Spring MVC, ANT, IBM WebSphere Application Server, Git.
Confidential, Olympia, WA
Java/J2EE Developer
Responsibilities:
- Worked on the design, development, and delivery of the project.
- Interacted continuously with clients to understand requirements and specifications.
- Worked on Program Requests (PRs) and Change Requests (CRs) for various releases.
- Worked with JSP and jQuery.
- Used ANT to compile and deploy the files.
- Worked on Spring MVC to create mappings and store data into the database.
- Handled source code management using ClearQuest.
- Worked on creating the database tables.
- Extensively used DOM and SAX parsers for parsing XML data and XSLT for XML transformations.
- Deployed the application on WebSphere Application Server.
- Used IBM DB2 with SQL to create tables, views, triggers, and partitions.
- Involved in all the major releases.
- Involved in code review documents and bug fixing for improving the performance.
- Developed the front-end application using jQuery and JavaScript.
Environment: Java, jQuery, JSP, JavaScript, XML, IBM WebSphere Application Server, Spring MVC, ANT, DB2.
Confidential
Java/J2EE Developer
Responsibilities:
- Worked on the design, development, and delivery of the project.
- Interacted continuously with the Securities and Exchange Commission's clients to understand requirements and specifications.
- Worked on Program Change Requests (PCRs) and Software Provider Requests (SPRs) for various releases.
- Created beans and the model layer for the Struts framework.
- Worked with JSP and jQuery.
- Used ANT to compile and deploy the files.
- Used Spring-JDBC with Hibernate to provide an abstraction layer for the DAOs.
- Worked on Hibernate to create mappings and store data into the database.
- Worked on HTML and JUnit test cases for different modules in the project.
- Handled source code management using Serena VM.
- Worked on creating the database tables.
- Deployed the application on the Tomcat application server.
- Involved in all the major releases.
- Involved in code review documents and bug fixing for improving the performance.
- Developed the front-end application using jQuery and JavaScript.
- Used Hibernate to retrieve and store data in DB2.
- Involved in creating the style sheets.
Environment: Java 1.6, JSP, jQuery, Tomcat 7.0, JavaScript, Hibernate, Spring-JDBC, Struts 2.0, ANT.
Confidential, Atlanta, GA
Java/J2EE Developer
Responsibilities:
- Worked on the design, development, and delivery of the project.
- Interacted continuously with FedEx clients to understand requirements and specifications.
- Worked on quality controls in the L2 environment.
- Created managed beans and model for JSF framework.
- Worked on servlet programming and JSP scripting for the communication between web browser and server.
- Worked on test cases on different modules in the project.
- Worked with Ajax and HTML.
- Involved in code review documents and bug fixing for improving the performance.
- Used the Oracle 11g database with SQL to create tables, views, triggers, and partitions.
- Implemented application logging features using log4j.
- Deployed the application in JBoss Application server.
- Extensively used JBoss while writing code and for creating data sources.
- Used SVN for source control and created build scripts using ANT.
- Provided production support for the application and resolved production issues.
- Tested the application in all environments, including UAT and performance testing.
- Used ANT to compile and deploy the files.
Environment: Java 1.5, JSF 2.0, JBoss, ANT, JSP, HTML, AJAX, Tomcat, Oracle SQL Developer.