Hadoop Developer Resume
Indianapolis, IN
SUMMARY
- Over 8 years of diversified experience in software design and development. Expertise in designing and developing test scripts in Java.
- Experience as a Hadoop developer solving business use cases for several clients, with expertise in backend applications.
TECHNICAL SKILLS
Programming Languages: Java, VBScript, .NET
Hadoop Ecosystem: HDFS, MapReduce, Pig, Hive, Sqoop, Flume, ZooKeeper, HBase, Spark, Kafka
IDE Tools: Eclipse
Operating Systems: MS-DOS, Windows 95/98/NT/XP/7, Linux, Unix
Web Technologies: JSP, JDBC, CSS
Databases: Oracle, MySQL
Application/Web Servers: Apache Tomcat 4.0, WebLogic, TFS
Functional Testing Tools: QuickTest Pro, Selenium, LoadRunner, Quality Center, HP ALM, JIRA
PROFESSIONAL EXPERIENCE
Confidential, Indianapolis, IN
Hadoop Developer
Responsibilities:
- Strong knowledge of multi-clustered environments and setting up the Cloudera Hadoop ecosystem. Experience in installation, configuration, and management of Hadoop clusters.
- Experience writing MapReduce jobs and HiveQL; worked with the Enterprise Analytics team to translate analytics requirements into Hadoop-centric technologies (a representative MapReduce sketch follows this list).
- Enabled speedy reviews and first-mover advantage by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
- Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
- Managed and reviewed Hadoop log files.
- Developed and tested scripts in Python.
- Extensive working knowledge of Storm, Spark, Kafka, and AWS.
- Shared responsibility for administration of Hadoop, Hive and Pig.
- Installed and configured Storm, Spark, Kafka, Solr, Flume, Sqoop, Pig, Hive, HBase on Hadoop clusters.
- Worked with the BI tool Tableau.
- Managed Hadoop clusters, including adding and removing cluster nodes for maintenance and capacity needs.
- Provided detailed reporting of work as required for project status reports.
- Managed project aspects, serving as the gatekeeper to the production environment.
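For illustration, a minimal sketch of the kind of MapReduce job described above, written against the Hadoop 2.0 API listed in the environment. The job name, input layout (comma-delimited records with the event type in the second column), and class names are hypothetical, not taken from the engagement:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class EventCountJob {

    // Emits (eventType, 1) for each record; the column position is an assumption.
    public static class EventMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text eventType = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length > 1) {
                eventType.set(fields[1].trim());
                context.write(eventType, ONE);
            }
        }
    }

    // Sums the ones emitted per event type.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "event count");
        job.setJarByClass(EventCountJob.class);
        job.setMapperClass(EventMapper.class);
        job.setCombinerClass(SumReducer.class);  // safe here: counting is associative
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```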
Environment: Windows 7/Linux, Hadoop 2.0, SharePoint 2014, AWS, Spark, Hive, Pig, MapReduce, Sqoop, ZooKeeper, TFS, VS 2015, PuTTY, MySQL, Cloudera, Agile
Confidential, Folsom, CA
Hadoop Developer
Responsibilities:
- Designed and implemented ETL processes in the Hadoop ecosystem.
- Hands-on experience with Hortonworks, as well as migrating data from Oracle with Sqoop.
- Very good experience with both MapReduce 1 (JobTracker/TaskTracker) and MapReduce 2 (YARN).
- Implemented a de-duplication process to avoid duplicates in the daily load (see the sketch after this list).
- Designed and implemented delta data load systems in Hive, which increased efficiency by more than 60%.
- Worked on AWS to implement Hadoop clusters.
- Designed and implemented a pattern-mining application using the Mahout FP-Growth algorithm.
- Developed several advanced MapReduce programs as part of functional requirements.
- Worked on the BI tool Talend.
- Developed Hive scripts as part of functional requirements.
- Implemented Oozie workflows for MapReduce, Spark, Kafka, Storm, Hive, and Sqoop actions.
- Developed and deployed several web services on the Digital Airline platform for the processed data.
- Good knowledge of Splunk, ZooKeeper, and IDQ.
- Successfully integrated Hive tables with the MySQL database.
- Experience working with NoSQL databases, including HBase.
- Experience deploying code changes using TeamCity builds.
- Involved in handling code fixes during production releases.
- Implemented a Hadoop cluster on Ubuntu Linux.
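A minimal sketch of the reduce-side de-duplication mentioned in this list. The driver class is omitted, and the pipe-delimited record layout with the natural key in the first column is an assumption:

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// Keys each record by its natural key so all duplicates meet in one reduce call.
public class DedupMapper extends Mapper<LongWritable, Text, Text, Text> {
    private final Text recordKey = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        // The first pipe-delimited column is assumed to be the feed's natural key.
        String[] fields = line.toString().split("\\|", 2);
        recordKey.set(fields[0]);
        context.write(recordKey, line);
    }
}

// Writes the first record seen for each key and silently drops the rest.
class DedupReducer extends Reducer<Text, Text, Text, NullWritable> {
    @Override
    protected void reduce(Text key, Iterable<Text> records, Context context)
            throws IOException, InterruptedException {
        for (Text record : records) {
            context.write(record, NullWritable.get()); // keep exactly one copy
            break;
        }
    }
}
```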
Environment: Windows 7/Linux, Hadoop 2.0, SharePoint 2013, Eclipse, Hive, Pig, AWS, MapReduce, Sqoop, HBase, ZooKeeper, HP ALM, PuTTY, MySQL, Cloudera, Agile
Confidential, Houston, TX
Hadoop Developer
Responsibilities:
- Worked on Cloudera to search and analyze real-time data.
- Responsible for building scalable, distributed data solutions using Hadoop.
- Extensive experience writing Pig scripts to transform raw data from several data sources into baseline data.
- Developed several advanced MapReduce programs to process received data files.
- Developed Pig scripts, Pig UDFs, Hive scripts, and Hive UDFs to load data files into Hadoop (a sample Hive UDF sketch follows this list).
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Extracted feeds from social media sites such as Twitter.
- Developed Hive scripts for end-user/analyst requirements for ad-hoc analysis.
- Involved in loading data from UNIX file system to HDFS.
- Involved in gathering business requirements and preparing detailed specifications that follow project guidelines for the programs to be developed.
- Very good understanding of partitioning and bucketing concepts in Hive; designed both managed and external tables in Hive for optimized performance.
- Used Sqoop to import data into HDFS from the MySQL database and vice versa.
- Bulk-loaded data into the NoSQL database HBase.
- Developed Java programs to apply verbatim cleaning rules for responses.
- Experience in storing and retrieving documents in Apache Tomcat.
- Used Sqoop to import data into HDFS and Hive from other data systems.
- Conducted knowledge-transfer sessions on the developed applications for colleagues.
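A minimal sketch of a Hive UDF of the kind described in this list, here applying a simple cleaning rule to free-text fields; the function name and the cleaning rules themselves are illustrative:

```java
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Normalizes a free-text column before analysis: trims, lower-cases,
// and collapses runs of whitespace into a single space.
@Description(name = "clean_response",
             value = "_FUNC_(str) - trims, lower-cases, and collapses whitespace")
public class CleanResponseUDF extends UDF {
    private final Text result = new Text();

    public Text evaluate(Text input) {
        if (input == null) {
            return null;  // Hive passes NULLs through
        }
        String cleaned = input.toString().trim().toLowerCase().replaceAll("\\s+", " ");
        result.set(cleaned);
        return result;
    }
}
```

Packaged into a JAR, such a function is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.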
Environment: Windows 7, SharePoint 2013, Apache 1.2, Eclipse, Pig, Hive, Flume, Sqoop, HBase, PuTTY, HP ALM, WinSCP, Agile, Cloudera
Confidential, Warren, MI
Java Developer
Responsibilities:
- Involved in complete requirement analysis, design, coding and testing phases of the project.
- Implemented the project according to the Software Development Life Cycle (SDLC).
- Developed JavaScript behavior code for user interaction.
- Used HTML, JavaScript, and JSP to develop the UI.
- Used JDBC to manage connectivity for inserting and querying data, including stored procedures and triggers.
- Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for the SQL Server database.
- Part of a team responsible for metadata maintenance and synchronization of data from the database.
- Involved in the design and coding of the data capture templates, presentation and component templates.
- Developed an API to write XML documents from the database.
- Used JavaScript to design the user interface and perform validation checks.
- Developed JUnit test cases and validated user input using regular expressions in JavaScript as well as on the server side.
- Developed complex SQL stored procedures, functions and triggers.
- Mapped business objects to the database using Hibernate (a minimal entity sketch follows this list).
- Wrote SQL queries, stored procedures, and database triggers as required on the database objects.
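As an illustration of the Hibernate mapping mentioned above, a minimal annotated entity; the Customer table and its columns are hypothetical:

```java
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

// Business object mapped to a SQL Server table via Hibernate annotations.
@Entity
@Table(name = "customer")
public class Customer {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)  // identity column in SQL Server
    private Long id;

    @Column(name = "full_name", nullable = false, length = 100)
    private String fullName;

    @Column(name = "email", unique = true)
    private String email;

    protected Customer() { }  // no-arg constructor required by Hibernate

    public Customer(String fullName, String email) {
        this.fullName = fullName;
        this.email = email;
    }

    public Long getId() { return id; }
    public String getFullName() { return fullName; }
    public String getEmail() { return email; }
}
```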
Environment: Windows 7, SharePoint 2010, SQL, Java, JSP, Visual Studio, Agile/Scrum, Eclipse
Confidential, NYC
Java Developer
Responsibilities:
- Involved in almost all phases of the SDLC.
- Fully involved in requirement analysis and documentation of the requirement specification.
- Developed a prototype based on the requirements using the Struts 2 framework as part of a proof of concept (POC).
- Prepared use-case diagrams, class diagrams and sequence diagrams as part of requirement specification documentation.
- Involved in design of the core implementation logic using MVC architecture.
- Used Apache Maven to build and configure the application.
- Configured the struts.xml file with the required action mappings for all the required services.
- Developed implementation logic using the Struts 2 framework.
- Developed JAX-WS web services to provide services to other systems (a minimal service sketch follows this list).
- Developed JAX-WS clients to consume a few of the services provided by other systems.
- Involved in developing EJB 3.0 stateless session beans in the business tier to expose business services to the service components as well as the web tier.
- Implemented Hibernate at the DAO layer by configuring Hibernate configuration files for different databases.
- Developed business services to utilize Hibernate service classes that connect to the database and perform the required action.
- Developed JSP pages using Struts JSP tags and in-house tags to meet business requirements.
- Developed JavaScript validations for form fields.
- Performed unit testing for the developed code using JUnit.
- Developed design documents for the code developed.
- Used SVN repository for version control of the developed code.
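A minimal sketch of a JAX-WS service of the kind exposed to other systems; the service name, operation, and endpoint URL are hypothetical:

```java
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Exposes a single SOAP operation; the WSDL is generated from the annotations.
@WebService
public class OrderStatusService {

    @WebMethod
    public String getOrderStatus(@WebParam(name = "orderId") String orderId) {
        // A real implementation would delegate to the Hibernate-backed business service.
        return "PENDING";
    }

    public static void main(String[] args) {
        // Publishes on the JDK's built-in HTTP server; handy for local testing only.
        Endpoint.publish("http://localhost:8080/orders", new OrderStatusService());
    }
}
```

In the project itself such services would be deployed on the application server rather than via Endpoint.publish.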
Environment: HP ALM, Java, PL/SQL, XML, Agile/Scrum, MS SharePoint 2010, JUnit, JSP
Confidential
Junior Java Developer
Responsibilities:
- Involved in various phases of Software Development Life Cycle (SDLC) such as requirements gathering, analysis, design and development.
- Involved in overall performance improvement by modifying third-party open-source tools like FCKeditor.
- Involved in development and enhancement of web client.
- Involved in enhancements and optimization in Business logic.
- Met stringent deadlines.
- Analyzed specific process requirements.
- Integrated with different modules.
- Developed the Java code.
- Used Servlets and JSP; involved in the AJAX implementation (a minimal servlet sketch follows this list).
- Tested the application, including unit and integration testing.
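A minimal sketch of a servlet answering the page's AJAX calls with a small XML payload; the request parameter and response shape are hypothetical, and the URL mapping would live in web.xml for a container of this era:

```java
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Returns a small XML document the page's AJAX handler can parse.
public class StatusServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String user = request.getParameter("user");
        response.setContentType("text/xml");
        response.getWriter().write(
                "<status><user>" + (user == null ? "anonymous" : user)
                + "</user><state>active</state></status>");
    }
}
```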
Environment: Test Director, Java, JavaScript, Oracle 8i, Windows 95/NT, Tomcat, XML