Sr. Hadoop Developer Resume
Champaign, IL
SUMMARY:
- 7+ years of experience in the Information Technology industry.
- Strong exposure to IT consulting, software project management, design, development, implementation, maintenance/support and Integration of software applications.
- Around 2 years of experience in Big Data processing using Apache Hadoop.
- 5+ years of experience designing, developing, deploying & supporting large scale distributed systems using various technologies.
- Expert in creating Pig Latin scripts and UDFs in Java for efficient data analysis.
- Expert in creating Hive queries and UDFs in Java for efficient data analysis.
- Knowledge of HDFS, Map/Reduce, Pig, Hive, Sqoop and Cloudera’s Hadoop distribution.
- Expert in using Sqoop to import data from external systems into HDFS for analysis and to export results back to the source systems for further processing.
- Used HBase alongside Pig/Hive when real-time, low-latency queries were required.
- Worked on Windows and UNIX/Linux platforms with technologies such as Big Data, SQL, PL/SQL, XML, HTML, Core Java, and Python.
- Knowledge of writing MapReduce code in Java per business requirements.
- Good experience with the Oozie framework and automating daily import jobs.
- Working knowledge of Hadoop administration.
- Good understanding of NoSQL databases.
- Hands-on experience writing applications on NoSQL databases such as Cassandra.
- Worked with ETL tools such as Talend to simplify MapReduce jobs from the front end.
- Worked with Tableau for report creation and further analysis from the front end.
- Knowledge of Pentaho report creation and analysis.
- Extensive knowledge of Oracle SQL.
- Knowledge of Java frameworks such as Hibernate and Spring.
- Knowledge of RESTful web services.
- Good knowledge of MySQL and its administration using MySQL Workbench and XAMPP installations.
- Experienced in designing and implementing applications using Java.
- Strong knowledge of the Software Development Life Cycle (SDLC).
- Experienced in creating and analyzing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD).
- Experienced in training new team members per project requirements.
- Good knowledge of the Microsoft Office suite, including MS Access and MS Excel.
- Worked on development projects covering the design, development, and unit testing of applications.
- Experienced in creating Product Documentation & Presentations.
- Ability to effectively communicate with all levels of the organization, including technical staff, management, and customers.
- Possess a strong commitment to team dynamics, with the ability to contribute expertise and follow leadership directives at appropriate times.
- Strong interpersonal skills resulting in exceptional rapport with people. Proven success in initiating, promoting, and maintaining strong interpersonal relations. Able to deal courteously, professionally, and tactfully with the general public in a variety of circumstances.
- Ability to perform at a high level, meet deadlines, and adapt to ever-changing priorities.
- Adaptive in learning and working on various technologies.
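The Hive/Pig UDF work listed above centers on small reusable Java functions. A minimal sketch of the kind of normalization logic such a UDF might contain, shown as plain Java so it runs without Hive on the classpath (the class and method names are illustrative, not from any actual project):

```java
// Core logic of a hypothetical Hive/Pig UDF that normalizes free-text
// fields before analysis. In a real deployment this method would sit in
// a class extending org.apache.hadoop.hive.ql.exec.UDF.
public class NormalizeText {
    // Lower-cases, trims, and collapses internal whitespace.
    public static String evaluate(String input) {
        if (input == null) {
            return null;  // a UDF must tolerate NULL columns
        }
        return input.trim().toLowerCase().replaceAll("\\s+", " ");
    }

    public static void main(String[] args) {
        System.out.println(NormalizeText.evaluate("  Hello   WORLD  "));
    }
}
```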
TECHNICAL EXPERTISE
Operating Systems: Windows, UNIX/Linux.
Databases: Oracle (9i, 10g), MySQL, MS Access, NoSQL
Technologies: Hive, Pig, MapReduce, HBase, Sqoop, Java, SQL, C, C++, Python
BigData Distribution: Cloudera
Middleware Tools: SQL Developer, MySQL Workbench, Apache Tomcat, PuTTY, WinSCP, etc.
BI/ETL Tools: Talend, Tableau, Pentaho (working knowledge)
Java IDE: Eclipse, NetBeans
Other Tools: MATLAB, Visual Studio, Notepad++
PROFESSIONAL EXPERIENCE
Confidential, Champaign, IL
Sr. Hadoop Developer
Roles and Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Experience in installing, configuring and using Hadoop Ecosystem components.
- Experience in importing data into and exporting data out of HDFS and Hive using Sqoop.
- Experienced in defining job flows.
- Experienced in managing and reviewing Hadoop log files.
- Participated in development/implementation of Cloudera Hadoop environment.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Experience in working with various kinds of data sources such as Oracle.
- Successfully loaded files to Hive and HDFS from Oracle.
- Responsible for managing data coming from different sources.
- Gained good experience with NoSQL databases.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from UNIX file system to HDFS.
- Installed and configured Hive and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
Environment: Cloudera HDFS, Hadoop, MapReduce, Hive, Hive UDFs, Pig Latin, Java, SQL, Sqoop, CentOS, NoSQL database.
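The data cleaning and preprocessing jobs described in this role apply per-record cleansing rules. A minimal sketch of that kind of logic, written as standalone Java so it runs without a cluster (the comma-delimited field layout and class name are assumptions for illustration; in the actual job this logic would live in a Mapper's map() method):

```java
import java.util.Optional;

// Standalone sketch of per-record cleansing logic for a MapReduce
// preprocessing step. Assumes a comma-delimited layout: id,name,value.
public class RecordCleaner {
    // Drops malformed rows (wrong column count or blank key) and trims
    // whitespace from each field; returns empty for rows to be skipped.
    public static Optional<String> clean(String line) {
        String[] fields = line.split(",", -1);
        if (fields.length != 3 || fields[0].trim().isEmpty()) {
            return Optional.empty();  // skip malformed record
        }
        return Optional.of(
            fields[0].trim() + "," + fields[1].trim() + "," + fields[2].trim());
    }

    public static void main(String[] args) {
        System.out.println(clean(" 42 , Alice , 3.14 ").orElse("<dropped>"));
        System.out.println(clean("bad-row").orElse("<dropped>"));
    }
}
```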
Confidential, Virginia Beach, VA
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce, HDFS and developed multiple MapReduce jobs in Java for data cleansing and preprocessing.
- Involved in loading data to HDFS and Hive using Sqoop.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Responsible for building scalable distributed data solutions using Hadoop.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Installed and configured Pig and Hive.
- Created UDFs and scripts in Pig and Hive.
- Evaluated business requirements and prepared detailed specifications that follow project guidelines required for program development.
- Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
Environment: Hadoop, MapReduce, HDFS, Pig, Hive, HBase, Oozie, Java (JDK 1.6), Eclipse, Cloudera, Sqoop, Oracle 10g
Confidential, St. Louis, MO
Hadoop Developer
Responsibilities:
- Installed/Configured/Maintained Apache Hadoop clusters for application development and Hadoop tools, including: Hive, Pig, HBase, Zookeeper and Sqoop.
- Involved in analyzing system failures, identifying root causes, and recommending corrective actions.
- Worked on Hive to expose data for further analysis and to transform files from various analytical formats into text files.
- Managed and scheduled jobs on a Hadoop cluster.
- Assisted in designing, building, and maintaining databases to analyze the life cycle of claims processing and transactions.
- Scheduled jobs using Oozie and tracked their progress.
- Analyzed web logs using Hadoop tools for operational and security-related activities.
Environment: Cloudera HDFS, Hive, Pig, HBase, Zookeeper, Sqoop, Java, JDBC, Subversion, JUnit, SQL, Oracle, XML, PuTTY, Eclipse, RESTful web services.
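The web-log analysis mentioned in this role typically begins by parsing raw access-log lines into structured fields. A self-contained sketch of that parsing step in Java (the Common Log Format layout and class name are assumptions for illustration):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts client IP, request path, and HTTP status from a Common Log
// Format access-log line; returns null for lines that do not match.
public class AccessLogParser {
    private static final Pattern LINE = Pattern.compile(
        "^(\\S+) \\S+ \\S+ \\[[^\\]]+\\] \"\\S+ (\\S+) [^\"]*\" (\\d{3}) \\S+");

    public static String[] parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.find()) {
            return null;  // malformed or non-matching line
        }
        return new String[] { m.group(1), m.group(2), m.group(3) };
    }

    public static void main(String[] args) {
        String sample = "10.0.0.1 - - [01/Jan/2015:00:00:01 -0500] "
            + "\"GET /index.html HTTP/1.1\" 200 512";
        String[] parts = parse(sample);
        System.out.println(parts[0] + " " + parts[1] + " " + parts[2]);
    }
}
```

In a Hadoop setting, logic like this would run inside a mapper, with counts or anomaly flags aggregated in the reduce phase.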
Confidential
Software Developer
Responsibilities:
- Involved in the analysis, development, and testing of the application.
- Gathered functional requirements from the client and mapped them to business processes.
- Used the Spring Framework in the implementation.
- Developed the Java code for business services and bean logic for the application.
- Used tools such as PL/SQL Developer to write SQL queries for data manipulation.
- Implemented the business service logic per the design and functional requirements.
- Reviewed deliverables against coding and performance standards.
- Involved in bug fixing and resolving issues with the QA team.
- Documented all process and requirement changes.
Environment: Java, J2EE, EJB 1.1, JSF, XML, JDBC, Oracle 9i, Log4J 1.2, PL/SQL Developer, RESTful, Spring.
Confidential
Software Developer
Responsibilities:
- Involved in design sessions and developed use case documents and design documents.
- Used various design patterns such as Data Access Object (DAO) and Model-View-Controller (MVC).
- Developed procedures using Hibernate to access, save, and update data in the Oracle database.
- Developed Java code for various services and bean logic for the application.
- Designed and developed required PL/SQL, stored procedure and triggers.
- Implemented the business service logic per the design and functional requirements.
- Reviewed deliverables against coding and performance standards.
- Coordinated with the QA team for resolution of quality issues.
- Documented all process and requirement changes.
Environment: Java, J2EE, EJB 1.1, JSF, XML, JDBC, Oracle 9i, Log4J 1.2, PL/SQL Developer, Hibernate.
Confidential
Jr. Software Developer
Responsibilities:
- Involved in business requirement gathering, analysis, feasibility research, estimation, and enhancements for the project.
- Involved in high-level design, detailed design, coding, test plan preparation, and unit and system testing of the project.
- Involved in the analysis and estimation of enhancements.
- Performed impact analysis of minor to major enhancements on the existing system.
- Involved in the detailed design and testing of enhancements.
- Conducted design and code walkthroughs, unit testing, regression testing, acceptance testing, integration testing, and system testing.
- Involved in writing complex SQL queries to validate converted data.
- Responsible for planning and tracking of enhancements.
- Responsible for deploying the application into the system test environment.
- Ensured seamless deployment of the application into production.
- Communicated with business users and resolved issues during acceptance testing.
Environment: Oracle, PL/SQL, Core Java, Windows, MS Office.