Sr. Hadoop Developer Resume
Tampa, FL
SUMMARY
- 7+ years of experience in the Information Technology industry with strong exposure to IT consulting, software project management, design, development, implementation, maintenance/support and integration of software applications.
- Over 3 years of experience in Big Data processing using Apache Hadoop and its ecosystem.
- Expert in creating Pig Latin scripts and UDFs in Java for efficient data analysis.
- Expert in creating Hive queries and UDFs in Java for efficient data analysis.
- Knowledge of HDFS, Map/Reduce, Pig, Hive, Sqoop and Cloudera's Hadoop distribution.
- Expert in using Sqoop to import data from external systems into HDFS for analysis, and to export the results back to the source systems for further processing.
- Used HBase alongside Pig/Hive whenever real-time, low-latency queries were required.
- Worked on Windows and UNIX/Linux platforms with technologies such as Big Data, SQL, PL/SQL, XML, HTML, CSS, JavaScript, Core Java and Python.
- Knowledge of writing MapReduce code in Java per business requirements.
- Good experience with the Oozie framework and automating daily import jobs.
- Working knowledge of Hadoop administration.
- Good understanding of NoSQL databases.
- Hands-on experience writing applications on NoSQL databases such as Cassandra.
- Worked with ETL tools such as Talend to simplify the creation of MapReduce jobs from the front end.
- Worked with Tableau for front-end report creation and further analysis.
- Knowledge of Pentaho report creation and analysis.
- Extensive knowledge on Oracle SQL.
- Good knowledge of MySQL and its administration via MySQL Workbench and XAMPP installations.
- Experienced in design and implementation of applications using Java.
- Around 4 years of experience in Core Java programming, with hands-on experience in JavaServer Pages (JSP) and web technologies such as HTML, JavaScript, CSS and XML.
- Strong knowledge of Software Development Life Cycle (SDLC).
- Experienced in providing training to team members as per new project requirements.
- Good Knowledge of Microsoft Office Suite including MS Access and MS Excel.
- Worked on development projects spanning design, development and unit testing of applications.
- Experienced in creating Product Documentation & Presentations.
- Ability to effectively communicate with all levels of the organization such as technical, management and customers.
- Possess strong commitment to team environment dynamics with the ability to contribute expertise and follow leadership directives at appropriate times.
- Ability to perform at a high level, meet deadlines and adapt to ever-changing priorities.
- Adaptive in learning and working on various technologies.
- Excellent interpersonal skills; experienced in interacting with clients, with strong teamwork and problem-solving skills.
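As an illustration of the Hive UDF work described above, registering a custom Java UDF and using it in a query might look like the following sketch (the jar path, class, table and column names are all hypothetical):

```sql
-- Register a custom Java UDF (jar and class names are hypothetical)
ADD JAR /tmp/udfs/text-utils.jar;
CREATE TEMPORARY FUNCTION normalize_txt AS 'com.example.udf.NormalizeText';

-- Aggregate analysis over a hypothetical transactions table
SELECT normalize_txt(region) AS region, COUNT(*) AS txn_count
FROM transactions
GROUP BY normalize_txt(region);
```

Queries like this compile down to MapReduce jobs on the cluster, with the UDF invoked once per input row.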
TECHNICAL SKILLS
Hadoop/Big Data Technologies: Hadoop, HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, Cloudera, Oozie, Avro and Zookeeper
Programming Languages: Java, C, C++ and COBOL
Scripting/Web Technologies: JavaScript, HTML, XML, Python, Shell Scripting and CSS
ETL/BI Tools: Pentaho, SAS Enterprise Miner, Rapid Miner and Weka
Databases: Oracle 9i/10g/11g, MySQL and NoSQL
Operating Systems: Linux, Unix and Windows
Java IDE: Eclipse and NetBeans
Additional Skills: Data mining and Business Intelligence Algorithms, Talend, Tableau and R
PROFESSIONAL EXPERIENCE
Sr. Hadoop Developer
Confidential - Tampa, FL
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Experience in installing, configuring and using Hadoop Ecosystem components.
- Experience in Importing and exporting data into HDFS and Hive using Sqoop.
- Experienced in defining job flows.
- Experienced in managing and reviewing Hadoop log files.
- Participated in development/implementation of Cloudera Hadoop environment.
- Loaded and transformed large sets of structured, semi-structured and unstructured data.
- Experience in working with various kinds of data sources such as Oracle.
- Successfully loaded files to Hive and HDFS from Oracle.
- Responsible for managing data coming from different sources.
- Gained good experience with NOSQL database.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from UNIX file system to HDFS.
- Installed and configured Hive, and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
Environment: Cloudera HDFS, Hadoop, MapReduce, Hive, Pig Latin, Java, SQL, Sqoop, CentOS, NoSQL database.
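A minimal sketch of the map-side data cleaning described above. The pipe-delimited three-field record layout is an assumption, and in the real job this logic would sit inside a Hadoop Mapper's map() method rather than a standalone class:

```java
// Sketch of map-side record cleaning (hypothetical pipe-delimited layout).
// In the actual MapReduce job, clean() would be called from a Mapper's map()
// method, with null results causing the record to be skipped.
public class RecordCleaner {

    // Normalizes one raw record: trims fields, lower-cases the key,
    // and rejects records that do not have the expected field count.
    public static String clean(String rawLine) {
        String[] fields = rawLine.split("\\|", -1);
        if (fields.length != 3) {
            return null; // malformed record: drop it
        }
        String id = fields[0].trim().toLowerCase();
        String name = fields[1].trim();
        String amount = fields[2].trim();
        if (id.isEmpty() || amount.isEmpty()) {
            return null; // required fields missing: drop it
        }
        return id + "|" + name + "|" + amount;
    }

    public static void main(String[] args) {
        System.out.println(clean("  A100 | John Doe | 42.50 "));
    }
}
```

Keeping the cleaning logic in a plain static method like this also makes it unit-testable outside the cluster.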
Hadoop Developer
Confidential, St. Louis, MO
Responsibilities:
- Importing and exporting data into HDFS and Hive using Sqoop
- Involved in defining job flows, managing and reviewing log files.
- Extracted files from Oracle through Sqoop and placed in HDFS and processed.
- Loaded and transformed large sets of structured, semi-structured and unstructured data.
- Responsible for managing data coming from different sources.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from UNIX file system to HDFS.
- Configured Pig and wrote Pig queries and UDFs.
- Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
Environment: Java, Hadoop, MapReduce, Pig, Hive, Linux, Sqoop, Eclipse, AWS EC2, and Cloudera CDH3
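The Oracle-to-HDFS extraction described above could be sketched as a Sqoop import command like the following (the connection string, credentials, table and Hive database names are hypothetical; all flags are standard Sqoop import options):

```shell
# Daily Sqoop import from Oracle into a Hive staging table
# (host, table and database names are hypothetical).
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table CLAIMS \
  --num-mappers 4 \
  --hive-import \
  --hive-table staging.claims
```

The `--num-mappers` flag controls how many parallel map tasks split the source table, and `--hive-import` loads the extracted files directly into the Hive warehouse instead of leaving them as raw HDFS files.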
Hadoop Developer
Confidential, Farmington, CT
Responsibilities:
- Installed, configured and maintained Apache Hadoop clusters for application development, along with Hadoop tools including Hive, Pig, HBase, Zookeeper and Sqoop.
- Involved in analyzing system failures, identifying root causes and recommending courses of action.
- Worked on Hive to expose data for further analysis and to transform files from various analytical formats into text files.
- Managing and scheduling Jobs on a Hadoop cluster.
- Assisted in designing, building and maintaining databases to analyze the life cycle of claims processing and transactions.
- Scheduled jobs using Oozie and tracked their progress.
- Analyzed web logs using Hadoop tools for operational and security-related activities.
- Evaluated business requirements and prepared detailed specifications that follow project guidelines required to develop written programs.
Environment: Cloudera HDFS, Hive, Pig, HBase, Zookeeper, Sqoop, Java, JDBC, Subversion, JUnit, SQL, Oracle, XML, PuTTY, Eclipse and RESTful web services.
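The Oozie scheduling mentioned above could be sketched as a minimal workflow definition with a single Hive action (the workflow name, script name and parameterized paths are hypothetical):

```xml
<!-- Minimal Oozie workflow sketch; names and paths are hypothetical -->
<workflow-app xmlns="uri:oozie:workflow:0.2" name="weblog-analysis-wf">
    <start to="hive-node"/>
    <action name="hive-node">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>weblog_analysis.hql</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Hive job failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

A coordinator definition would then reference this workflow to run it on a daily schedule and track progress through the Oozie console.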
Java Developer
Confidential, Birmingham, AL
Responsibilities:
- Involved in many phases of SDLC such as requirements analysis, preparation of specifications document, design, development, deployment and unit test
- Designed implementation logic for core functionalities and screen validations
- Developed service-layer logic for core modules using JSPs; implemented presentation-layer logic using HTML, CSS, JavaScript and XML.
- Design of Oracle database tables to store customer's general and billing details
- Used JDBC connections to store and retrieve data from the database.
- Development of complex SQL queries and stored procedures to process and store the data
- Used Ant as the build tool to configure the application.
- Developed test cases using HP Quality Center
- Involved in code deployment, unit testing and bug fixing.
- Prepared design documents for code developed and defect tracker maintenance.
Environment: Java Server Pages (JSP), Java, HTML, CSS, XML, JavaScript, Apache Tomcat, Ant, SQL and Shell Scripting
Software Engineer
Confidential
Responsibilities:
- Maintained the UI screens using web technologies like HTML, JavaScript, jQuery and CSS.
- Worked on the entire SDLC while working on few enhancements
- Provided regular updates to the customers on the status of enhancements
- Coded assigned tasks and performed peer reviews.
- Documented the changes for future development projects
- Prepared test cases for QA team
- Collaborated with QA team in testing the applications
- Involved in code deployment, unit testing and bug fixing.
- Prepared design documents for code modified and ticket maintenance.
Environment: Java, HTML, CSS, XML, JavaScript, jQuery, Apache Tomcat, Ant, SQL, PL/SQL and Shell scripting