
Hadoop Developer Resume


Charlotte, NC

SUMMARY

  • 10 years of IT experience and expertise in Hadoop, HDFS, HBase, Hive, Sqoop, Oozie, SQL, PL/SQL, Teradata, Netezza, and SQL Server, with hands-on project experience in vertical applications including Banking, Financial Services, Department of Health & Education, and eSales.
  • Highly experienced Big Data Engineer with a deep understanding of the Hadoop Distributed File System and ecosystem (HDFS, MapReduce, Hive, Sqoop, Oozie, ZooKeeper, HBase, Flume, Pig, Apache Storm, Solr, and Apache Kafka) in a range of industries, including the banking and financing sectors.
  • Well versed in all stages of the Software Development Life Cycle (SDLC): requirements gathering and analysis, design/redesign, implementation, and testing.
  • Hands-on experience developing MapReduce programs using Apache Hadoop for analyzing Big Data.
  • Experience across the Big Data Hadoop ecosystem: ingestion, storage, querying, processing, and analysis of big data.
  • Experienced in optimizing the sort and shuffle phases of MapReduce programs, and implemented optimized joins that combine data from different data sources.
  • Experience importing and exporting data between RDBMS and HDFS, Hive tables, and HBase using Sqoop.
  • Hands-on experience migrating complex MapReduce programs to Apache Spark RDD transformations.
  • Experienced in defining job flows and in managing and reviewing Hadoop log files.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data, and managed data coming from different sources.
  • Experience with job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Automated jobs that pull data from an FTP server and load it into Hive tables, using Oozie workflows.
  • Proficient in RDBMS concepts with Oracle, SQL Server, Teradata, and Netezza.
  • 4+ years of experience as an Oracle Developer in system analysis, design, development, testing, and support of Oracle 8i, 9i, 10g, and 11g in production, staging, and development environments.
  • Extensive experience and expert-level coding ability in Oracle SQL, PL/SQL, and Teradata.
  • Hands-on experience with SQL, PL/SQL programs, packages, stored procedures, triggers, cursors, dynamic SQL, SQL*Loader, SQL*Plus, UNIX shell scripting, performance tuning, and query optimization.
  • Involved in database design, tuning, triggers, functions, materialized views, Oracle Job Scheduler, and Oracle Advanced Queuing.
  • Efficient in writing complex SQL queries and hierarchical queries using analytic functions and regular expressions; familiar with the new features in Oracle 11g.
  • Effectively used PL/SQL features such as bulk collections, bulk exceptions, bulk binds, ref cursors, multi-table inserts, and SQL types in bulk operations for better performance and readability.
  • Expert database skills, using SQL and TOAD for debugging applications.
  • Experience with performance tuning and explain plans, SQL*Loader, DBMS Scheduler, UTL_FILE, triggers, indexes, import and export utilities, and shell scripting for automated processing.
  • Hands-on experience with query tools such as TOAD (for Oracle, SQL Server, and Data Point), SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Aginity.
  • Documented projects: Functional Requirement Specification (FRS), use case specification, HLD/LLD, user on-boarding document, and run book.
  • Worked in a fast-paced Agile environment and attended daily scrum meetings.
  • Proactively interacted with business analysts, technical analysts, developers, testers, and external clients.
  • Excellent analytical, problem-solving, communication, and interpersonal skills for managing and interacting with individuals at all levels. Able to interact effectively with other members of business engineering, quality assurance, and other teams involved in the system development life cycle.
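The PL/SQL bulk-processing features listed above can be illustrated with a short sketch. All table and column names here are hypothetical, used only to show the pattern of BULK COLLECT with a LIMIT, FORALL with SAVE EXCEPTIONS, and per-row error reporting:

```sql
-- Illustrative sketch only: stg_transactions / fact_transactions are hypothetical tables.
DECLARE
  CURSOR c_src IS SELECT id, amount FROM stg_transactions;
  TYPE t_rows IS TABLE OF c_src%ROWTYPE;
  l_rows      t_rows;
  bulk_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_errors, -24381);  -- ORA-24381: error(s) in array DML
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;  -- fetch in batches to bound memory
    EXIT WHEN l_rows.COUNT = 0;
    BEGIN
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS   -- one context switch per batch
        INSERT INTO fact_transactions (id, amount)
        VALUES (l_rows(i).id, l_rows(i).amount);
    EXCEPTION
      WHEN bulk_errors THEN
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP  -- report failed rows, keep going
          DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                               ' failed: ' || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
        END LOOP;
    END;
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```

The LIMIT clause bounds PGA memory per batch, while SAVE EXCEPTIONS lets the bulk insert continue past individual row failures instead of aborting the whole batch.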

TECHNICAL SKILLS

Big Data / Hadoop: HDFS, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Flume, Oozie, Spark, Apache Kafka

Databases & Languages: HiveQL, Oracle (SQL, PL/SQL), Teradata, Netezza, SQL Server, XML, HTML, CSS

Methodologies: Agile, Waterfall model

Query Tools: TOAD, PL/SQL Developer, SQL Developer, and SQL*Plus

GUI Tools: Oracle Forms 6i / 9i /10g / 11g, Oracle Reports 6i / 9i /10g / 11g.

Operating Systems: Windows 95/2000/XP/Vista/7, UNIX, Linux 5.5

Other Tools: SVN, Putty, Super Putty, TWS, PVCS, WinSCP.

PROFESSIONAL EXPERIENCE

Confidential - Charlotte, NC

Hadoop Developer

Responsibilities:

  • Involved in writing MapReduce jobs.
  • Used Hive for transformations, event joins, traffic filtering, and pre-aggregations before storing the data in HDFS.
  • Developed Hive queries and UDFs for functionality not available out of the box in Apache Hive.
  • Used Sqoop for importing and exporting data to and from HDFS and Hive.
  • Extracted user data from various data sources into Hadoop HDFS.
  • Implemented Commissioning and Decommissioning of new nodes to existing cluster.
  • Developed MapReduce programs to cleanse the data in HDFS obtained from heterogeneous data sources to make it suitable for ingestion into Hive schema for analysis.
  • Used the Oozie workflow engine to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs, such as Java MapReduce, Hive, and Sqoop, as well as system-specific jobs.
  • Used Avro and Parquet in MapReduce jobs with Hadoop, Sqoop, Hive, and Impala.
  • Collected and aggregated large amounts of log data, staging it in HDFS for further analysis.
  • Analyzed the data by performing Hive queries and running Pig scripts to know user behavior.
  • Participated in evaluation and selection of new technologies to support system efficiency.
  • Participated in development and execution of system and disaster recovery processes.
  • Involved in preparing the Proof of Concept and the Presentations to demonstrate the solution to the business users on Data Integration.
  • Worked with the Agile Scrum methodology.
  • Analyzed new opportunities for the group, including daily interaction with the team to understand the business flow and apply technology to improve the time efficiency of business workflows.

Environment: Hadoop, Hive 1.2, Oozie, SQL Developer, TOAD, Oracle, Data Point, Agile (VersionOne), Windows 8, Unix, Teradata SQL Assistant, Aginity, SQL Server.
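The Hive pre-aggregation and join work described in this role might look like the following sketch. The table names, columns, partition date, and filter are hypothetical, chosen only to show the shape of such a query:

```sql
-- Hypothetical HiveQL sketch: join raw events to a user dimension,
-- filter unwanted traffic, and pre-aggregate daily before storing in HDFS.
INSERT OVERWRITE TABLE daily_user_traffic PARTITION (dt = '2015-06-01')
SELECT u.user_id,
       COUNT(*)     AS events,
       SUM(e.bytes) AS total_bytes
FROM   raw_events e
JOIN   dim_users  u ON e.user_id = u.user_id
WHERE  e.dt = '2015-06-01'
  AND  e.event_type <> 'bot'   -- filter non-user traffic
GROUP BY u.user_id;
```

Pre-aggregating like this before downstream analysis keeps the data scanned by later queries small, which is the usual motivation for the pattern.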

Confidential - Albany, NY

Hadoop and Oracle Developer

Responsibilities:

  • Worked to understand the user requirements for the Regional Office and how they relate to the existing NYSE-CON project.
  • Developed the application using PL/SQL.
  • Involved in the complete SDLC of a big data project, including requirement analysis, design, coding, testing, and production.
  • Developed scripts and AutoSys jobs to schedule a bundle (a group of coordinators) consisting of various Hadoop programs using Oozie. Worked with the database specialist and technical architect on the design of the application.
  • Created Hive tables with appropriate static and dynamic partitions for efficiency, and worked on them using HiveQL.
  • Used Sqoop to import data from RDBMS into hive tables.
  • Managed and reviewed Hadoop logs.
  • Responsible for moving the source code to Production.
  • Involved in gathering the requirements, Documenting and Review from the work streams & performance teams.
  • Created Visio diagrams for the complete flow of the application.
  • Involved in mock-up design work with the Java architect and analyst for the UI.
  • Responsible for moving the source code to UAT.
  • Responsible for installation of Oracle software on Windows.
  • Other duties as assigned.

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Linux, XML, Cloudera CDH3/4 Distribution, Oracle 11i, MySQL, Flume, Oozie, HBase
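The static and dynamic Hive partitioning mentioned in this role can be sketched as follows; the table names, schema, and HDFS location are hypothetical:

```sql
-- Hypothetical sketch: table partitioned by load_date and region.
CREATE EXTERNAL TABLE IF NOT EXISTS orders (
  order_id BIGINT,
  amount   DOUBLE
)
PARTITIONED BY (load_date STRING, region STRING)
STORED AS PARQUET
LOCATION '/data/warehouse/orders';

SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;  -- required only when no static partition key is given

-- load_date is supplied statically; region is resolved dynamically
-- from the last column of the SELECT list.
INSERT OVERWRITE TABLE orders PARTITION (load_date = '2014-03-01', region)
SELECT order_id, amount, region
FROM   staging_orders
WHERE  load_date = '2014-03-01';
```

Static partitions suit loads where the partition value is known up front (e.g. a daily batch), while dynamic partitions let one insert fan data out across many partition values in a single pass.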

Confidential - Lafayette, IN

Sr. Data Analyst & Report Developer

Responsibilities:

  • Analyze, design, implement and test ad-hoc and standard reports using University Development systems.
  • Work with customers, which include Development Officers, Purdue Alumni Association and the Development Senior Team, to provide timely, accurate, consistent and meaningful data utilizing a variety of reporting tools.
  • Created and maintained Tables, views, procedures, functions, packages, DB triggers, and Indexes.
  • Develop special reports using SQL, Cognos Reporting Suite and other Business Intelligence tools (e.g. Advizor) as necessary.
  • Create highly complex prompted reports, troubleshoot, and fix reports.
  • Provide report content for approval and submission to outside entities, including the Council for Aid to Education (CAE), the Council for Advancement & Support of Education (CASE), and other publications.
  • Design and model data projects and partner with IT Enterprise Applications to create reporting structures.
  • Work with programmer analysts in IT Enterprise Applications and the Systems Administration team to assist in testing system changes.
  • Prepare major annual reports and assist in special reporting projects both internally and externally.
  • Create and maintain report documentation.
  • Interact with clients to understand reporting needs and assist in optimizing outcomes.
  • Analyze requests for information and recommend standard or best-practice report formats.
  • Provide reports as necessary for various customers (Development Officers, Purdue Alumni Association and Development Senior Team).
  • Direct internal customers to existing report tools whenever possible to meet reporting needs.
  • Assist the Assistant Director of Advancement Information Systems as assigned on special projects and/or in fulfilling special customer needs.

Environment: Oracle 10g, PL/SQL Developer 10.0.2, Waterfall, Windows

Confidential - Cincinnati, OH

Hadoop Developer

Responsibilities:

  • Fully involved in requirement analysis and documentation of the requirement specification.
  • Involved in the project from analysis to production implementation, with emphasis on identifying sources and validating source data, developing logic and transformations per the requirements, and creating mappings to load the data into target tables.
  • Used Flume scripts to move streamed data to HDFS.
  • Moved data from Hive tables into Cassandra for real-time analytics.
  • Used Cassandra Query Language (CQL) to perform operations against Cassandra data.
  • Used Sqoop to import data from RDBMS into hive tables.
  • Developed MapReduce jobs in Java to preprocess data.
  • Created Hive internal/external tables and worked on them using HiveQL.
  • Managed and reviewed Hadoop logs.
  • Responsible for all post-production support and bug fixes.
  • Responsible for managing data coming from different data sources.
  • Developed the framework, reusable components and utility classes, which are commonly used in the project.
  • Involved in HDFS maintenance and loading of structured and unstructured data.
  • Responsible for moving the source code to UAT.
  • Involved in Debugging and resolving the problem.

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Linux, XML, Cloudera CDH3/4 Distribution, Oracle 11i, Flume, Oozie, HBase
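The Cassandra work described above (loading Hive output into Cassandra and querying it with CQL) might be modeled like this sketch; the keyspace, table, and data are hypothetical:

```sql
-- Hypothetical CQL sketch: a table keyed for the real-time lookups described.
CREATE KEYSPACE IF NOT EXISTS analytics
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};

-- Partitioned by user, clustered by time so recent events read fastest.
CREATE TABLE IF NOT EXISTS analytics.user_activity (
  user_id    text,
  event_time timestamp,
  event_type text,
  PRIMARY KEY (user_id, event_time)
) WITH CLUSTERING ORDER BY (event_time DESC);

-- Real-time lookup: most recent events for one user, hitting a single partition.
SELECT event_time, event_type
FROM   analytics.user_activity
WHERE  user_id = 'u123'
LIMIT  10;
```

In Cassandra the table layout is driven by the query: the partition key (`user_id`) matches the lookup pattern, and the clustering order makes "latest N events" reads cheap.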

Confidential - Rochester, NY

PL/SQL Developer

Responsibilities:

  • Analysis of business requirements and major developments.
  • Involved in gathering requirements, documentation, and review with the CDB/VBS teams, both offshore and onshore.
  • Handled the CDB database end to end.
  • Responsible for moving the source code to Production.
  • Production changes were a major part of this role.
  • Scheduled review & status updates meetings with the stakeholders and off shore team.
  • Scheduled XMX/IMM meetings weekly and MOM process monthly.
  • Based on the feedback, small enhancements have been implemented into the CDB/VBS.
  • Used the XMS application.
  • Involved in resolving the problem and requesting a solution in Problem Management and Change Management using MKS.
  • Led coordination between the offshore (India) and onsite development teams.
  • Worked with Oracle Business Object Developers in planning sessions with PM, BA and team members to analyze business requirements.
  • Worked with Test Environment Management team in solving the raised issues.
  • Supported developers for efficient SQL query and performance and maintain the database using PL/SQL.
  • Created and maintained the contract database per the required schedule dates, and obtained approvals before moving to production.
  • Provided transition and knowledge transfer to new project members.

Environment: Oracle 10g, TOAD, SQL Developer 3.1, Putty 0.62, RCS, Windows 7, SQL*Plus, MySQL, UNIX, Forms & Reports 10g, Citrix, MKS, Agile.

Confidential, MN

Tech Lead/Project Coordinator

Responsibilities:

  • Maintaining the Operation Readiness Review database for the LVTS.
  • Analysis of business requirements and major developments.
  • Involved in gathering requirements, documentation, and review with the SLMs, work streams, and performance testing teams.
  • Handling the ORR database completely.
  • Responsible for moving the source code to Production.
  • Involved in Pre-Production Changes that are part of a Pre-Production Environment.
  • Scheduled review and status-update meetings with the stakeholders and the offshore team.
  • Based on the feedback, small enhancements have been implemented into the ORR database.
  • Involved in resolving the problem and requesting a solution in Problem Management and Change Management.
  • Created ManageNow IDs as per requirements.
  • Led coordination between the offshore (India/China) and onsite development teams.
  • Worked with Oracle Business Object Developers in planning sessions with PM, BA and team members to analyze business requirements.
  • Replicated databases using database links and Streams for disaster recovery.
  • Worked with Test Environment Management team in solving the raised issues.
  • Supported developers for efficient SQL query and performance.
  • Developed Oracle Reports and maintained the database using PL/SQL.
  • Created the Environment Tasks Plan for Monitoring, Readiness, DR setup & other tasks.
  • Created the Batch jobs using TWS.
  • Wrote shell scripts to check the web and application tiers, provided production support, and played a major role in debugging errors during implementation and deployment.
  • Provided transition and knowledge transfer to new project members.

Environment: Oracle 10g, ORR database, GIM Application, TWS, SQL & PL/SQL Development, Forms & Report, Dynamic SQL, SQL*Loader, Agile, Oracle SQL Developer, UNIX Shell Scripting and Windows XP.

Confidential

PL/SQL Developer

Responsibilities:

  • Developing the Oracle reports based on the project requirements.
  • Conducting the unit testing and performance testing on the reports.
  • Experience in Oracle back-end programming: SQL and PL/SQL procedures, functions, triggers, and packages for the Oracle database server, and PL/SQL for Oracle Application Server.
  • Wrote DDL Scripts to create new database objects like tables, views, sequences, synonyms, and indexes.
  • Created Materialized views and Materialized view logs.
  • Wrote extensive exception-handling procedures.
  • Generated monthly reports in different formats like tabular, form and matrix using Oracle Reports.
  • Created LOVs for data elements in Oracle Forms.
  • Created Master/Detail forms.
  • Wrote SQL scripts to perform back-end testing of the Oracle database.
  • Involved in writing Unit Test Cases and Test Scripts for Manual Testing from Use Cases.
  • Involved in debugging and Tuning the PL/SQL code, tuning queries, optimization for the Oracle database.
  • Attended Design meetings, code review and test review meetings.
  • Loaded data from other databases and text files into the Oracle database using SQL*Loader.
  • Extensive experience in loading high volume data and Performance tuning.
  • Experience in Performance Tuning using various types of Hints, Partitioning and Indexes.
  • Expertise in handling errors using Exceptions.
  • Attended meetings and reviews to discuss current issues and the processes to address them.
  • Worked as a developer: performed requirement analysis, design, development (Oracle PL/SQL), testing, and implementation, and provided early post-production support.
  • Gathered the requirement from the client and translated the business design into technical design.
  • Involved in creating Procedures, Functions, Packages, and Triggers. Creating reports using Reports 6i.
  • Extensively involved in writing SQL queries (sub queries and join conditions) for generating complex reports.
  • Involved in the development of UNIX shell scripts.
  • Performed tuning of the SQL queries using Explain Plan.

Environment: Oracle 9i, SQL & PL/SQL Development, Oracle Forms and Reports 6i, Agile, Waterfall, HTML & CSS, Windows 98/2000/XP.
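The SQL*Loader work described in this role typically revolves around a control file. The following sketch is hypothetical (file names, table, and columns invented for illustration) and shows loading a comma-delimited text file into an Oracle table:

```sql
-- Hypothetical SQL*Loader control file (orders.ctl).
LOAD DATA
INFILE 'orders.dat'
BADFILE 'orders.bad'
APPEND
INTO TABLE orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  order_id    INTEGER EXTERNAL,
  customer_id INTEGER EXTERNAL,
  order_date  DATE "YYYY-MM-DD",
  amount      DECIMAL EXTERNAL
)
-- invoked from the shell, e.g.:
--   sqlldr userid=scott/tiger control=orders.ctl log=orders.log
```

Rejected rows land in the BADFILE for inspection, and the log file reports counts of loaded versus discarded records, which is how load runs are usually verified.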
