Hadoop Developer Resume
SUMMARY
- 13 years of experience in Data Integration implementations for Enterprise Data Warehouse, Master Data Management, Data Lake Ingestion, and ETL/ELT solutions in traditional and Hadoop ecosystems.
- Extensively worked on the Cloudera Hadoop platform using HiveQL, Impala, Sqoop & Oozie.
- Experience in ingesting mainframe VSAM files into a Hadoop Data Lake using DataStage 11.5.
- Hands-on experience in designing and creating near real-time jobs using MQ connectors in DataStage 9.1.
- Extensive experience in building and maintaining Data Warehouse systems using IBM DataStage v11.5/v9.1/v8.5/v8.1.x/v7.5 (Administrator, Director, Manager, Designer).
- Extensively worked on building DataStage jobs to process and merge data using stages such as Transformer, Modify, Funnel, Join, Merge, Lookup, Pivot, and Aggregator.
- Experience in creating DDL scripts for the Operational Data Store and ETL source-to-target mapping documents.
- Extensively worked on Oracle 9i, Teradata 7, IBM DB2 for z/OS 9.7, and IBM UDB 8.0.
- Worked on defining dependencies and scheduling DataStage jobs using enterprise scheduling tools such as Control-M, ESP, and TWS.
- Created Oozie workflows to invoke and operationalize multiple action types, such as Hive and Shell actions.
- Expertise in writing DB2 SQL queries and UNIX shell scripts.
- Designed and developed managed and external tables in Hive (see the HiveQL sketch following this summary).
- Experience in designing, developing, and implementing ETL frameworks covering source integration, staging, homogenization, and error and audit processes.
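A minimal HiveQL sketch of the managed vs. external table pattern referenced above; the table names, columns, file formats, and HDFS path are illustrative assumptions rather than details from any specific project.

```sql
-- Managed (internal) table: Hive owns both the metadata and the data files;
-- dropping the table also deletes the underlying files.
CREATE TABLE stg_customer (
  customer_id   BIGINT,
  customer_name STRING,
  load_ts       TIMESTAMP
)
STORED AS ORC;

-- External table: Hive tracks only the metadata; the files at the Data Lake
-- location survive a DROP TABLE, which suits shared, ingested raw data.
CREATE EXTERNAL TABLE lake_customer (
  customer_id   BIGINT,
  customer_name STRING,
  load_ts       TIMESTAMP
)
STORED AS PARQUET
LOCATION '/data/lake/customer';  -- hypothetical HDFS path
```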
TECHNICAL SKILLS
- Cloudera Hadoop platform: Hue, Hive, Impala, Sqoop & Oozie; IBM BigInsights Hadoop platform (Ambari)
- Rally
- Tableau & WebFOCUS
- IBM DataStage Enterprise Edition 8.1, 8.5, 8.7 and 9.1
- SQL, HiveQL, Impala, SparkSQL
- Erwin
- Microsoft Visio for creating DFDs
- Control-M and ESP
- IRIS (Worldpay in-house), HP Service Manager and BMC Remedy
- IBM DB2 v9.7, Oracle 10, Teradata, SQL Server
- UNIX and Linux
PROFESSIONAL EXPERIENCE
Confidential
Hadoop Developer
Responsibilities:
- Project planning, status reporting, team management, work allocation, analysis, design, coding, debugging, unit testing, and system testing.
- Working in a fast-paced, dynamic environment, focusing on multiple projects and managing strategic and complex projects, enhancements to client-facing applications, and onboarding of new projects.
- Developed complex ELT/ETL solutions on the Hadoop platform using HiveQL and Impala.
- Developed Oozie workflows to manage and schedule the data flows.
- Used Sqoop to extract data from relational databases into the Hadoop environment (see the shell sketch following these responsibilities).
- Involved in the end-to-end SDLC: gathering and analyzing requirements, creating use cases, working with clients to answer developer clarifications, conducting user acceptance testing, deployment, and post-deployment defect tracking.
- Conducted review sessions and finalized the execution strategy with project stakeholders.
- Reviewed performance techniques and best practices.
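A hedged shell sketch of the ingestion pattern described above: a Sqoop import from a relational source into a Hive staging table, followed by submitting the Oozie workflow that runs the downstream Hive and shell actions. The connection string, table names, password file, and properties file are placeholders, not details from the actual engagement.

```bash
#!/bin/bash
# Sqoop import from a DB2 source into a Hive staging table
# (placeholder host, database, credentials, and table names).
sqoop import \
  --connect jdbc:db2://db2host:50000/SRCDB \
  --username etl_user \
  --password-file /user/etl_user/.db2.password \
  --table CUSTOMER \
  --hive-import \
  --hive-table stg.stg_customer \
  --num-mappers 4

# Submit the Oozie workflow that orchestrates the downstream Hive/shell actions.
oozie job -oozie http://oozie-host:11000/oozie \
  -config customer_ingest.properties -run
```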