ETL Developer Resume Profile
NY
Summary:
- Over 8 years of experience in Information Technology, currently working as an ETL/Hadoop developer. Involved in architecting, consulting, and providing software solutions to businesses across various domains, for clients including the New York Times, Johnson & Johnson, TIAA-CREF, HISNA, and Hyundai Capital.
- Cloudera Certified Apache Hadoop Developer (CCDH-410)
- Areas of interest include distributed computing, customer insights, e-commerce, trend analysis, sentiment analysis, data mining, cloud computing, and functional and technical evaluations of new products.
- Close to 2 years of experience exclusively on the Hadoop ecosystem: HDFS, MapReduce, Hive, Pig, Flume, Oozie, Impala, and Sqoop.
- Architected and configured single-node and multi-node Hadoop clusters (CDH4/CDH5) with Cloudera Manager, including setting up Hadoop clusters on AWS EC2/S3 and EMR.
- Hands-on experience with NoSQL databases such as HBase and MongoDB.
- Extensive experience in ETL architecture, analysis, design, development, testing, implementation, maintenance, and support of Siebel Warehouse, Enterprise Data Warehouse (EDW), and Business Intelligence (BI) solutions using OLAP.
ETL: Informatica 6.x, 7.x, 8.x and 9.x
- Design and development of an enterprise-wide data warehouse.
- Informatica server administration and platform upgrades
- ETL Performance tuning involving Query Optimization, Memory Management and Network Bandwidth Utilization.
Business Intelligence:
- Design, configuration, and troubleshooting of Oracle BI Suite 7.x, 10.x, and 11.x.
- Techno-Functional consulting on Data visualization for Oracle BI Dashboards.
- OBIEE server administration.
- Worked with various schedulers, including Oracle DAC, Autosys, and Control-M.
- Databases: Oracle 9i, 10g, 11g; Teradata; Netezza; MS SQL.
- OS/scripting: Unix Shell Scripting and Windows batch scripting.
- Project execution: requirements gathering, impact analysis, solution proposals per industry standards, onshore/offshore coordination, end-user interaction, release management, translation of business requirements into technical design, and enterprise-wide application deployment guidelines.
- Project implementation: involved in all phases of the Software Development Life Cycle.
Professional Experience:
Confidential
ETL/Hadoop Developer
Roles and Responsibilities:
- Requirements gathering, analysis, and design coordination.
- Architected and built a scalable distributed data solutions system using the Hadoop ecosystem (Hadoop, Pig, Hive, Sqoop, Impala, HBase).
- Implemented a 50-node Cloudera CDH4 Hadoop cluster on AWS (SUSE Linux).
- Extensively worked on Pig scripts and Pig UDFs to perform ETL activities.
- Worked on Pig scripts to create file dumps for downstream analytical processing.
- Worked on performance tuning of Pig queries.
- Created Hive tables and worked on HiveQL UDFs for ad-hoc reporting.
- Hands-on experience with Java MapReduce jobs handling different file types, including text, XML, and JSON.
- Worked on Sqoop jobs to import and export data between RDBMS sources and HDFS.
- Handled incremental data loads from RDBMS into HDFS using Sqoop (a representative pipeline is sketched after this list).
- Supported QA environment setup and configuration updates for implementing Pig and Sqoop scripts.
- Used Apache Maven extensively to build jar files of MapReduce programs and deployed them to the cluster.
- Hands-on experience with RHadoop for statistical analysis of IVR data.
- Hands-on experience with machine learning and NLP.
- Responsible for admin activities, including monitoring cluster health, adding/decommissioning data nodes, and job scheduling/monitoring.
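The incremental Sqoop-load-plus-Pig-transform pattern above can be illustrated with a minimal shell sketch; the connection string, table, paths, watermark file, and Hive partition scheme below are hypothetical placeholders, not actual project code:

    #!/bin/sh
    # Illustrative only: pull new rows from an RDBMS, transform with Pig,
    # and publish to Hive for ad-hoc reporting. All names are placeholders.
    set -e

    # Incremental import: only rows with RECORD_ID beyond the saved watermark.
    sqoop import \
        --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
        --username etl_user --password-file /user/etl/.pw \
        --table CALL_RECORDS \
        --target-dir /data/raw/call_records \
        --incremental append \
        --check-column RECORD_ID \
        --last-value "$(cat /var/etl/last_value)"

    # Clean and aggregate the new extract with a parameterized Pig script.
    pig -param INPUT=/data/raw/call_records \
        -param OUTPUT=/data/curated/call_records \
        transform_calls.pig

    # Expose the curated output to Hive queries as a dated partition.
    hive -e "ALTER TABLE call_records ADD IF NOT EXISTS PARTITION (load_dt='$(date +%F)') LOCATION '/data/curated/call_records'"

Updating the last-value watermark after a successful run is part of the same job but omitted here for brevity.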
Tools Used:
Hadoop 0.23.9, Hive 0.9.0, Pig 0.10.0, Oozie 2.3.2, AWS, Sqoop 1.3.0, SUSE Linux, Informatica 8.1.1 and 9.5.1, Oracle 10g and 11g, SQL Developer, UNIX, Sun Solaris
Confidential
Senior BI Developer
Roles and Responsibilities:
- Worked on the upgrade of OBIEE 10.1.3.4 to OBIEE 11g.
- Primary roles included onsite coordinator and lead for requirements gathering, impact analysis, and migration of reports and dashboards from OBIEE 10.1.3.4 to 11g.
- Loading data from flat files and other databases.
- Restructured the OBIEE repository to complement the upgraded data warehouse.
- Data analysis and ad-hoc report generation.
- Requirements gathering, impact analysis, data analysis, integration, solution proposals, business-to-technical translation, low-level design, and onshore/offshore coordination.
- Unit and system integration (SIT) testing.
Tools Used:
Informatica 8.6.1, OBIEE 10.1.3.4, OBIEE 11g, Oracle 10g, Linux, Toad, SQL Developer, Netezza
Confidential
Data Integration Analyst
Roles and Responsibilities:
- Design, development, troubleshooting, and enhancement of Informatica mappings to extract and load data into the data warehouse.
- Designed star and snowflake schemas and SCDs to fit business needs.
- Loading data from flat files and other databases.
- Built ad-hoc OBIEE reports for forecasting and data analysis.
- Requirements gathering, impact analysis, data analysis, integration, solution proposals, business-to-technical translation, low-level design, and onshore/offshore coordination.
- Informatica DAC administration.
- SQL tuning and performance analysis.
- UNIX shell scripting and job scheduling using Autosys and CA7 (a representative wrapper is sketched after this list).
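A plausible shape for the Autosys-driven wrapper scripts behind this scheduling work (illustrative sketch; the integration service, domain, folder, and workflow names are hypothetical placeholders):

    #!/bin/sh
    # Illustrative only: run an Informatica workflow via pmcmd and
    # propagate the result as an exit code the scheduler can act on.
    WF=wf_LOAD_SALES_DW

    pmcmd startworkflow \
        -sv INT_SVC -d DOM_PROD \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f FOLDER_DW -wait "$WF"
    rc=$?

    if [ $rc -ne 0 ]; then
        echo "Workflow $WF failed with exit code $rc" >&2
    fi
    exit $rc

Schedulers such as Autosys and CA7 generally key job success or failure off the wrapper's exit status, hence the explicit propagation.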
Tools Used:
Informatica 8.6.1, OBIEE 10.1.3.4, Oracle 10g, Teradata, Unix, Toad, SQL Developer, Autosys, CA7
Confidential
ETL Developer
Roles and Responsibilities:
- Worked on data warehouse design.
- Requirements gathering, design, and configuration of the Informatica application for all ETL activities pertaining to the different reporting entities.
- Developed Informatica workflows for all data flows from Siebel and external systems to the SRMW data warehouse.
- Unit testing and system integration testing of all Informatica jobs for OLAP.
- UNIX shell scripting to automate the Informatica jobs.
Tools Used:
OBIEE 10.1.3.4, Informatica Power Center 7.1.3, Win NT, Oracle, and UNIX
Confidential
Data Integration Analyst
Roles and Responsibilities:
- Impact and gap analysis for the Informatica upgrade from 6.2.2 to 7.1.2.
- Customized all Informatica mappings, workflows, and batch/shell scripts.
- Informatica Mapping and workflow development, troubleshooting and deployment.
- Data analysis, integration, cleansing and database migration.
- Informatica Application administration.
- Coordination with Onshore teams.
Tools Used:
Siebel Analytics 7.8.5.1, Informatica 6.2.2, Siebel Analytics 7.8.5.2, Informatica 7.1.2, DAC 7.8.4, Oracle 10g, Toad, SQL Developer, UNIX
Confidential
ETL Developer
Roles and Responsibilities:
- Requirements gathering and preparation of high-level and low-level design documents.
- Developed Informatica mappings and workflows to load data into the data warehouse.
- Unit testing and systems integration testing.
- UNIX shell scripting for scheduling and automation of ETL jobs.
- Post-production support for bugs and enhancements.
Tools Used:
Informatica Power Center 7.1.2, Win 2000, Oracle 10g, OBIEE 10.1.3.4
Confidential
ETL Developer
Roles and Responsibilities:
- Developed Informatica jobs to import data from legacy mainframes into Siebel CRM staging tables.
- Developed Siebel EIM jobs to load data into Siebel CRM.
- Post-production support for bug fixes and enhancements related to the Informatica and Siebel EIM data loads.
- Unit and systems integration testing.
- Loading data from flat files and other databases.
- Siebel EIM data administration and data mapping.
Tools Used:
Informatica Power Center 7.1.2, Win NT, Oracle 9i, IBM AIX
Confidential
Data Integration Analyst
Roles and Responsibilities:
- Requirements gathering, design, and development of Informatica ETL loads for the OLAP data warehouse and Siebel EIM.
- Loading data from flat files and other databases.
- Data analysis, integration, cleansing and database migration.
- Siebel EIM and Data Mapping.
- UNIX shell scripting and job scheduling using the Redwood tool.
- Coordination with Onshore teams.
Tools Used:
Informatica Power Center 7.1.2, Win NT, Oracle 9i, IBM AIX