Lead/Sr. ETL Developer Resume
Summary
- About 8 years of experience in the IT industry in the analysis, design, development, testing, maintenance and implementation of Enterprise Data Warehouse (EDW) applications, with a focus on ETL (Extraction, Transformation and Loading) for high-volume Data Warehousing and Business Intelligence projects.
- Worked as Lead, Data Warehouse Analyst, Sr. Developer, Mentor and Developer on EDW applications.
- Highly skilled in planning, designing, developing and deploying Data Warehouses and Data Marts.
- Extensively used ETL methodologies and processes in an EDW environment using Informatica Power Center 8.1/7.1/6.2/5.1 and Ab Initio GDE 1.14/1.13.
- Experience in project analysis, design, development and testing of Informatica mappings and sessions, with a strong focus on quality systems such as CMM and Six Sigma methodologies.
- Used Informatica and Ab Initio to develop transformations and mappings, and worked on process flows and scheduling.
- Good knowledge of DataStage.
- Sound knowledge of the Ralph Kimball approach, dimensional modeling concepts, data warehouse design, star schemas and snowflake schemas, combined with strong data analysis, data profiling and metadata skills.
- Thoroughly versed in UNIX shell scripting to support ETL process automation and DW tasks.
- Experience working with heterogeneous source systems such as relational databases, CSV files and flat files.
- Hands-on experience in writing, testing and implementing triggers, stored procedures, functions and packages in Oracle using SQL and PL/SQL, as well as Oracle partitioning methods (range), materialized views and optimization.
- Sound knowledge of Teradata architecture and hands-on experience in writing Teradata SQL, BTEQ scripts and macros, tuning database queries, and using Teradata utilities such as FastLoad, FastExport, MultiLoad, TPump and the Teradata Session Monitor Console (a minimal BTEQ wrapper sketch appears after this summary).
- Practical experience working across multiple environments: Production, Development and Testing.
- Hands-on experience with the scheduling tools AutoSys and Control-M.
- Involved in warehouse production support, addressing business-critical and time-critical issues.
- Experience with Technical and Functional Documentation
- Strong exposure to the full software project development life cycle.
- Creative technical problem-solver with the ability to multitask and meet milestones, ensuring high-quality, on-time delivery.
- Good communication, analytical and documentation skills; understands new ideas and technical concepts quickly and converts them into meaningful results.
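A minimal sketch of the kind of shell-driven BTEQ load referenced above; the database, table, file names and delimiter are illustrative placeholders, not details from any specific project.

```sh
#!/bin/ksh
# Minimal BTEQ wrapper: truncates a staging table, imports a delimited
# feed and checks the return code. All object and file names are placeholders.

bteq <<EOF > bteq_load.log 2>&1
.LOGON tdprod/etl_user,${TD_PASSWORD};
DELETE FROM stage_db.stg_call_detail;
.IMPORT VARTEXT '|' FILE = /data/in/call_detail.txt
.REPEAT *
USING (call_id VARCHAR(20), call_ts VARCHAR(26))
INSERT INTO stage_db.stg_call_detail (call_id, call_ts)
VALUES (:call_id, :call_ts);
.LOGOFF;
.QUIT;
EOF

if [ $? -ne 0 ]; then
    echo "BTEQ load failed - see bteq_load.log" >&2
    exit 1
fi
```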
Technical Skills
ETL Tools: Informatica Power Center 8.x/7.x/6.x/5.x, Ab Initio GDE 1.15/1.14/1.13, Oracle Warehouse Builder, DataStage
Databases: Oracle 8i, 9i, 10g and Teradata V2R5/V2R6
Reporting Tools: D3 (Dell Data Direct), Business Objects
O/Systems: Windows NT/98/2000, UNIX and Red Hat Linux.
Languages: C, C++, Java, SQL, PL/SQL, HTML, JavaScript, XML and UNIX shell scripting
Version Control: EME and VSS
Scheduling: Control-M 6.1, Autosys
Others: SQL*Plus, PL/SQL Developer, TOAD, Teradata SQL Assistant, Excel (pivot tables)
Testing Tools: Mercury Test Director 8.0, Mercury Quality Center
Education
- MCA
- Graduation
Details of Professional Experience:
Development Lead/Sr. ETL Developer, Confidential, June 2005 - till date
Project: C2G (Cradle-to-Grave), September 2009 - till date: C2G Phase I provides Dell internal customers with a standardized data model and the data elements required to report on the complete end-to-end lifecycle of an inbound call, including the infrastructure that supports call movement and ultimate delivery, and near-real-time and historical data availability through D3 for ad hoc reporting. C2G reporting provides an integrated view into CTI telephony statistics for calls and the ability to trace the routing and delivery of a transferred call. It supplies the data and tools by which the full lifecycle of a call can be measured and reported, together with a comprehensive set of "standardized" call center telephony reports.
Responsibilities as Development Lead:
- Handled project management activities such as planning, estimating resources, allocating resources, assigning tasks, tracking and reporting progress, managing and resolving issues, and defect prevention.
- Managed a team of 5 in the capacity of Team Lead, including people-management responsibilities such as appraisals and mentoring.
- Provided recommendations for Warehouse Implementation, Development practices, Report Distribution Methods, Performance Improvement, and Disaster Recovery.
- Worked on High Level and Detail Design documents.
- Was involved in validating high level designs to ensure accuracy and completeness against the business requirements
- Attended project meetings and communicated the project architecture and design to other team members
- Resolved technical issues faced by the team during the development phase
- Worked with the business users to gather Systems requirements and data requirements.
- Worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Involved in designing the procedures for moving data from various source systems into the data warehouse; the data was standardized before being stored in tables by business unit.
- Used ETL processes to transfer the data to the target database.
- Wrote shell scripts to execute Teradata procedures and wrote Teradata program units for extracting, transforming and loading data.
- Developed various Informatica mappings and mapplets using complex transformations like Aggregators, lookups, Filters, Sequence Generator, Normalizer, Update strategy and stored procedure transformations.
- Responsible for Creating workflows and worklets - Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
- Wrote pre-session and post-session shell scripts for dropping and recreating table indexes, email tasks and various other steps (see the index-rebuild sketch after this list).
- Created different types of reports, such as master/detail, cross tab and chart reports, using Business Objects for ease of UAT.
- Involved in end to end data validation by interacting with the business (UAT)
- Support during SIT, UAT and stabilization phase (moving to production)
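A minimal sketch of a post-session script of the kind described above, rebuilding an index that was dropped ahead of a bulk load; the index, table and connection details are placeholders.

```sh
#!/bin/ksh
# Post-session script: rebuild an index that was dropped before the bulk load.
# Index/table names and the connect string are placeholders.

sqlplus -s "${ORA_USER}/${ORA_PASS}@${ORA_SID}" <<EOF
WHENEVER SQLERROR EXIT FAILURE
CREATE INDEX idx_fact_call_dt ON fact_call (call_dt);
EXIT
EOF

if [ $? -ne 0 ]; then
    echo "Index rebuild failed" | mailx -s "Post-session failure" etl_support@example.com
    exit 1
fi
```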
Environment: Informatica 8.1, Oracle 10g, Teradata V2R5, Teradata Utilities (Query man, BTEQ), Red Hat Linux, Control M Scheduler, D3 Reporting, VSS 6.0, Test Director 8.0
Project: Dell On Call (DOC), March 2008 - August 2009: This project consists of two main sub-components: DOC (Dell On Call) and HH (Householding). The DOC portion provides a mechanism to monitor the usage of DOC service contracts. The HH sub-component addresses grouping customers into households, giving Dell the ability to monitor household customer behavior.
Roles and Responsibilities as Sr. ETL Developer:
- Worked with business analysts to identify and develop business requirements, translated them into technical requirements, and was responsible for deliverables.
- Involved in designing, developing and testing the ETL strategy to populate data from various source-system feeds (flat files, Oracle and SQL Server) using Informatica and PL/SQL.
- Wrote technical specification documents for mapping builds based on functional specifications.
- Extracted data from various systems into the data warehouse.
- Developed mappings as per the approved technical specifications.
- Identified and tracked slowly changing dimensions, worked with heterogeneous sources and determined hierarchies.
- Involved in data conversion from the source database to the data warehouse; worked with other team members to load historic data from files into the production data repository.
- Developed PL/SQL scripts for ETL and developed database procedures and functions.
- Migrated mappings from development to test environments and performed unit testing.
- Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
- Created mappings using different transformations like Source qualifier, Aggregators, lookups, Filters, Sequence, Stored Procedure and Update strategy.
- Created and used Reusable mappings to improve maintainability of mappings.
- Created various reusable and non-reusable tasks such as Sessions.
- Created and used different tasks like Decision and email tasks.
- Responsible for monitoring sessions that are running, scheduled, completed and failed.
- Used Workflow Manager to create, schedule, monitor sessions and send pre and post session emails.
- Preparation of UTP (Unit Test Plan) with all required validations and test cases.
- Responsible for writing shell scripts to schedule the jobs in the development environment (see the sketch below).
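A minimal sketch of a job-kickoff script of the kind referenced above, starting an Informatica workflow with pmcmd; the folder and workflow names are placeholders, and the exact pmcmd flags vary between PowerCenter versions.

```sh
#!/bin/ksh
# Starts an Informatica workflow from the shell via pmcmd and propagates
# the return code. Names and connection details are placeholders; pmcmd
# options differ between PowerCenter releases.

pmcmd startworkflow -s "${INFA_HOST}:${INFA_PORT}" \
    -u "${INFA_USER}" -p "${INFA_PASS}" \
    -f DOC_FOLDER -wait wf_doc_daily_load

rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_doc_daily_load failed with return code $rc" >&2
    exit $rc
fi
```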
Environment: Informatica 7.2, Oracle 10g, Teradata V2R5, Teradata Utilities (Query man, BTEQ), D3 Reporting, PL/SQL, TOAD, Red Hat Linux and Windows NT, VSS 6.0, Test Director 8.0
Project: EAROPE, August 2007 - February 2008: EAROPE (ETL and Report Optimization for EMEA Services) alters the ETL process to optimize EMEA Services and Business Operations reporting and to improve the internal customer experience by enabling efficient use of DATE columns within the base and rollup tables.
Roles and Responsibilities:
- Analyzing Teradata database sources (Tables, Views)
- Set up background environment for job scheduling, mailing, etc in UNIX using existing templates.
- Resolved technical issues faced by team during development phase
- Attended project meetings and communicated project architecture and design to other team members
- Set up the sandbox, sandbox parameters and graph parameters
- Created temporary tables and views for intermediate calculations in Teradata
- Developing Teradata Macros to load data into Incremental/Staging tables and from staging into base tables
- Extensively used the Ab Initio ETL tool to create parameterized graphs that extract data from diverse source systems, transform it and load it into target tables, using components such as Lookup, Reformat, Redefine Format, Filter by Expression, Join, Sort, Dedup Sorted and Run SQL, along with phases, checkpoints, multifiles, and partition/departition components.
- Created Ab Initio graphs & Teradata Macros to load tables sequentially
- Populating desired target tables, Unit Testing & Validation support
- Testing and tuning Ab Initio graphs and Teradata SQL for optimized performance
- Preparing test cases & performing unit testing and peer reviews at the end of each stage
- Developing Korn shell scripts for wrapper automation of the ETL, taking care of exception handling (see the wrapper sketch after this list)
- Involved in end to end data validation by interacting with the business (UAT)
- Checking the project in and out of the EME and maintaining version control of the source code
- Configured/Scheduled Ab Initio graphs using Control-M scheduling tool
- Support/Maintenance during the stabilization phase (moving to production)
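A minimal sketch of the kind of Korn shell wrapper described above, running a deployed Ab Initio graph script with basic exception handling; the graph name, log directory and notification address are placeholders.

```sh
#!/bin/ksh
# Wrapper for a deployed Ab Initio graph (.ksh) with basic exception handling.
# Graph name, log directory and notification address are placeholders.

GRAPH=load_emea_rollup.ksh
LOG=/var/log/etl/${GRAPH%.ksh}_$(date +%Y%m%d_%H%M%S).log

"${AI_RUN}/${GRAPH}" > "${LOG}" 2>&1
rc=$?

if [ $rc -ne 0 ]; then
    echo "Graph ${GRAPH} failed (rc=${rc}); see ${LOG}" \
        | mailx -s "ETL failure: ${GRAPH}" etl_support@example.com
    exit $rc
fi
echo "Graph ${GRAPH} completed successfully"
```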
- Performed the role of QCO by interacting with the SQA team during the CMM audits
Environment: Red Hat Linux, Ab Initio 1.13.13 Co>Op 2.13.1, PL/SQL, Teradata V2R5 and Teradata Utilities (Query man, BTEQ), Shell Scripting, Control M Scheduler, VSS 6.0, Test Director 8.0
Project: DWOA Custwatch and Service Watch, September 2006 - July 2007: The goal is to retire the Oracle mart GCPP while offering the same services (data and reporting ability/features) to customers through D3. Customers perform an additional layer of rollup on the data before running their reports, so in addition to replicating the current rollups owned by DDW, we also had to assess, document and replicate the changes run by customers and develop the features they require in D3 (dashboard). The source data comes from Teradata, removing dependencies on the Oracle marts (estrp, jirp2, dirp) and map3 rollups.
Roles and Responsibilities as ETL Developer:
- Worked on gathering requirements and business rules from business users.
- Scheduled modeling sessions with BO consultants and the project team to finalize the technical design.
- Was involved in planning, development and unit testing for different modules.
- Worked on converting functional specification to technical design.
- Created both logical data models and physical data models.
- Involved in design phase, worked on SDS and mapping document (technical specifications).
- Experience with Tables, Views, Procedures and Packages in Oracle, Unix scripts, Ab Initio graphs
- Created environment in Unix for job scheduling
- Created Control-M (job scheduling software) jobs
- Created and modified PL/SQL code in Oracle, and created and modified table definitions
- Set up the sandbox, sandbox parameters and graph parameters
- Validated data types between Source and Target systems
- Worked on testing and tuning Ab Initio graphs and Oracle SQL for optimized performance
- Worked on preparing test cases & performing unit testing and peer reviews at end of each stage
- Developed Korn Shell scripts for wrapper automation of ETL by taking care of exception-handling
- Involved in end to end data validation by interacting with the business (UAT)
- Checked the project in and out of the EME and maintained version control of the source code
- Configured/Scheduled Ab Initio graphs using Control-M scheduling tool
- Created sample standard and ad hoc reports using Business Objects features such as templates, drill down, @Functions, cross tab, master/detail and formulas.
- Support during SIT, UAT and stabilization phase (moving to production)
- As QCO, was responsible for process compliance and interacted with SQA team during CMM audits
- Worked on modifying project plan using Microsoft Project
Environment: Red Hat Linux, Ab Initio GDE 1.13.13, Co>Op 2.13.1, Teradata V2R5 and Teradata Utilities (Query man, BTEQ), PLSQL, Shell Scripting, Control M Scheduler, VSS 6.0, Test Director 8.0
Project: After Point of Sale Service Entitlements (APOS SE), June 2005 - August 2006: The APOS SE objective is to improve the existing system with the ability for Dell to identify opportunities to proactively offer and sell warranty/service contract extensions and upgrades prior to contract expiration, and to give Dell customers the ability to seamlessly purchase warranty/service contract extensions, renewals and upgrades after the point of sale (APOS), thereby increasing revenue.
Roles and Responsibilities:
- Design and development of PL/SQL Extract Scripts, Load and Rollup using Ab Initio and Teradata
- Creation and population of user-maintained and lookup tables
- Testing and tuning Ab Initio graphs for Optimized performance
- On time delivery despite highly aggressive timelines
- Preparing test cases & performing unit testing and peer reviews at end of each stage
- Involved in end to end data validation by interacting with business (UAT)
- Checked the project in and out of the EME and maintained version control of the source code
- Configured/Scheduled Ab Initio graphs using Control-M scheduling tool
- Support during Stabilization
Environment: Red Hat Linux, Ab Initio 1.11.16.2 Co>Op 2.12.2, PL/SQL, Teradata V2R5 and Teradata Utilities (Query man, BTEQ), Shell Scripting, Control M Scheduler, VSS 6.0, Test Director 8.0
Sr. ETL/Database Developer, Confidential, May 2004 - May 2005
Project: Sanofi-Aventis Data Migration, February 2005 - May 2005: Sanofi and Aventis are two clients of Dendrite, and both clients have an agreement to share data. Statistically valid data is extracted and imported based on business rules.
Roles and Responsibilities as ETL/Database Developer:
- Involved in administering the system and database setup.
- Created stored procedures to transform data and worked extensively in PL/SQL.
- Created DataStage mappings with PL/SQL procedures/functions implementing business rules to load data
- Involved in SQL query tuning and database tuning
- Loaded large variable-length and fixed-length data files into the Oracle database using SQL*Loader (see the sketch after this list).
- Worked on logical/physical backups, exporting backups to the host server, and restore and recovery management.
- Implemented database-level auditing and wrote scripts for monitoring the database
- Troubleshooting performance problems.
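A minimal sketch of the SQL*Loader usage mentioned above, with a fixed-length control file; the feed layout, table and column names are illustrative placeholders.

```sh
#!/bin/ksh
# Generates a SQL*Loader control file for a fixed-length feed and runs the load.
# File, table and column layout are illustrative placeholders.

cat > customer.ctl <<'EOF'
LOAD DATA
INFILE '/data/in/customer_fixed.dat'
APPEND INTO TABLE stg_customer
(
  customer_id  POSITION(1:10)   CHAR,
  last_name    POSITION(11:40)  CHAR,
  signup_date  POSITION(41:48)  DATE "YYYYMMDD"
)
EOF

sqlldr userid="${ORA_USER}/${ORA_PASS}@${ORA_SID}" \
    control=customer.ctl log=customer.log bad=customer.bad errors=50

if [ $? -ne 0 ]; then
    echo "SQL*Loader reported errors - check customer.log" >&2
    exit 1
fi
```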
Environment: Datastage, Windows 2000 Server, Oracle 9i, Visual Source Safe, Toad, IIS
Project: ODS Sankyo Pharma, May 2004 - February 2005: ODS is an operational data store for the AXON Data Mart of Sankyo Pharma. ODS stores data imported from various third parties and assesses the extent to which it conforms to stated business rules. Statistically valid data samples are extracted from the ODS system to the AXON Data Mart.
Roles and Responsibilities as ETL/Database Developer:
- Involved in administering the system and database setup.
- Created stored procedures to transform data and worked extensively in PL/SQL.
- Created mappings with PL/SQL procedures/functions to build business rules to load data
- Involved in SQL query tuning and database tuning
- Loaded large variable-length and fixed-length data files into the Oracle database using SQL*Loader.
- Managed tablespaces and created user accounts and roles.
- Worked on logical/physical backups, exporting backups to the host server, and restore and recovery management.
- Implemented database-level auditing and wrote scripts for monitoring the database (see the monitoring sketch after this list)
- Troubleshooting performance problems.
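A minimal sketch of the kind of database-monitoring script mentioned above, flagging tablespaces above a usage threshold; the 90% threshold, connection details and mail address are placeholders.

```sh
#!/bin/ksh
# Reports tablespaces above a usage threshold; names and threshold are placeholders.

sqlplus -s "${ORA_USER}/${ORA_PASS}@${ORA_SID}" <<'EOF' > tbs_usage.txt
SET PAGESIZE 0 FEEDBACK OFF
SELECT df.tablespace_name || ' ' ||
       ROUND((df.bytes - NVL(fs.bytes, 0)) / df.bytes * 100, 1)
FROM   (SELECT tablespace_name, SUM(bytes) bytes
        FROM dba_data_files GROUP BY tablespace_name) df,
       (SELECT tablespace_name, SUM(bytes) bytes
        FROM dba_free_space GROUP BY tablespace_name) fs
WHERE  df.tablespace_name = fs.tablespace_name (+);
EXIT
EOF

alerts=$(awk '$2 > 90 { print "Tablespace " $1 " is " $2 "% full" }' tbs_usage.txt)
if [ -n "$alerts" ]; then
    echo "$alerts" | mailx -s "Tablespace usage alert" dba_team@example.com
fi
```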
Environment: Informatica 6.2, Windows 2000 Server, Oracle 9i, Visual Source Safe, Toad, IIS
ETL/Database Developer, Confidential, August 2002 - March 2004
Project: GEAE DQI PQ Tools PRJ B, September 2003 - March 2004: Data Quality Inspector (DQI) is an Oracle-based tool used by General Electric Aircraft Engines (GEAE) for comparing the quality of data within a system or between systems and generating a report on it. It is a framework for the systematic assessment of data, quantifying the extent to which the data conforms to stated business rules. It runs on any platform supported by Oracle database and application servers and has a browser-based (HTML) interface for data quality analysts and administrators.
Roles and Responsibilities:
- Involved in analyzing the requirements and creating Technical design specifications and mapping documents
- Creating and managing stored procedures, functions and triggers during the development process
- Modified, tested and validated existing Informatica mappings.
- Worked on SQL query & Database performance tuning
- Validated data by interacting with business (UAT)
- Worked on testing of the reports.
Environment: Informatica 5.1, Windows 95/ Windows NT Server, Oracle 9i, Visual Source Safe, Toad, IIS
Project: Pack Shop System (PSS), August 2002 - September 2003: PSS is an automated system that receives parts from GE and its vendors, packs them according to designs available in PSS, prints labels and barcodes, and ships the parts to GE's customer base. It is also designed to record and update receipt, inventory and packaging information in local tables (PSS tables) and to dispatch part information to other interface systems.
Roles and Responsibilities:
- Involved in analyzing the system and its problems.
- Created stored procedures to transform data and worked extensively in PL/SQL.
- Involved in studying and testing the Informatica mappings
- Involved in SQL query tuning & Database tuning
- Involved in end to end data validation by interacting with business (UAT)
Environment: Informatica 5.1, Windows 95/ Windows NT Server, Oracle 9i, Visual Source Safe, Toad, IIS
Rewards & Recognition
Dell Excellence Award - Best Performance Team
National Scholarship holder