Lead ETL Developer Resume
Columbus, OH
PROFESSIONAL SUMMARY:
- Nearly 8 years of Data Integration experience integrating Teradata, DB2, Netezza, SQL Server and Oracle databases, flat files and XML for Enterprise Data Integration applications.
- Strong knowledge of Data Warehousing trends - Ralph Kimball and Bill Inmon methodologies, Star Schema, Snowflake Schema, Dimensions and Facts.
- Solid exposure to developing Business Intelligence applications: ODS, DM, EDW and OLAP systems.
- Designed Entity Relationships and Dimensional Modeling for OLAP systems using Erwin.
- Hands-on experience with fast history loads of huge data sets using partitioning techniques.
- Implemented Slowly Changing Dimension (SCD) Type 1, 2 and 3 techniques. Performed incremental loads using a Change Data Capture (CDC) mechanism (a sample SQL sketch follows this summary).
- Proficient in Extraction, Transformation and Loading (ETL) using Informatica Power Center tools, with working knowledge of DataStage.
- Hands-on experience with dynamic file creation, mapplets, reusable transformations and tasks, which avoid duplication of metadata, reduce development time and provide centralized change control.
- Experience with constraint-based target loading, User Defined Functions and Command, Email, Decision, Control, Timer and Assignment tasks.
- Experience with SQL Overrides, Pre-SQL and Post-SQL, Pre-Session and Post-Session Commands, and Pre-Session and Post-Session Variable Assignments.
- Extensively used the Debugger and test loads to identify bottlenecks in target loads.
- Implemented Error Handling mechanisms in relational databases to capture error logs and rejected data.
- Implemented performance improvement techniques including Partition Types, Pushdown Optimization and large commit intervals.
- Examined DTM processes, Reader and Writer thread statistics and busy percentages to estimate pipeline execution time.
- Experience in deploying Informatica components to higher environments through Deployment Groups.
- Extensively used built-in and user-defined parameters/variables in adherence to best coding standards.
- Experience in writing Analytical queries, Sub queries, Joins, DDL, DML, DCL and TCL statements, Stored Procedures, Functions, Packages, Triggers, Sequences, DB Links, Views and Materialized Views.
- Hands-on experience in identifying database performance issues using Explain Plans, sessions and locks.
- Implemented performance-improving techniques such as Partitioning, Indexing and Parallel Hints.
- Experience in analyzing, gathering statistics on and grooming tables for quick data retrieval.
- Experience with Data Quality checks such as Reconciliation, Balancing, Duplicates, Orphan records, Constraint checks and Aggregate calculations (sample queries follow this summary).
- Experience in FTP, archiving and compressing data through Shell Scripts.
- Experience in writing LINUX/UNIX scripts to execute jobs, create log files, check storage, change permissions, and find and modify files.
- Experience in writing Batch Scripts and Job Information Language (JIL) scripts to schedule the jobs through ESP and Autosys.
- Experience in controlling the versions of Code, Scripts, Files and Harvest Packages through Tortoise SVN and Software Change Manager (SCM).
- Experience in creating UTP, UTC, UTR and fixing defects through HP ALM Quality Center.
- Experience in preparing Impact Analysis, Code Reviews, Deployment Checklists, Release Tracker and Implementation Plan.
- Hands-on experience in ad hoc scheduling, monitoring production jobs and fixing critical production issues.
- Experience in all phases of Software Development Life Cycle (SDLC) using Agile and Waterfall methodologies.
- Certified in General Insurance, Commercial Insurance and Annuity Products and Administration.
- Experience in leading and mentoring a small team and providing functional and technical guidance.
- Coordinating with the offshore team on understanding business functionalities, solution design and technical deliverables.
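The SCD Type 2 load mentioned above, sketched in Oracle SQL. This is a minimal, illustrative version, assuming a hypothetical staging table stg_customer (daily extract) and a dimension dim_customer with eff_start_dt, eff_end_dt and current_flag columns; all names are placeholders, not from a specific project.

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE dim_customer d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current version for changed and brand-new keys.
    INSERT INTO dim_customer
          (customer_id, address, status, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.address, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');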
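A few representative Data Quality checks in SQL, using hypothetical stg_policy_txn, fact_policy_txn and dim_policy tables as stand-ins:

    -- Duplicates: natural keys that appear more than once in the target.
    SELECT policy_id, txn_dt, COUNT(*)
      FROM fact_policy_txn
     GROUP BY policy_id, txn_dt
    HAVING COUNT(*) > 1;

    -- Orphans: fact rows with no matching dimension row.
    SELECT f.policy_key
      FROM fact_policy_txn f
     WHERE NOT EXISTS (SELECT 1 FROM dim_policy d
                        WHERE d.policy_key = f.policy_key);

    -- Reconciliation: row counts and amount totals, source vs. target.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(txn_amt) AS total_amt
      FROM stg_policy_txn
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(txn_amt)
      FROM fact_policy_txn;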
TECHNICAL SKILLS:
Data Warehouse: Ralph Kimball and Bill Inmon Methodologies, Star Schema, Snowflake Schema, Dimensions, Facts, OLTP, ODS, DM, EDW and OLAP.
Data Model: Dimensional Model
Databases: Teradata, DB2, Netezza, Oracle and SQL Server.
Files: Fixed Width Flat Files, Delimited Flat Files and XML.
Languages: NZSQL, SQL, PLSQL, Shell Script and JIL.
TOOLS:
Data Integration: Informatica Power Center Tools (8.x or 9.x)
Data Model: Erwin
Database: SQL Assistant, Squirrel SQL, AQT, TOAD, SQL Developer, SQL Plus and SQL Server Management Studio.
FTP: PuTTY, UltraEdit and WinSCP
Version Control: Tortoise SVN and CA Software Change Manager
Schedulers: CICS Explorer, ESP (CA WA Workstation) and Autosys
Testing: HP ALM Quality Center
Incident Management: ServiceNow
WORK EXPERIENCE:
Confidential, Columbus, OH
Lead ETL Developer
Responsibilities:
- Understanding Business Requirement Documents (BRD), Integration Specification Documents (ISD) and Data Mappings, and getting signoff from the Business Team.
- Converting Integration Specification Documents into Solution Design Documents and Technical ETL Specification Documents.
- Understanding the data flows from Guidewire Policy Center (PC) to IBM Message Broker (WMB) through Message Queues and event journaling.
- Implementing data integration framework for Outbound and Inbound flows from/to PC through ETL and WMB techniques.
- Loading and parsing Payloads through XML generators and XML parsers in Informatica for downstream systems.
- Integrating heterogeneous sources like Oracle, Fixed Width Flat Files, XML and CSV files through Informatica to develop Enterprise Data Warehouse applications.
- Sending Credit Bureau Report and Property Credit Report request and response files to the Underwriting Report System (UWRS) through Informatica.
- Sending automatic email alert notifications on failed Data Quality checks through Informatica.
- Implementing a Batch Framework through Perl scripts that set Informatica environment variables, configure database connections, trigger Informatica workflows, update job status tables and create daily log files.
- Implementing Error Handling in relational databases to track each rejected record and identify bad data and error messages (a sketch follows this list).
- Preparing database scripts, stored procedures, metadata scripts and reference data scripts.
- Version Controlling all ETL components including Informatica Workflows, DB scripts, Parameter files and configuration files in Tortoise SVN and CA Software Change Manager.
- Preparing Linux/Unix scripts to archive and compress files, create folders, change permissions and modify parameter/configuration files.
- Handling fixed width flat files, delimited flat files, library and binary files through UltraEdit and WinSCP tools.
- Preparing and Scheduling ESP Jobs, Events and applications through CA WA Workstation and CICS Explorer tools.
- Reviewing all deployable components in adherence to Checklists, Best Coding Standards and preparing release notes for higher environments.
- Creating/Updating Epics or Story cards for each Sprint/Iteration in Agile Methodology and striving to accomplish them without any delay.
- Creating priority incidents or tickets for environmental issues, DB issues and storage issues in the ServiceNow tool.
- Leading and coordinating Offshore team in understanding Business Functionalities and implementing optimized coding techniques.
- Participating in customer discussions, suggesting and providing effective solutions for the organization.
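One way the reject tracking described above can be done in Oracle is DML error logging; a minimal sketch follows. DBMS_ERRLOG and LOG ERRORS are standard Oracle features, but the fact_policy_txn/stg_policy_txn tables and the 'DAILY_LOAD' tag are placeholder assumptions.

    -- One-time setup: generates ERR$_FACT_POLICY_TXN alongside the target.
    BEGIN
      DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'FACT_POLICY_TXN');
    END;
    /

    -- Bad rows land in the error table with the ORA- code and message,
    -- instead of failing the whole load.
    INSERT INTO fact_policy_txn (policy_key, txn_dt, txn_amt)
    SELECT policy_key, txn_dt, txn_amt
      FROM stg_policy_txn
      LOG ERRORS INTO err$_fact_policy_txn ('DAILY_LOAD')
      REJECT LIMIT UNLIMITED;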
Environment: Informatica Power Center 9.6.1 HotFix, Guidewire Policy Center, IBM Message Broker (WMB), Oracle, PL/SQL, SQL, LINUX, Flat Files, XML, Perl, Toad, PuTTY, Tortoise SVN, CA Software Change Manager, CICS Explorer, ESP (CA WA Workstation), HP ALM Quality Center, UltraEdit, WinSCP and ServiceNow.
Confidential, Columbus, OH
Senior ETL Developer
Responsibilities:
- Reviewing Business Requirement Documents (BRD), Functional Specification Documents (FSD) and Data Mappings, and getting signoff from the Business Team.
- Converting Functional Specification Documents into Solution Design Documents and Technical ETL Specification Documents.
- Designing Dimensional Models for OLAP systems and generating DDL scripts through Erwin.
- Integrating DB2, Oracle and Flat File sources to develop Enterprise Data Warehouse application (Netezza, Teradata) using Informatica Power Center 9.6.1 tools.
- Creating dynamic parameter files, flat files, XML files and connections for all downstream systems.
- Creating Mapplets, Reusable Transformations and Reusable Sessions to avoid duplication of metadata, reduce development time and provide centralized change control.
- Implementing Slowly Changing Dimensions (SCD Type 1 and Type 2) to load Dimension Tables. Performed Change Data Capture (CDC) for incremental loads.
- Performing fast history loads of 20 years of transactional data using Partitioning Techniques and Netezza Bulk Writers.
- Configuring automatic Email Notifications for Reconciliation, Batch Failures and Data Quality Check errors.
- Debugging, enabling test loads, examining Reader and Writer thread statistics, DTM Processes and Buffer Loads to identify performance bottlenecks in target load.
- Implementing Error Handling mechanism in Oracle Database to log error details and rejected data.
- Deploying Informatica Components to higher environments through deployment groups.
- Creating DDL, DCL, TCL and DML scripts, Analytical Queries, Sub Queries, Stored Procedures, Joins, Views, Sequences and Triggers in Oracle and Netezza.
- Implementing Indexing, Distribution Keys, database partitioning techniques, statistics gathering and parallel hints to improve performance in Oracle and Netezza (a sketch follows this list).
- Maintaining versions of Informatica Code, Database Scripts, Parameter files and Configuration files in SVN and Software Change Manager.
- Executing Jobs, FTP’ing, archiving and compressing the target files through Shell Scripts.
- Creating UTP, UTC, UTR and fixing defects in HP ALM Quality Center.
- Scheduling jobs through ESP (CA WA Workstation) by creating job scripts in CICS Explorer and promoting them to higher environments.
- Creating priority incidents or tickets for environmental issues, DB issues and storage issues in the ServiceNow tool.
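A compact Oracle sketch of the performance techniques named above; the fact_claims table, partition scheme and degree of parallelism are illustrative assumptions, not from a specific project.

    -- Range partitioning so history loads and scans touch one partition.
    CREATE TABLE fact_claims (
      claim_key NUMBER,
      claim_dt  DATE,
      claim_amt NUMBER(12,2)
    )
    PARTITION BY RANGE (claim_dt) (
      PARTITION p2015 VALUES LESS THAN (DATE '2016-01-01'),
      PARTITION p2016 VALUES LESS THAN (DATE '2017-01-01'),
      PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    -- Parallel hint for a large scan.
    SELECT /*+ PARALLEL(f, 8) */ COUNT(*) FROM fact_claims f;

    -- Refresh optimizer statistics after a big load.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'FACT_CLAIMS');
    END;
    /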
Environment: Informatica Power Center 9.6/9.5 HotFix, Erwin, Teradata, DB2, Netezza, Oracle, PL/SQL, SQL, LINUX, Flat Files, XML, AQT, Toad, PuTTY, Tortoise SVN, CA SCM, CICS Explorer, ESP (CA WA Workstation), HP ALM Quality Center, UltraEdit, WinSCP and ServiceNow.
Confidential, CT
Senior ETL Developer
Responsibilities:
- Analyzed, designed, developed, implemented and maintained moderate to complex initial load and incremental load mappings to provide data for enterprise data warehouse.
- Worked on Informatica Power Center 9.5.1 tools - Source Analyzer, Mapping Designer, Transformation Developer and Repository Manager.
- Developed Data Mappings between source systems and warehouse components using mapping designer.
- Converted procedures into mappings using various transformations such as Source Qualifier, Expression, Sorter, Update Strategy, Filter, Lookup and Aggregator.
- Integrated policy data from various sources such as flat files, DB2 and Oracle into Teradata.
- Altered mappings to accommodate new business requirements due to single instance impact at the source.
- Created workflows using Session, Command, Decision, Timer and Email tasks.
- Involved in defining, designing, writing, modifying and testing procedures using SQL.
- Used Autosys scheduler to schedule ETL jobs and monitor their progress.
- Implemented a Change Data Capture (CDC) mechanism to load Claims data.
- Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions.
- Created project iterations using Agile development techniques to complete deliverables.
- Tuned SQL queries for faster data retrieval. Also tuned Informatica mappings for better performance.
- Solely responsible for handling daily loads and reject data.
- Created procedures to drop and recreate indexes in the target Data Warehouse before and after the load sessions (see the sketch after this list).
- Used HP ALM to view defects assigned by testers and update the status of the defect after resolving the issues.
- Migrated mappings to various environments like development, testing and production.
- Created Test Cases and performed Unit and Integration Testing to ensure the successful execution of data loading process.
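A minimal sketch of the drop-and-recreate pattern above, written as a single PL/SQL helper that marks a table's indexes unusable before a bulk load and rebuilds them afterwards; the procedure name and target table are hypothetical.

    CREATE OR REPLACE PROCEDURE manage_indexes(
      p_table  VARCHAR2,
      p_action VARCHAR2  -- 'UNUSABLE' before the load, 'REBUILD' after
    ) AS
    BEGIN
      -- Walk the data dictionary so new indexes are picked up automatically.
      FOR ix IN (SELECT index_name
                   FROM user_indexes
                  WHERE table_name = UPPER(p_table)) LOOP
        IF p_action = 'UNUSABLE' THEN
          EXECUTE IMMEDIATE 'ALTER INDEX ' || ix.index_name || ' UNUSABLE';
        ELSE
          EXECUTE IMMEDIATE 'ALTER INDEX ' || ix.index_name || ' REBUILD';
        END IF;
      END LOOP;
    END;
    /
    -- Pre-session:  EXEC manage_indexes('FACT_SALES', 'UNUSABLE');
    -- Post-session: EXEC manage_indexes('FACT_SALES', 'REBUILD');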
Environment: Informatica Power Center 9.5.1, Teradata, DB2, Oracle, PL/SQL, SQL, SQL Loader, LINUX, Flat Files, Toad, Putty, Tortoise SVN, Autosys, HP ALM Quality Center and Ultra Edit.
Confidential, NY
ETL Developer
Responsibilities:
- As per the ETL specification document, implemented source-to-target mappings using various transformations such as Lookup, Joiner, Source Qualifier, Aggregator, Expression, Sequence Generator, Normalizer and Router.
- Implemented Slowly Changing Dimension (SCD Type 2) methodology to preserve the full history of accounts and transaction information.
- Used built-in mapping variables/parameters and created parameter files for flexible runs of sessions/mappings based on changing variable values, and used this for CDC (a sketch follows this list).
- Created mapplets and Reusable Transformations to reduce cache size and reduce effort for repetitive tasks.
- Error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Experience with restoring the repository service from backup using the Informatica Admin Console.
- Promoted database scripts to test and production environments.
- Wrote test cases based on Source-to-Target Mapping documents and performed unit testing and SIT.
- Loaded operational data from SQL Server, Oracle, flat files, Excel Worksheets into various data marts.
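The CDC pattern above, sketched in SQL: a control table holds the high-water mark the last successful run reached, and the source qualifier override (or a $$LAST_RUN_TS mapping variable) restricts the extract to rows changed since then. The table, column and job names here are placeholder assumptions.

    -- Incremental extract: only rows changed since the last run.
    SELECT acct_id, acct_status, last_upd_ts
      FROM src_accounts
     WHERE last_upd_ts > (SELECT last_run_ts
                            FROM etl_control
                           WHERE job_name = 'ACCT_DIM_LOAD');

    -- After a successful load, advance the watermark.
    UPDATE etl_control
       SET last_run_ts = SYSTIMESTAMP
     WHERE job_name = 'ACCT_DIM_LOAD';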
Environment: Informatica Power Center 8.6/9.1, Oracle, SQL Server, PL/SQL, SQL Server Management Studio, SQL Developer, LINUX, Flat Files, Toad, PuTTY, Telnet, Tortoise SVN, Autosys and HP ALM Quality Center.
Confidential
ETL Consultant
Responsibilities:
- Participated in requirements gathering sessions with Business Analysts to understand business needs.
- Created ODBC connections to import objects from various sources such as Oracle databases and flat files.
- Developed mappings, sessions and workflows using Mapping Designer and Workflow Manager to perform data transformation and data loading into Oracle database.
- Provided production support for daily job runs and fixed issues by identifying the root cause of failed jobs.
- Implemented Slowly Changing Dimensions (SCD Type 1 and Type 2) and CDC to update current information and maintain history in dimension tables.
- Worked with the Quality Assurance team to build test cases for unit, integration, functional and performance testing.
- Wrote Pre- and Post-SQL commands in session properties to manage constraints, which improved performance, and wrote SQL queries to perform database operations per business requirements (a sketch follows this list).
- Migrated mappings and workflows from Development to Test and to Production Servers.
- Performed version control to check in and check out versions of objects used in creating mappings and workflows, keeping track of changes across development, test and production environments.
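The Pre-/Post-SQL constraint handling above amounts to statements like the following, assuming a hypothetical fact_orders table with an fk_orders_customer foreign key:

    -- Pre-SQL: suspend the FK check for the duration of the bulk load.
    ALTER TABLE fact_orders DISABLE CONSTRAINT fk_orders_customer;

    -- (Informatica session load runs here.)

    -- Post-SQL: re-enable and validate once the load completes.
    ALTER TABLE fact_orders ENABLE VALIDATE CONSTRAINT fk_orders_customer;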
Environment: Informatica 8.6, Oracle, UNIX, SQL Plus, Telnet, PuTTY, Flat Files and HP ALM.