Sr. ETL Developer Resume
CT
SUMMARY
- Oracle Certified ETL Professional with over 9 years of IT experience in the analysis, design, and development of data warehousing applications.
- Over 8 years of experience in Data Warehouse/Data Mart development using ETL/Informatica PowerCenter, IDQ, SOAP, XML, and PL/SQL.
- Experience working with ETL methodologies supporting data extraction, transformation, and loading processes using Informatica PowerCenter.
- Worked in multiple client-specific environments related to Health Care, Telecom, Financials, and Retail.
- Hands-on experience creating, updating, and maintaining project documents, including business requirements, functional and non-functional requirements, functional design, and data mappings.
- Well-versed in Informatica PowerCenter 10.0.0/9.x/8.x/7.x client tools (Designer, Repository Manager, Repository Server Administration Console, Server Manager, Workflow Manager, Workflow Monitor).
- Strong understanding of data modeling (dimensional and relational) concepts such as Star Schema and Snowflake Schema.
- Experience working with various heterogeneous source systems such as Oracle 11g/10g/9i/8i, SQL, PL/SQL, MS SQL Server 2008/2005, flat files, Netezza, Teradata, Salesforce, and legacy systems.
- Experienced in Data Integration.
- Expert in implementing business rules by creating reusable transformations, mapplets, and mappings.
- Extensive experience in using various Informatica Designer Tools like Source Analyzer, Transformation Developer, Mapping Designer, Mapplets Designer, Schedulers and Warehouse Designer.
- Extensively created mapplets, common functions, reusable transformations, and lookups for better reusability. Designed and developed complex mappings using transformations such as Source Qualifier, Aggregator, Expression, Connected and Unconnected Lookup, SAP BAPI/IDoc, Filter, Joiner, Sequence Generator, Sorter, Router, Normalizer, and Update Strategy.
- Experienced in designing, developing, reviewing, and documenting Informatica work products such as mappings, mapplets, reusable transformations, sessions, workflows, worklets, and schedulers, and in using mapping parameters, mapping variables, and session parameter files.
- Experience in performing De-normalization, Cleansing, Conversion, Aggregation, Performance Optimization process.
- Involved in implementing Slowly Changing Dimensions (SCD) using Informatica PowerExchange.
- Good understanding of views, Synonyms, Indexes, Joins and Sub-Queries. Extensively used Cursors and Ref Cursors.
- Expertise in designing and developing test plans, test cases/test scenarios, and test reports for both manual and automated tests, and in ETL testing, validating the data between source and target tables by writing complex SQL queries.
- Extensive experience in tuning and scaling procedures for better performance by running explain plans and using approaches such as hints and bulk loads.
- Worked extensively on DDL scripts
- Extensively used SQL, PL/SQL in writing Stored Procedures, Functions, Packages and Triggers.
- Strong expertise in using exception-handling strategies to capture errors and referential-integrity violations during load processes and to notify the source team of exception records.
- Experience in UNIX shell scripting and in scheduling tools such as Autosys, JAMS, and Control-M.
- Hands on Experience in Cognos Reporting Tool.
- Created Drill down Hierarchies and implemented Business Logics and Facts.
- Delivered all projects/assignments within the specified timelines.
- Self-starter with drive, initiative, and a positive attitude; strong ability to work within demanding and aggressive project schedules and environments.
- Excellent analytical and problem-solving skills; a motivated team player with excellent interpersonal skills.
- Good knowledge of tracking tools such as JIRA.
- Knowledge of Erwin for Data Modeling.
- Strong experience with the entire Software Development Life Cycle (SDLC) and Agile methodology.
- Strong trouble-shooting, problem solving, analytical and design skills.
- Effective communication, professional attitude, strong work ethic and interpersonal skills.
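The Slowly Changing Dimension (Type 2) handling mentioned above can be sketched as follows. This is an illustrative Python/sqlite3 version of the pattern only, not the Informatica implementation; the `dim_customer` table and its columns are hypothetical:

```python
import sqlite3

# Illustrative SCD Type 2 upsert: when a tracked attribute changes, expire the
# current dimension row and insert a new version. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, current_flag INTEGER)""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Hartford', '2015-01-01', '9999-12-31', 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    cur.execute("""SELECT city FROM dim_customer
                   WHERE customer_id = ? AND current_flag = 1""", (customer_id,))
    row = cur.fetchone()
    if row and row[0] != new_city:
        # Expire the existing current version (date- and flag-based versioning)
        cur.execute("""UPDATE dim_customer SET end_date = ?, current_flag = 0
                       WHERE customer_id = ? AND current_flag = 1""",
                    (load_date, customer_id))
        # Insert the new current version with an open-ended end date
        cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                    (customer_id, new_city, load_date))

apply_scd2(cur, 1, "Boston", "2016-06-01")
print(cur.execute("SELECT COUNT(*) FROM dim_customer WHERE customer_id = 1").fetchone()[0])  # 2
```

The same expire-then-insert logic is what an Update Strategy transformation expresses in an Informatica Type 2 mapping.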
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 10.0.0/9.x/8.x/7.x (Source Analyzer, Warehouse Designer, Transformation Developer, Data Integration, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Informatica Server), ETL, Repository, Metadata, Data Mart, Fact & Dimension tables, Physical & Logical data modeling
Reporting Tools: Business Objects, Cognos 8.4 (Awareness)
Data Modeling: Physical and Logical Data Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, Fact, Dimensions), Entities, Attributes, Cardinality, ER Diagrams, Erwin 7.0/4.1/5.1
Databases: Oracle 10g/9i/8.0, MS SQL Server 7.0/2000/2005, OLTP
Programming: SQL, PL/SQL, SQL*Loader, Unix Shell Scripting, SQL Tuning/Optimization
Other Tools: Netezza, SQL Developer, SAP, Teradata, DB2, Salesforce, Informatica Cloud, Autosys, Secure FTP (SSH), TOAD, MS Office (Excel, Word)
Environment: UNIX, Windows 2000/NT, Win XP
PROFESSIONAL EXPERIENCE
Confidential, CT
Sr. ETL Developer
Responsibilities:
- Worked closely with business analysts to gather the requirements.
- Extensive experience with both Agile and Waterfall methodologies.
- Possess knowledge of CRM processes in healthcare, such as marketing, business, patient, provider, and call-center processes.
- Reviewed the developed functional specs and suggested modifications after source analysis.
- Worked with the team to identify the target tables, functional modules, and data sources.
- Worked on Data Integration from different source systems.
- Created simple to complex queries involving different joins, subqueries, and analytical functions.
- Created database Objects like Tables, views, sequences, synonyms, Index using Oracle tools like TOAD, SQL Developer.
- Tuned and optimized queries through indexing strategies and by analyzing query options.
- Developed and maintained ETL (data extract, transformation, and loading) mappings to extract the data from multiple source systems and load it into preload staging tables in Oracle.
- Extensively used tuned SQL overrides in Source Qualifier for better performance.
- Designed and developed mappings using various transformations such as Source Qualifier, Expression, Connected and Unconnected Lookup, Router, Aggregator, Filter, Sequence Generator, Update Strategy, Normalizer, Joiner, and Rank in PowerCenter Designer.
- Wrote SQL overrides and used filter conditions in Source Qualifier, thereby improving the performance of the mapping.
- Used parameter files, mapping parameters, mapping variables, and session parameters to pass values.
- Extracted files from different source systems like Flat files, Oracle, Netezza and Salesforce (SFDC)
- Hands on experience in creating Custom Objects, Custom Fields, Page Layouts and various other components as per client application Requirements.
- Implemented the Salesforce Merge transformation to merge duplicates on the Salesforce platform.
- Implemented Address Doctor in mappings to standardize the address format.
- Hands-on experience in IDQ, including the match-merge process; merged duplicate records from two different applications.
- Created mapplets, worklets, and reusable transformations that provide reusability in mappings.
- Used the Informatica Debugger to find bugs in existing mappings by analyzing data flow and evaluating transformations, and performed unit testing for the individually developed mappings.
- Implemented Slowly Changing Dimensions Type 2.
- Loaded the data into Salesforce (SFDC) and Netezza using Informatica.
- Experience in Informatica Cloud and workbench.
- Created various tasks, such as Email, Scheduler, and Command tasks.
- Scheduled jobs using JAMS Scheduler tool.
- Created UNIX shell scripts for data validations and archival of log files.
- Experience in writing shell scripts to extract, load, and migrate data to or from Netezza using the NZSQL, NZLOAD, and NZMIGRATE utilities.
- Performed end-to-end data validation tests.
- Involved in identifying bottlenecks and tuning to resolve performance issues.
- Experience in Performance Tuning using Explain Plan and Hints.
- Experience in concurrent loading to reduce the ETL load time.
- Tested Target data against source system data.
- Worked on JIRA for project tracking, creating sprints, and project updates.
- Worked on production tickets and resolved issues with data.
- Created technical design documents for the developed Informatica code.
- Created detailed unit test documents with all possible test cases/scripts.
Environment: Informatica Power Center 9.5.1, Informatica Power Center 10.0.0, Oracle 10g/11g, Flat files, Netezza, Salesforce, PL/SQL, SQL developer, Erwin, UNIX Shell Script, Windows XP.
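The end-to-end data validation described in this role (testing target data against the source system) can be sketched as a row-count and column-checksum comparison. This is an illustrative stdlib-only Python sketch; the table names and the `amount` column are hypothetical:

```python
import sqlite3

# Illustrative source-to-target validation: compare row counts and a simple
# column checksum (SUM) between a source table and its warehouse target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
data = [(1, 100.0), (2, 250.5), (3, 75.25)]
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", data)
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", data)

def validate(cur, src, tgt):
    """Return (counts_match, sums_match) for the two tables."""
    src_cnt, src_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_cnt, tgt_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return src_cnt == tgt_cnt, src_sum == tgt_sum

print(validate(cur, "src_orders", "tgt_orders"))  # (True, True)
```

In practice the same checks would run as SQL against the actual source and target databases, with mismatches routed to an exception report.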
Confidential, MA
ETL Developer
Responsibilities:
- Worked with end users to understand the business requirements, designing the workflow, implementing, and supporting on an ongoing basis.
- Involved in analysis of source systems and identification of business rules.
- Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes.
- Prepared high-level design documents for extracting data from complex relational database tables, data conversions, transformation, and loading into specific formats.
- Involved in dimensional modeling using Star Schema and Snowflake Schema for faster, more effective query processing and business analysis requirements.
- Loaded data from OLTP tables (in MS SQL Server) to staging tables (Oracle database and DB2), and from staging tables to OLAP tables (Oracle tables).
- Extensively used ETL to load flat files, Oracle, XML files, and legacy data as sources, with Oracle as the target.
- Created Informatica mappings using transformations such as SAP BAPI/RFC, SAP IDoc, and Web Services transformations.
- Loaded data from legacy systems to staging tables using Informatica PowerConnect.
- Used Informatica PowerConnect to connect to external databases.
- Developed Mappings to load data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter and Sorter transformations.
- Worked on Data Extraction, Data Transformations, Data Profiling, Data Loading, Data Conversions and Data Analysis.
- Worked with the Informatica PowerCenter Workflow Manager to create sessions, workflows, and worklets.
- Worked on Informatica to load data from Flat files to Oracle
- Created Workflows and monitored Sessions to execute Informatica mappings
- Used the Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot invalid mappings.
- Efficient in writing SQL queries.
- Wrote PL/SQL procedures, functions, packages, and UNIX shell scripts to perform database operations and pre- and post-session commands.
- Created complex mappings and transformations.
- Worked with SQL tools such as TOAD and SQL Developer to run SQL queries and validate the data.
- Experience in creating DDL scripts.
- Scheduled Informatica jobs through the Autosys scheduling tool.
- Developed many reports/dashboards Analytic views (Pivot Tables, Charts, Column Selector, and View Selector).
- Fine-tuned Informatica jobs by optimizing all transformations.
- Assisted the QA team to find and fix solutions for the production issues.
- Prepared all documents necessary for knowledge transfer, such as ETL strategy, ETL development standards, ETL processes, etc.
- Performed unit testing and performance tuning for transformations and mappings.
- Prepared Weekly Status Reports (PSR) and shared them with the managers.
- Provided production support, which involved solving user problems, and provided end-user training and support.
- Prepared user documentation to train users on the functionality of each report.
Environment: Informatica Power Center 8.6.1, Informatica Power Exchange, Informatica power connect suite, Oracle 10g, DB2, Flat files, MS-SQL Server 2005, PL/SQL, SQL developer, Business Objects, Erwin, UNIX Shell Script
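The flat-file-to-staging loads described in this role follow a standard pattern: parse a delimited file and bulk-insert it into a staging table before downstream transformation. A minimal illustrative sketch in stdlib Python (the pipe-delimited layout, file content, and `stg_employee` table are hypothetical):

```python
import csv
import io
import sqlite3

# Illustrative flat-file-to-staging load: parse a delimited source file and
# bulk-insert into a staging table, as an ETL session would. Names hypothetical.
flat_file = io.StringIO("emp_id|name|dept\n10|Alice|Sales\n20|Bob|Finance\n")

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_employee (emp_id INTEGER, name TEXT, dept TEXT)")

reader = csv.DictReader(flat_file, delimiter="|")
rows = [(int(r["emp_id"]), r["name"], r["dept"]) for r in reader]
cur.executemany("INSERT INTO stg_employee VALUES (?, ?, ?)", rows)

print(cur.execute("SELECT COUNT(*) FROM stg_employee").fetchone()[0])  # 2
```

In the actual work this step was an Informatica flat-file source definition feeding an Oracle staging target; for very large files a bulk path such as SQL*Loader serves the same purpose.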
Confidential, IL
ETL Developer
Responsibilities:
- Involved in dimensional modeling (Star Schema) of the data warehouse and used Erwin to design the business process, grain, dimensions, and measured facts.
- Extracted the data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
- Loaded data from legacy systems to staging tables and to OLAP tables (Oracle tables) using Informatica PowerConnect.
- Engaged in Data integration to support data requirements.
- Developed a number of complex Informatica mappings, mapplets, and reusable transformations to implement the business logic and to load the data incrementally.
- Developed Informatica mappings using Aggregator, SQL overrides in Lookups, source filters in Source Qualifier, and data flow management into multiple targets using Router transformations.
- Used the Debugger to test the mappings and fixed the bugs.
- Used various transformations such as Filter, Expression, Sequence Generator, and Update Strategy.
- Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.
- Used the PowerCenter Server Manager/Workflow Manager for session management, database connection management, and scheduling of jobs to be run in the batch process using the Control-M scheduling tool.
- Migrated mappings, sessions, and workflows from Development to Testing and then to Production environments.
- Created multiple Type 2 mappings in the Customer mart for both dimension and fact tables, implementing both date-based and flag-based versioning logic.
- Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
- Provided production support to resolve issues.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Tested the data and data integrity among various sources and targets. Worked with the production support team on various performance-related issues.
- Developed UNIX shell scripts to move source files to archive directory.
- Used Informatica PowerConnect to connect to external databases.
- Involved in Unit, Integration, system, and performance testing levels
- Extensively used SQL*Loader to load data from flat files into the database tables in Oracle.
- Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Created various data marts from data warehouse and generated reports using Cognos
- Developed Standard Reports, List Reports, Cross-tab Reports, Charts, Drill through Reports and Master Detail Reports Using Report Studio.
- Created Query prompts, Calculations, Conditions, Filters, Multilingual Reports Using Report Studio.
Environment: Informatica Power Center suite 8.1, Informatica power connect suite, Oracle10g, SQL Server, SQL, PL/SQL, SQL* Loader, Erwin, TOAD 9.5, Star Schema, UNIX Shell Scripts, Cognos8.x, Flat files, Windows XP, and MS-Office tools.
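The incremental loading mentioned in this role typically relies on a high-water mark: each run extracts only the rows changed since the last successful load. An illustrative sketch under that assumption (the `src_sales` table, its `last_updated` column, and the dates are hypothetical):

```python
import sqlite3

# Illustrative incremental (delta) extract using a high-water mark: pull only
# source rows updated after the last successful load. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_sales (sale_id INTEGER, amount REAL, last_updated TEXT)")
cur.executemany("INSERT INTO src_sales VALUES (?, ?, ?)",
                [(1, 10.0, "2014-01-01"), (2, 20.0, "2014-02-15"), (3, 30.0, "2014-03-10")])

def extract_delta(cur, high_water_mark):
    """Return rows changed after the watermark, plus the new watermark."""
    rows = cur.execute("""SELECT sale_id, amount, last_updated FROM src_sales
                          WHERE last_updated > ? ORDER BY last_updated""",
                       (high_water_mark,)).fetchall()
    new_mark = rows[-1][2] if rows else high_water_mark
    return rows, new_mark

delta, mark = extract_delta(cur, "2014-02-01")
print(len(delta), mark)  # 2 2014-03-10
```

In Informatica this watermark is usually carried in a mapping variable or parameter file and persisted between session runs.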
Confidential, NY
ETL Developer
Responsibilities:
- Extensively used Informatica Power Center for Extraction, Transformation and Loading process.
- Worked on Informatica tools - Repository Manager, Designer, Workflow Manager and Workflow Monitor.
- Used Informatica PowerConnect to connect to external databases.
- Involved in the data loading sequence and populated data into the staging area and warehouse with business rules.
- Designed and developed various mappings and mapplets in the Mapping Designer, and sessions and workflows in the Workflow Manager, to extract data from SQL Server and load it into the Oracle database.
- Extracted the source definitions from various relational sources such as Oracle, Teradata, flat files, and legacy systems using Informatica PowerConnect.
- Developed transformation logic for different mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator
- Developed reusable transformations, mapplets, and worklets, and utilized mapping parameters, session parameters, and user-defined functions to optimize performance.
- Worked on different transformations like Source Qualifier, Joiner, Filter, Update Strategy, Lookup, Rank, Expression, Aggregator and Sequence Generator to load data into target database.
- Responsible for Data Staging Development and Deployment.
- Handled Slowly Changing Dimensions of Type 1/Type 2 to populate current and historical data to dimension and fact tables in the data warehouse.
- Extensively used Stored Procedures, Functions, Triggers and Packages using PL/SQL for creating Connected and Unconnected Stored Procedure Transformations.
- Created workflows using the Workflow Manager for different tasks, such as sending email notifications, timers that trigger when an event occurs, and sessions to run a mapping.
- Utilized Informatica debugger for trouble-shooting and to test teh data flow.
- Reviewed session log files to trace causes of bottlenecks and optimized them by tuning the transformation logic in mappings and the SQL queries involved.
- Developed and Implemented UNIX shell scripts and scheduled different Informatica jobs.
- Created UNIX shell scripts and called as pre-session and post-session commands.
- Involved in Unit Testing and Integration Testing of teh application.
- Documented the entire process. The documents included the mapping document, unit testing document, and system testing document, among others.
Environment: Informatica Power Center 8.1, Informatica power connect suite, Oracle 10g, Teradata, Flat files, UNIX Shell Scripting, PL/SQL, SQL Server, Toad, Control-M
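The shell-script housekeeping mentioned across these roles (archiving processed source and log files) can be sketched as follows. The originals were UNIX shell scripts using find/mv; this is an illustrative Python equivalent, and the directory names and age threshold are hypothetical:

```python
import os
import shutil
import tempfile
import time
from pathlib import Path

# Illustrative archival job: move files older than a cutoff into an archive
# directory, as a post-session shell script would. Paths/threshold hypothetical.
def archive_old_files(src_dir: Path, archive_dir: Path, max_age_days: float) -> list:
    archive_dir.mkdir(exist_ok=True)
    cutoff = time.time() - max_age_days * 86400
    moved = []
    for f in src_dir.iterdir():
        if f.is_file() and f.stat().st_mtime < cutoff:
            shutil.move(str(f), str(archive_dir / f.name))
            moved.append(f.name)
    return sorted(moved)

# Demo with a temporary directory and an artificially backdated file
work = Path(tempfile.mkdtemp())
old_file = work / "session_old.log"
old_file.write_text("old log")
os.utime(old_file, (time.time() - 10 * 86400,) * 2)  # backdate mtime 10 days
(work / "session_new.log").write_text("new log")

moved = archive_old_files(work, work / "archive", max_age_days=7)
print(moved)  # ['session_old.log']
```

The equivalent one-liner in the shell scripts would be along the lines of `find "$SRC_DIR" -maxdepth 1 -type f -mtime +7 -exec mv {} "$ARCHIVE_DIR" \;`.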
Confidential
ETL Developer
Responsibilities:
- Designed job flows.
- Developed jobs that pull the data from source tables, process it, and finally insert it into the target tables.
- Involved in developing the PL/SQL procedures that are part of the horizon job.
- Integrated job flows to build the job streams.
- Involved in the development of data mapping and design process documents for the ETL processes.
- Prepared high-level design documents for extracting data from complex relational database tables, data conversions, transformation, and loading into specific formats.
- Involved in dimensional modeling using Star Schema and Snowflake Schema for faster, more effective query processing and business analysis requirements.
- Used SQL tools such as TOAD to run SQL queries and validate the data in the warehouse.
- Extended the logic of some existing mappings in the system to meet enhancement requirements.
- Implemented Slowly Changing Dimensions, Type I and Type II, in various mappings.
- Efficient in writing SQL Queries
- Worked extensively on DDL scripts
- Wrote PL/SQL procedures, functions, packages, and UNIX shell scripts to perform database operations.
- Performed validation and testing of the ETL work products.
- Performed unit testing for the developed jobs.
- Creation of database objects like Tables, Views, Procedures, Functions and Triggers.
- Optimized the queries to improve the performance of the application.
- Created batch jobs and UNIX scripts to automate the process.
- Worked on production support most of the time to resolve issues quickly. Used the Informatica Debugger to trace issues in mappings.
- Created documentation so that new design techniques could be traced easily. Coordinated with the offshore team on task assignment and requirement sharing.
Environment: ETL, PL/SQL, DDL Scripts, Linux, Toad, Oracle 10g/9i, Unix Korn Shell Scripting, Windows XP Professional, Windows XP/2000