Sr. ETL Developer / Data Specialist Resume
Livonia, MI
SUMMARY:
- 9+ years of experience in data warehousing using ETL with Informatica PowerCenter 9.x/9.0.1/8.6/8.1.1/8.0/7.1.2/7.1.1, covering ETL, OLAP, and OLTP.
- 3 years of experience as a Business Analyst for data warehouse projects: analyzing projects, gathering requirements, forecasting deliverables, analyzing data, and leading projects with PMO assistance.
- Experience with various data sources such as Oracle 11g/10g/9i/8i, Oracle PL/SQL, T-SQL, SQL Server 2010/2008, Teradata, MS Access, MS Excel, flat files, web services, XML, and COBOL/VSAM files.
- Dimensional data modeling experience with ERwin 4.5/4.0/3.5.5/3.5.2: the Ralph Kimball approach, star and snowflake schemas, data marts, OLAP, fact and dimension tables, and physical and logical data modeling.
- Strong knowledge of relational database concepts, entity-relationship diagrams, and normalization and denormalization.
- Experience creating mappings and mapplets using various connected and unconnected transformations, including Source Qualifier, Aggregator, Expression, Lookup, Filter, Joiner, Union, Router, Rank, Sequence Generator, SQL, HTTP, Transaction Control, and Update Strategy.
- Proficient in using Informatica Designer, workflow manager, workflow monitor to create, schedule and control workflows, tasks, and sessions.
- Worked extensively on implementing slowly changing dimensions (SCDs), Type I and Type II, in different mappings as per requirements.
- Experience working on performance tuning in Informatica using Pipeline partitioning and Push down optimization.
- Experience preparing ETL design documents, including high-level and low-level design documents. Involved in unit testing, integration testing, system testing, and UAT using Quality Center.
- More than a year of experience on the Teradata platform using utilities such as Teradata SQL Assistant and load utilities such as Teradata Parallel Transporter (TPT), FastLoad, and FastExport.
- Around 1 year of experience with Change Data Capture using Informatica PowerExchange.
- Experience in using Informatica Power Exchange Navigator to extract data from sources.
- Around one year of experience with a common integration framework for Informatica architecture.
- Good experience working with Informatica Data Quality for Address Validations.
- Good exposure to development, testing, debugging, implementation, documentation, end user training and production support.
- Good experience writing Technical specification documents, ETL Design documents, Test plans and Deployment plans.
- Aptitude for analyzing and identifying problems and coming up with out-of-the-box solutions.
- Ability to achieve organizational integration, assimilate job requirements, employ new ideas, concepts, methods, and technologies.
- Excellent communication, interpersonal and analytical skills.
- Self-motivated, quick learner and adaptive to new and challenging technological environments.
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.6.x/9.1/8.6.x/8.5.x, Informatica PowerExchange.
Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimension Data Modeling.
Databases: Oracle 11g/10g/9i, SQL Server 2008/2005/2000, IBM DB2, Teradata 13.1/V2R6/V2R5, MS Access
Scheduling Tools: Control-M, CA7 Scheduler, Autosys, Informatica Scheduler.
Reporting Tools: Crystal Reports, Business Objects XI R2/XI 3.3, OBIEE 11g R1 (11.1.5).
Programming: SQL, PL/SQL, Transact SQL, HTML, DHTML, XML, C, C++, Shell
Operating Systems: Windows 7/XP/NT/95/98/2000, UNIX and LINUX
Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.
PROFESSIONAL EXPERIENCE:
Confidential, Livonia, MI
Sr. ETL Developer/ Data Specialist
Technical Environment: Informatica PowerCenter 9.6.1, Oracle 11g, SQL Server 2012, flat files, XML files, Informatica IDQ, B2B Data Transformation, Informatica Cloud, Informatica Address Doctor, batch scripting, MicroStrategy.
Responsibilities:
- Good Understanding of business requirements, technical specifications, source repositories and physical data models for project activities.
- Experience in creating high level documents, Source to Target mapping document and detailed design level document for the entire ETL process.
- Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, Salesforce, HL7, EPIC, XML, and flat files.
- Project involved usage of most of the transformations like Transaction Control, Active and Passive look up transformation, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Unstructured data transformation, SQL transformation and more.
- Extensively implemented incremental/delta loads using mapping variables, mapping parameters, and a parameter table.
- Created ETL code to support full loads on the initial run and incremental/delta loads on subsequent daily runs.
- Developed mappings to load data into landing layer, staging layer and publish layer with extensive usage of SCD Type I and SCD Type II development concept.
- Developed SCD Type I and Type II logic using the MD5 hash function for change detection.
- Experience working with B2B data transformation concept of Informatica.
- Experience working with advanced Informatica transformation like unstructured data transformation for parsing HL7 data file.
- Worked in Informatica Data Quality to create a mapplet for validating and cleansing addresses using the Address Validator transformation.
- Exported mapplets from IDQ into Informatica PowerCenter for use in various mappings implementing Address Doctor.
- Hands-on experience profiling data using IDQ.
- Extracted and loaded data directly into Salesforce objects using Informatica PowerCenter.
- Worked with various session properties to extract data from Salesforce objects using the standard and Bulk APIs.
- Created new and enhanced existing stored procedure SQL used for semantic views, and load procedures for materialized views.
- Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
- Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
- Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Loaded data from Unstructured file format using unstructured data transformation into Oracle database.
- Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
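The MD5-based change detection behind the SCD Type II loads described above can be sketched as a minimal example. The column names, natural key, and row layout here are illustrative assumptions, not the actual warehouse schema:

```python
import hashlib

def row_md5(row, tracked_cols):
    """Concatenate tracked attribute values and hash them, mirroring
    an MD5 expression used for change detection in a mapping."""
    joined = "|".join(str(row.get(c, "")) for c in tracked_cols)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def classify(incoming, current_dim, tracked_cols, key):
    """Decide insert / new-version / no-change per incoming row.
    current_dim maps natural key -> stored MD5 of the active version."""
    actions = []
    for row in incoming:
        h = row_md5(row, tracked_cols)
        stored = current_dim.get(row[key])
        if stored is None:
            actions.append((row[key], "INSERT"))        # brand-new key
        elif stored != h:
            actions.append((row[key], "NEW_VERSION"))   # expire old row, insert new
        else:
            actions.append((row[key], "NO_CHANGE"))
    return actions
```

In an Informatica mapping the equivalent comparison is typically done with the MD5() expression function against a hash stored on the active dimension row; rows flagged as changed drive the Update Strategy that expires the old record and inserts the new version.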
Sr. Informatica Developer/Data Specialist
Technical Environment: Informatica Power Center 9.1, Power Exchange, PL/SQL, Oracle 11g, SQL Server 2005/2000, Windows Server 2003, UNIX, Control-M, Toad
Responsibilities:
- Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
- Involved in creating Logical and Physical Data Models and creating Star Schema models.
- Assisted in designing Star Schema to design dimension and fact tables.
- Involved in developing star schema model for target database using ERWIN Data modeling.
- Created Complex mappings using Unconnected and connected Lookups and Aggregate and Router transformations for populating target table in efficient manner.
- Created Mapplet and used them in different Mappings.
- Used the Sorter transformation and the dynamic Lookup cache.
- Created events and tasks in the work flows using workflow manager.
- Developed Informatica mappings and also tuned them for better performance with PL/SQL Procedures/Functions to build business rules to load data.
- Created Schema objects like Indexes, Views and Sequences.
- Provided daily production support for scheduled jobs, diagnosing and fixing failed jobs.
- Created Stored Procedures, Functions and Indexes Using Oracle.
- Worked on batch processing and file transfer protocols.
- Performance-tuned long-running workflows.
- Analyzed the enhancements coming from the individual Client for application and implemented the same.
- Creation of technical documents.
- Developed mappings for policy, claims dimension tables.
- Working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
- Simulated real time transaction feed by fabricating transactions required for UAT and manually calculating expected results to generate expectancy going into the testing process.
Sr. ETL Informatica Developer/Specialist
Technical Environment: Informatica Power Center 9.1, Informatica Power Exchange, Oracle 10g, PL/SQL, SQL Server, DB2, Teradata 13.10, Clear Case, Erwin, Business Objects Info view, Windows, UNIX
Responsibilities:
- Analyzed business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Involved in design phase of logical and physical data model using Erwin 4.0.
- Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files.
- Used most of the transformations such as the Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, update strategy and stored procedure.
- Developed mappings to load Fact and Dimension tables, SCD Type I and SCD Type II dimensions and Incremental loading and unit tested the mappings.
- Prepared low-level technical design documents; participated in the build/review of BTEQ, FastExport, MultiLoad, and FastLoad scripts; reviewed unit test plans and system test cases.
- Creating new and enhancing the existing stored procedure SQL used for semantic views and load procedures for materialized views.
- Created BTEQ scripts to extract data from EDW to the Business reporting layer.
- Developed BTEQ scripts for validation and testing of the sessions, data integrity between source and target databases and for report generation.
- Loaded data from various data sources into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, and FastLoad.
- Used the Teradata Administrator and Teradata Manager tools to monitor and control the systems.
- Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
- Involved in creating different types of reports including OLAP, Drill Down and Summary in BO.
- Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
- Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
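The source-to-target validation those BTEQ scripts perform (row counts, summed measures, key coverage) can be sketched outside SQL. The key and measure column names below are placeholders, not the actual EDW schema:

```python
def reconcile(source_rows, target_rows, key, amount_col):
    """Compare row counts, a summed measure, and key coverage between
    source and target -- the checks a BTEQ validation script runs as SQL."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_sum = sum(r[amount_col] for r in source_rows)
    tgt_sum = sum(r[amount_col] for r in target_rows)
    if src_sum != tgt_sum:
        issues.append(f"sum({amount_col}) mismatch: {src_sum} vs {tgt_sum}")
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        issues.append(f"keys missing in target: {sorted(missing)}")
    return issues  # empty list means source and target agree
```

In practice each check is a BTEQ SELECT against the two tables, with the script branching on the comparison result to flag the load as passed or failed.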
Informatica ETL Prod Support
Technical Environment: Informatica Power Center 8.6.1/8.1.1, Oracle 10g, TOAD 10.1 for Oracle, DB2, Flat Files, PL/SQL, OBIEE 11g, ERWIN, Windows 2000, UNIX PERL scripting, Autosys
Responsibilities:
- Designed, developed and documented the ETL (Extract, Transformation and Load) strategy to populate the Data Mart from the various source systems.
- Worked on Informatica 8.6.1 client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
- Involved in design and development of complex ETL mappings.
- Implemented partitioning and bulk loads for loading large volume of data.
- Based on the requirements, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Sequence Generator in the mappings.
- Developed Mapplets, Worklets and Reusable Transformations for reusability.
- Identified performance bottlenecks and Involved in performance tuning of sources, targets, Mappings, transformations and sessions to optimize session performance.
- Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
- Performance tuning by session partitions, dynamic cache memory and index cache.
- Developed Informatica SCD type-I, Type-II mappings. Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures, Update Strategy, Mapplets and others.
- Implemented update strategies, incremental loads, change data capture, and incremental aggregation.
- Created Stored Procedures in PL/SQL.
- Created UNIX Shell scripts to automate the process.
- Developed Documentation for all the routines (Mappings, Sessions and Workflows).
- Involved in scheduling the workflows through Autosys Job scheduler using UNIX scripts.
- Played a key role in all the testing phases and responsible for production support as well.
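The incremental-load pattern above (process only rows changed since the last run, then advance a watermark) can be sketched as follows. The updated_at column name is an assumption; in the mappings this role is played by a mapping variable persisted between runs:

```python
def incremental_extract(rows, last_watermark):
    """Select only rows changed since the previous run and return the
    new watermark, mirroring a mapping-variable-driven delta load."""
    delta = [r for r in rows if r["updated_at"] > last_watermark]
    # Advance the watermark to the newest change seen; if nothing
    # changed, keep the old watermark so no rows are skipped or redone.
    new_wm = max((r["updated_at"] for r in delta), default=last_watermark)
    return delta, new_wm
```

The returned watermark is stored (in the repository's persisted mapping-variable value, or a parameter table) and fed back in as last_watermark on the next daily run.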
Informatica Developer
Technical Environment: Informatica Power Center 8.5.x, ETL, Oracle 9i, SQL Server2000, MS Access, SQL, PL/SQL, Windows NT
Responsibilities:
- Interacted with business analysts, data architects, application developers to develop a data model.
- Designed and developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica Power Center 8.5.x tool.
- Created sessions, database connections and batches using Informatica Server Manager/Work flow Manager.
- Optimized mappings, sessions/tasks, source and target databases as part of the performance tuning.
- Designed ETL Mappings and Workflows using power center designer and workflow manager.
- Involved in the extraction, transformation and loading of data from source flat files and RDBMS tables to target tables.
- Developed various Mappings, Mapplets and Transformations for data marts and Data warehouse.
- Used Shell Scripting to automate the loading process.
- Used Excel VBA macros to compare data as a proof of concept for unit testing.
- Used Pipeline Partitioning feature in the sessions to reduce the load time.
- Analyzed and Created Facts and Dimension Tables.
- Used Informatica features to implement Type I and II changes in slowly changing dimension tables.
- Used the command-line program pmcmd to communicate with the server: starting and stopping sessions and batches, stopping the Informatica server, and recovering sessions.
- Designed processes to extract, transform and load data to the Data Mart.
- Performed impact analysis, remedial actions and documented on process, application failures related with Informatica ETL and Power Analyzer.
- Performed regular backup of data warehouse, backup of various production, development repositories including automating and scheduling processes. As part of optimization process, performed design changes in Informatica mappings, transformations, sessions.
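A scheduler-side wrapper around the pmcmd usage described above might assemble a startworkflow command like this. The service, domain, folder, and workflow names are hypothetical, and the sketch assumes pmcmd's -pv flag, which reads the password from the named environment variable rather than putting it on the command line:

```python
def pmcmd_start(service, domain, user, pwd_env, folder, workflow):
    """Build a pmcmd startworkflow command line (as an argv list ready
    for a process launcher); -wait blocks until the workflow finishes,
    which lets the scheduler see the real exit status."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,     # Integration Service name
        "-d", domain,       # Informatica domain
        "-u", user,
        "-pv", pwd_env,     # env var holding the password
        "-f", folder,       # repository folder
        "-wait",
        workflow,
    ]
```

A UNIX wrapper script built this way is what a scheduler such as Autosys invokes, so a non-zero pmcmd exit code surfaces as a failed job.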