
Sr. Etl Informatica Developer Resume

Livonia, MI

SUMMARY:

  • 8 years of experience in Information Technology with a strong background in Database development and Data warehousing.
  • Good experience in the design and implementation of data warehousing and Business Intelligence solutions using ETL tools such as Informatica PowerCenter, Informatica Developer (IDQ), and Informatica PowerExchange.
  • Good understanding of the design, architecture, and implementation of Informatica MDM projects.
  • Created mappings in Mapping Designer to load data from various sources using transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, and Unstructured Data transformations.
  • Excellent Knowledge on Slowly Changing Dimensions (SCD Type1, SCD Type 2, SCD Type 3), Change Data Capture, Dimensional Data Modeling, Ralph Kimball Approach, Star/Snowflake Modeling, Data Marts, OLAP and FACT and Dimensions tables, Physical and Logical data modeling.
  • Experience in developing Transformations, Mapplets and Mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to target.
  • Experience working with the Informatica Data Quality tool, using the Address Validator transformation for AddressDoctor implementations.
  • Worked in different phases of the projects involving Requirements Gathering and Analysis, Design, Development, Testing, Deployment and Support.
  • Good Knowledge on Database architecture of OLTP and OLAP applications, Data Analysis.
  • Worked with wide variety of sources like Relational Databases, Flat Files, XML files, Mainframes, Salesforce Objects, Unstructured data files and Scheduling tools like CA7, Control-M and Informatica Scheduler.
  • Experience working with various Informatica concepts like partitioning, Performance tuning, identifying bottlenecks, deployment groups, Informatica scheduler and more.
  • Strong skills and a clear understanding of requirements and solutions to implementation issues throughout the Software Development Life Cycle (SDLC).
  • Hands on experience in PL/SQL (Stored Procedures, Functions, Packages, Triggers, Cursors, Indexes), UNIX Shell scripting and Windows Batch scripting.
  • Expertise in doing Unit Testing, Integration Testing, System Testing and Data Validation for Developed Informatica Mappings.
  • Experience working in production support, supporting emergency fixes and migrating fixes from lower to higher environments per policy.
  • Very good exposure to Oracle 11g/10g/9i and MS SQL Server 2012/2008/2005.
  • Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.6.1/9.5.1/9.1/8.6.x, Informatica PowerExchange 9.5.1, Informatica Developer 9.6.1, Informatica Cloud

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server 2012/2008/2005, DB2, Teradata

Scheduling Tools: CA7 Scheduler, TWS (Tivoli), Informatica Scheduler, Control-M.

Reporting Tools: MicroStrategy and Hyperion Essbase.

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, Java, Shell, Perl, Batch

Operating Systems: Windows 7/XP/NT/2000, DOS, UNIX and LINUX

Other Tools: SQL*Plus, Toad, SQL Developer, PuTTY, WinSCP, MS Office.

PROFESSIONAL EXPERIENCE:

Confidential

Livonia, MI

Sr. ETL Informatica Developer

Technical Environment: Informatica 9.5.1, Oracle 11g, SQL Server 2008, PowerExchange 9.5.1, flat files, Mainframe, Teradata, Toad for Oracle 11, Harvest version control tool, Windows Server, UNIX, Perl scripting, and CA7 Scheduler

Responsibilities:

  • Analyzed business requirements, technical specifications, source repositories, and physical data models for project activities.
  • Created high-level documents, source-to-target mapping documents, and detailed design documents for the entire ETL process.
  • Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, Salesforce, HL7, EPIC, XML, and flat files.
  • Project involved usage of most of the transformations like Transaction Control, Active and Passive look up transformation, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Unstructured data transformation, SQL transformation and more.
  • Implemented incremental/delta loads extensively using mapping variables, mapping parameters, and a parameter table.
  • Designed ETL code to support full loads for the initial run and incremental/delta loads for subsequent daily runs.
  • Developed mappings to load data into the landing, staging, and publish layers with extensive use of SCD Type I and SCD Type II concepts.
  • Developed SCD Type I and Type II logic using an MD5 hash function to detect changed records.
  • Experience working with B2B data transformation concept of Informatica.
  • Experience working with advanced Informatica transformation like unstructured data transformation for parsing HL7 data file.
  • Worked in Informatica Data Quality to create a mapplet for validating and cleansing addresses using the Address Validator transformation.
  • Exported mapplets from IDQ into Informatica PowerCenter for use in various mappings implementing AddressDoctor.
  • Hands on experience working on profiling data using IDQ.
  • Extracted data from and loaded data directly into Salesforce objects using Informatica PowerCenter.
  • Worked with various session properties to extract data from Salesforce objects using the standard API and the Bulk API.
  • Created new and enhanced existing stored procedure SQL used for semantic views and load procedures for materialized views.
  • Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
  • Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
  • Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
  • Loaded data from Unstructured file format using unstructured data transformation into Oracle database.
  • Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
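The MD5-hash change detection behind the SCD Type I/II bullets above can be sketched as a small shell script: hash the non-key attributes of each incoming row and compare against the hash saved from the previous run. The file layout, names, and two-row sample here are hypothetical stand-ins for the real extract and hash store.

```shell
#!/bin/sh
# Sketch of MD5-based change detection for an SCD Type II load.
# Each extract line is "key|attributes..."; hash the attribute part
# and compare with the hash stored from the previous run.

# Previous run's hashes: key|md5(attributes)  (hypothetical sample)
printf '100|%s\n' "$(printf '%s' 'Smith|MI' | md5sum | cut -d' ' -f1)" > prev_hashes.txt

# Today's extract: row 100 changed state; row 200 is brand new.
printf '100|Smith|OH\n200|Jones|MI\n' > extract.txt

: > actions.txt
while IFS= read -r line; do
    key=${line%%|*}                       # natural key = first field
    attrs=${line#*|}                      # everything after the key
    new_hash=$(printf '%s' "$attrs" | md5sum | cut -d' ' -f1)
    old_hash=$(grep "^${key}|" prev_hashes.txt | cut -d'|' -f2)
    if [ -z "$old_hash" ]; then
        echo "INSERT $key" >> actions.txt          # new business key
    elif [ "$old_hash" != "$new_hash" ]; then
        echo "EXPIRE+INSERT $key" >> actions.txt   # changed: close old version, add new
    fi                                             # unchanged rows: no action
done < extract.txt
cat actions.txt
```

In PowerCenter the same comparison would live in an Expression transformation feeding an Update Strategy; the script only illustrates the hash-compare decision itself.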

Confidential

Cleveland, OH

ETL Technical Onsite Lead

Technical Environment: Informatica 9.5.1, Oracle 11g, SQL Server 2008, PowerExchange 9.5.1, flat files, Mainframe, Teradata, Toad for Oracle 11, Harvest version control tool, Windows Server, UNIX, Perl scripting, and CA7 Scheduler

Responsibilities:

  • Analyzed business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
  • Worked extensively on various transformations such as Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure, and Sequence Generator.
  • Developed ETL Informatica mappings to load data into the staging area; extracted data from Mainframe files, flat files, and SQL Server and loaded it into the Oracle 11g target database.
  • Created workflows and worklets for Informatica mappings.
  • Wrote stored procedures and functions to perform data transformations and integrated them with Informatica programs and existing applications.
  • Wrote SQL overrides for the generated SQL queries in Informatica.
  • Performed unit testing to validate data from different data sources.
  • Developed workflows for dimension and fact loads based on daily/monthly runs.
  • Developed code to archive monthly data into history tables and effective use of the history table to load the data back into the system for a particular past month.
  • Developed Audit tables to keep the track of ETL Metrics for each individual run.
  • Used the Audit Balance Control concept to create parameter files dynamically for each workflow before its run.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; automated the drop and creation of partitions for partitioned tables in the Oracle database.
  • Migrated the ETL application from the development environment to the testing environment.
  • Performed data validation in the target tables using complex SQL to ensure all modules were integrated correctly.
  • Developed Informatica SCD type-I, Type-II mappings. Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures, Update Strategy, mapplets and others.
  • Implemented update strategies, incremental loads, Change Data Capture, and Incremental Aggregation.
  • Performed performance tuning for a better data migration process.
  • Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Automated the jobs using CA7 Scheduler.
  • Worked on the Connect:Direct (NDM) process to transfer files between servers.
  • Documented and presented production support documents for the developed components when handing over the application to the production support team.
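The dynamic parameter-file step from the Audit Balance Control bullet above can be sketched as a pre-run shell script. The folder, workflow, and parameter names below are hypothetical, and a real job would read RUN_ID and the other values from a control/audit table rather than hard-coding them.

```shell
#!/bin/sh
# Sketch: build an Informatica parameter file just before a workflow
# run, the way an Audit Balance Control pre-job might.
# Folder/workflow/parameter names are hypothetical.

WF=wf_daily_load
RUN_DATE=$(date +%Y-%m-%d)
RUN_ID=42                      # would normally come from an audit table

PARAM_FILE="${WF}.param"
{
    echo "[FOLDER_EDW.WF:${WF}]"       # section header: folder + workflow
    echo "\$\$RUN_ID=${RUN_ID}"        # mapping parameters consumed by the run
    echo "\$\$LOAD_DATE=${RUN_DATE}"
    echo "\$\$SRC_SCHEMA=STAGE"
} > "$PARAM_FILE"

cat "$PARAM_FILE"
```

The workflow is then started with this file as its parameter file, so each run picks up a fresh RUN_ID and load date without any code change.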

Confidential

Farmington Hills, MI

ETL Informatica developer

Technical Environment: Informatica Power Center 9.1/8.6.x, Power Exchange, PL/SQL, Oracle 10g/9i, SQL Server 2008/2005, Windows Server 2003, UNIX, Tivoli Scheduler, Toad

Responsibilities:

  • Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
  • Involved in creating Logical and Physical Data Models and creating Star Schema models.
  • Assisted in designing Star Schema to design dimension and fact tables.
  • Involved in developing star schema model for target database using ERWIN Data modeling.
  • Worked on Informatica PowerCenter - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created Complex mappings using Unconnected and connected Lookups and Aggregate and Router transformations for populating target table in efficient manner.
  • Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files
  • Created Mapplet and used them in different Mappings.
  • Used the Sorter transformation and dynamic Lookup caches.
  • Created events and tasks in the work flows using workflow manager.
  • Developed Informatica mappings, tuned them for better performance, and built business rules with PL/SQL procedures/functions to load data.
  • Created Schema objects like Indexes, Views and Sequences.
  • Provided daily production support for scheduled jobs, resolving and fixing failed jobs.
  • Created Stored Procedures, Functions and Indexes Using Oracle.
  • Worked on batch processing and file transfer protocols.
  • Performed performance tuning of long-running workflows.
  • Analyzed the enhancements coming from the individual Client for application and implemented the same.
  • Created technical documents.
  • Developed mappings for policy, claims dimension tables.
  • Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
  • Simulated real time transaction feed by fabricating transactions required for UAT and manually calculating expected results to generate expectancy going into the testing process.
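The dimension/fact mappings above boil down to a connected-Lookup pattern: resolve each fact row's natural key to the dimension's surrogate key before loading, and reject rows with no dimension match. A minimal shell sketch with hypothetical file names and sample rows:

```shell
#!/bin/sh
# Sketch of a surrogate-key lookup during a fact load (hypothetical
# policy dimension and claim fact; real mappings use a Lookup
# transformation against the dimension table).

# Policy dimension: natural_key|surrogate_key
printf 'P100|1\nP200|2\n' > dim_policy.txt

# Incoming fact rows: natural_key|claim_amount
printf 'P100|250\nP200|975\n' > fact_in.txt

: > fact_out.txt
while IFS= read -r line; do
    nk=${line%%|*}                       # natural key from the source row
    amount=${line#*|}
    sk=$(grep "^${nk}|" dim_policy.txt | cut -d'|' -f2)
    if [ -n "$sk" ]; then
        echo "${sk}|${amount}" >> fact_out.txt   # loadable fact row
    else
        echo "$line" >> rejects.txt              # no dimension match: reject
    fi
done < fact_in.txt
cat fact_out.txt
```

Rejected rows would normally be routed to an error table for reprocessing once the missing dimension member arrives.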
 

Confidential

ETL Informatica developer

Technical Environment: Informatica Power Center 8.x, PL/SQL, Oracle 9i, SQL Server 2008/2005, UNIX, Control-M, Toad, Putty

Responsibilities:

  • Studied and analyzed the mapping document, required source tables, data types, and required transformations based on business requirements and technical specifications.
  • Extracted data from SQL Server, Oracle, XML files, and flat files using Informatica PowerCenter.
  • Developed complex mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.
  • Developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows, and performed backup and recovery.
  • Used SQL overrides in Source Qualifier and Lookup transformations to improve mapping performance.
  • Developed reusable transformations and aggregations, and created target mappings containing business rules.
  • Extensively worked with various Passive transformations in Informatica Power Center like Expression Transformation, Sequence Generator, and Lookup Transformation.
  • Worked with various active transformations in Informatica Power Center like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation.
  • Created and configured workflows, worklets, and sessions to transport data to the target warehouse Oracle tables using Informatica Workflow Manager.
  • Created Connected and Unconnected lookup transformation for better performance of the mappings and sessions.
  • Developed simple & complex mappings using Informatica to load Dimension & Fact tables as per STAR Schema techniques.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations.
  • Created Materialized views for summary tables for better query performance.
  • Developed PL/SQL stored procedures and UNIX shell scripts for pre and post session commands, batch jobs.
  • Developed UNIX Shell Scripts and SQLs to get data from Oracle tables before executing Informatica workflows. 
  • Provided extensive changes as part of user acceptance testing and deployed mappings at the client site.
  • Performed extensive testing on the mappings and wrote queries in SQL to check if the data was loading to the dimension tables and fact tables properly.
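The post-load validation described above can be sketched as a shell check of the source row count against the count reported by the load. File names and the count source are hypothetical; a real script would query the dimension and fact tables via SQL*Plus rather than reading a log file.

```shell
#!/bin/sh
# Sketch of a post-load row-count validation step
# (hypothetical sample extract and load log).

printf 'r1\nr2\nr3\n' > src_extract.dat    # sample 3-row source extract
echo "rows_loaded=3" > load.log            # sample count from the load

src_count=$(grep -c '' src_extract.dat)            # rows in the extract
tgt_count=$(sed -n 's/^rows_loaded=//p' load.log)  # rows the load reported

if [ "$src_count" -eq "$tgt_count" ]; then
    echo "VALIDATION PASSED: $src_count rows"
else
    echo "VALIDATION FAILED: source=$src_count target=$tgt_count"
fi
```

Count reconciliation like this is only the first gate; the complex SQL checks mentioned above would then compare actual column values between source and target.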
