Sr. ETL Informatica Developer Resume

Dallas, TX

SUMMARY:

  • Over 8 years of experience in Information Technology, including Data Warehouse/Data Mart development using Informatica Power Center across industries such as Telecom, Banking, Healthcare, and Retail.
  • Expertise in implementing complex business rules by creating mappings, mapplets, and reusable transformations using Informatica Power Center 9.x/8.x/7.x.
  • Worked with a variety of DWH source and target systems, including Oracle, Teradata, DB2, MySQL, SQL Server, Sybase, flat files, and XML.
  • Experience with most of the transformations, including Expression, Router, Data Masking, Joiner, Connected and Unconnected Lookup, Filter, Aggregator, Update Strategy, Rank, Sorter, and Sequence Generator.
  • Strong in SQL and PL/SQL, with extensive hands-on experience in performance tuning of database queries.
  • Good understanding of relational and dimensional data modeling: Star and Snowflake schemas, normalization, and de-normalization.
  • Proficiency in data warehousing techniques such as data cleansing, Slowly Changing Dimensions (SCD), surrogate key assignment, and Change Data Capture (CDC).
  • Experience in unit testing and in working with QA teams on system testing.
  • Performed regression testing for Informatica and database upgrades.
  • Experience in debugging and error handling of Informatica code.
  • Good experience with Informatica performance tuning, identifying and resolving bottlenecks at various levels: sources, targets, mappings, and sessions.
  • Experience in writing UNIX shell scripts and enhancing existing ones.
  • Experience in using Teradata utilities: BTEQ, MultiLoad, FastLoad, and TPT.
  • Expert in coding Teradata SQL, stored procedures, macros, and triggers.
  • Worked with scheduling tools such as AutoSys, Control-M, Maestro, and the Informatica scheduler.
  • Experience working with Business Analysts and Data Modelers to understand business requirements and physical and logical data models.
  • Experience working under both Waterfall and Agile methodologies for DWH implementations.
  • Experience in 24x7 on-call rotation production support using various ticketing systems.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6.1/9.5.1/9.1.1/8.6.1/8.1.1.

Databases: Oracle 12c/11g/10g/9i/8i, Teradata 14/12, MS SQL Server 2012/2008, MySQL, DB2, Sybase.

Data Modeling: Dimensional Modeling, ER Modeling, Ralph Kimball Methodology, Bill Inmon Methodology, Star and Snowflake Schemas, Fact Tables, Dimension Tables, Physical and Logical Modeling, Normalization and De-normalization.

Job Scheduling: AutoSys, Control-M, Cron, Maestro, Informatica Scheduler.

Others: Toad, SQL Navigator, SQL Assistant, PuTTY.

Environment: Windows, Linux, Unix.

PROFESSIONAL EXPERIENCE:

Confidential, Dallas, TX

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in all phases of the SDLC, from requirements and design through development, testing, and production support.
  • Extensively used the Informatica client tools: Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Created sources and targets in a shared folder and developed reusable transformations, mapplets, and user-defined functions (UDFs) so these objects could be reused across mappings.
  • Developed mappings using the Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Data Masking, Stored Procedure, Sorter, and Sequence Generator transformations.
  • Created mappings implementing Slowly Changing Dimension Type 1 and Type 2 business logic and capturing records deleted in the source systems (see the SCD Type 2 sketch after this list).
  • Worked on migration projects moving data warehouses from Oracle/DB2 to Teradata.
  • Handled high-volume datasets from various sources, including Oracle, Teradata, flat files, and XML files.
  • Used the Data Masking transformation to protect confidential data while loading it into the data warehouse.
  • Used the Debugger extensively to identify bottlenecks in the mappings.
  • Modified PL/SQL stored procedures used by Informatica mappings (see the procedure sketch after this list).
  • Created sessions and workflows to load data from the SQL Server, flat file, and Oracle sources residing on the server.
  • Upgraded Informatica from 9.5 to 9.6 and was responsible for validating objects in the new version.
  • Configured session properties to improve performance.
  • Involved in unit testing, integration testing, and user acceptance testing of the mappings.
  • Migrated Informatica objects from the Dev to the QA repository using SVN on UNIX.
  • Developed workflows and sessions and monitored them to ensure data was properly loaded into the target tables.
  • Responsible for scheduling workflows, error checking, production support, maintenance, and testing of ETL procedures using Informatica session logs.
  • Performed tuning of sources, targets, and mappings, along with SQL query optimization.
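
The SCD Type 2 logic above lived in Informatica mappings (Lookup plus Update Strategy transformations). As a rough equivalent, here is a minimal BTEQ sketch of the expire-and-insert pattern, including closing out rows deleted at the source; the server, staging table, and dimension columns are hypothetical, and the surrogate key is assumed to be a Teradata identity column.

    #!/bin/ksh
    # scd2_customer_dim.sh -- illustrative sketch; tdprod, STG_CUSTOMER,
    # CUSTOMER_DIM, and all columns are hypothetical. TD_PASSWORD is expected
    # in the environment.
    bteq <<EOF
    .LOGON tdprod/etl_user,$TD_PASSWORD;

    /* 1. Expire the current version of customers whose tracked attributes changed */
    UPDATE CUSTOMER_DIM
    SET    EFF_END_DT = CURRENT_DATE - 1, CURRENT_FLAG = 'N'
    WHERE  CURRENT_FLAG = 'Y'
    AND    CUSTOMER_ID IN (
           SELECT s.CUSTOMER_ID
           FROM   STG_CUSTOMER s
           JOIN   CUSTOMER_DIM d
             ON   d.CUSTOMER_ID = s.CUSTOMER_ID
            AND   d.CURRENT_FLAG = 'Y'
           WHERE  d.ADDRESS <> s.ADDRESS);

    /* 2. Insert a fresh current row for brand-new and just-expired customers
       (the surrogate key is assumed to be an identity column) */
    INSERT INTO CUSTOMER_DIM (CUSTOMER_ID, ADDRESS, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
    SELECT s.CUSTOMER_ID, s.ADDRESS, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   STG_CUSTOMER s
    LEFT JOIN CUSTOMER_DIM d
      ON   d.CUSTOMER_ID = s.CUSTOMER_ID
     AND   d.CURRENT_FLAG = 'Y'
    WHERE  d.CUSTOMER_ID IS NULL;

    /* 3. Capture source deletes: close out customers missing from today's extract */
    UPDATE CUSTOMER_DIM
    SET    EFF_END_DT = CURRENT_DATE - 1, CURRENT_FLAG = 'N'
    WHERE  CURRENT_FLAG = 'Y'
    AND    CUSTOMER_ID NOT IN (SELECT CUSTOMER_ID FROM STG_CUSTOMER);

    .LOGOFF;
    .QUIT;
    EOF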
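
The modified PL/SQL procedures were along these lines; the sketch below is a hypothetical audit-logging procedure of the sort a Stored Procedure transformation or session post-SQL might call (the schema, table, and procedure names are illustrative, and ORA_PASS is assumed to be in the environment).

    #!/bin/ksh
    # create_log_load_audit.sh -- illustrative sketch only.
    sqlplus -s etl_user/"$ORA_PASS"@DWPROD <<'SQL'
    CREATE OR REPLACE PROCEDURE log_load_audit (
        p_mapping   IN VARCHAR2,
        p_row_count IN NUMBER
    ) AS
    BEGIN
        -- One audit row per session run, called from a Stored Procedure
        -- transformation or as session post-SQL.
        INSERT INTO etl_load_audit (mapping_name, row_count, load_ts)
        VALUES (p_mapping, p_row_count, SYSTIMESTAMP);
        COMMIT;
    END log_load_audit;
    /
    EXIT
    SQL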

Environment: Informatica Power Center 9.6.1/9.5.1, Oracle 11g, DB2, Teradata 14/12, Flat Files, XML, Erwin 4.1.2, SQL Assistant, TOAD, Cron, Maestro, UNIX.

Confidential, San Francisco, CA

Sr.ETL Informatica Developer

Responsibilities:

  • Involved in designing the process for moving data from all source systems into the data warehousing system.
  • Played a major role in understanding the business requirements and in designing and loading the data warehouse (ETL).
  • Imported various application sources and created targets and transformations using the Informatica Power Center 9.5.1/9.1.1 Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer).
  • Utilized Informatica IDQ for initial data profiling and for matching and removing duplicate data.
  • Collected data source information from all legacy systems and existing data stores.
  • Involved in data extraction, transformation, and loading from source systems to the ODS.
  • Developed complex mappings using multiple sources and targets across different databases and flat files.
  • Used various transformations such as Unconnected/Connected Lookup, Aggregator, Data Masking, Expression, Joiner, Sequence Generator, and Router.
  • Responsible for developing Informatica mappings and tuning them for better performance.
  • Developed and scheduled workflows using the Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager, and monitored the results in the Workflow Monitor.
  • Expertise in using both connected and unconnected Lookup transformations.
  • Used TOAD and SQL Developer to develop and debug procedures and packages.
  • Worked with the reporting team, which used Business Objects, to address their needs from an ETL point of view.
  • Set up batches and sessions to schedule loads at the required frequency using the Power Center Workflow Manager.
  • Extensively used dynamic lookups to implement SCD Type 2 changes.
  • Implemented daily, weekly, and quarterly loads using an incremental loading strategy built on Change Data Capture (CDC).
  • Used pmcmd commands in UNIX scripts to schedule workflows and jobs (see the sketch after this list).
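
As a rough illustration of the two bullets above, here is a minimal UNIX sketch in which the last successful extract timestamp drives the CDC filter and pmcmd launches the workflow. The control table, parameter file path, domain, integration service, folder, and workflow names are all hypothetical.

    #!/bin/ksh
    # run_orders_incr.sh -- illustrative sketch; every name below is hypothetical
    # and credentials are expected in the environment.

    # 1. Read the CDC high-water mark recorded by the previous successful run.
    LAST_RUN=$(sqlplus -s etl_user/"$ORA_PASS"@DWPROD <<'SQL'
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0
    SELECT TO_CHAR(last_extract_ts, 'MM/DD/YYYY HH24:MI:SS')
    FROM   etl_load_control WHERE subject_area = 'ORDERS';
    SQL
    )

    # 2. Publish it as a mapping parameter; the Source Qualifier filter keeps
    #    only rows with updated_ts later than $$LAST_EXTRACT_TS.
    cat > /opt/infa/param/wf_orders_incr.parm <<PARM
    [DWH.WF:wf_orders_incr]
    \$\$LAST_EXTRACT_TS=$LAST_RUN
    PARM

    # 3. Launch the workflow with pmcmd and block until it completes.
    pmcmd startworkflow -sv INT_SVC -d DOM_PROD -u "$INFA_USER" -p "$INFA_PASS" \
          -f DWH -paramfile /opt/infa/param/wf_orders_incr.parm -wait wf_orders_incr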

Environment: Informatica Power Center 9.5.1/9.1.1, IDQ 9.1.1, Oracle 10g/11g, PL/SQL, DB2, Toad, Erwin, Business Objects 6.0, UNIX, Windows.

Confidential, Austin, TX

Sr. ETL Informatica Developer

Responsibilities:

  • Analyzed the Business Requirement Documents (BRD) and laid out the steps for data extraction, business-logic implementation, and loading into targets.
  • Responsible for impact analysis, including upstream/downstream impacts.
  • Created detailed technical specifications for the Data Warehouse and ETL processes.
  • Used Informatica as the ETL tool, together with stored procedures, to pull data from source systems and files, then cleanse, transform, and load the data into Teradata using the Teradata utilities.
  • Worked with the Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created and maintained inbound and outbound HL7 interfaces.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookup, Joiner, Update Strategy, and Stored Procedure.
  • Worked with different types of transformations to load data from different source types, such as XML, HL7, and flat files, into targets per the business requirements.
  • Processed claims through EDI 837 files into the FACETS system and worked on scenarios covering the complete claims lifecycle.
  • Analyzed HL7 messages for compliance against the standard specification, identified issues, and presented them to the client for resolution.
  • Extensively used pre-SQL and post-SQL scripts for loading data into the targets according to the requirements.
  • Developed mappings to load fact and dimension tables, SCD Type 1 and Type 2 dimensions, and incremental loads, and unit tested the mappings.
  • Upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version.
  • Involved in initial, incremental, and daily loads to ensure data was loaded into the tables in a timely and appropriate manner.
  • Worked extensively on performance tuning of Teradata SQL, ETL, and other processes to optimize session performance.
  • Loaded data into Teradata tables using the BTEQ, FastLoad, MultiLoad, FastExport, and TPT utilities (see the FastLoad sketch after this list).
  • Worked extensively with index, data, and lookup caches (static, dynamic, and persistent) while developing mappings.
  • Created reusable transformations, mapplets, and worklets using the Transformation Developer, Mapplet Designer, and Worklet Designer.
  • Integrated data into a centralized location using migration, redesign, and evaluation approaches.
  • Responsible for unit testing and integration testing, and helped with user acceptance testing.
  • Worked extensively with Informatica partitioning when dealing with huge data volumes, and partitioned tables in Teradata for optimal performance.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Maestro.
  • Managed post-production issues and delivered all assignments/projects within the specified timelines.
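
For illustration, a minimal FastLoad sketch of the kind of bulk staging load described above; the server, database, table, and file names are hypothetical. FastLoad requires an empty target table, which is why it suits staging loads; MultiLoad or TPT covered the non-empty targets.

    #!/bin/ksh
    # fload_claims_stg.sh -- illustrative sketch only; TD_PASSWORD is expected
    # in the environment.
    fastload <<EOF
    LOGON tdprod/etl_user,$TD_PASSWORD;
    DATABASE STG_DB;

    /* Error tables are recreated on each run */
    BEGIN LOADING STG_CLAIM ERRORFILES STG_CLAIM_ERR1, STG_CLAIM_ERR2;

    SET RECORD VARTEXT "|";
    DEFINE claim_id  (VARCHAR(18)),
           member_id (VARCHAR(18)),
           claim_amt (VARCHAR(18))
    FILE = /data/inbound/claims_daily.dat;

    INSERT INTO STG_CLAIM (claim_id, member_id, claim_amt)
    VALUES (:claim_id, :member_id, :claim_amt);

    END LOADING;
    LOGOFF;
    EOF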

Environment: Informatica Power Center 9.1.1/8.6.1, Flat Files, Oracle 11g, Teradata 12/13, SQL, PL/SQL, TOAD, SQL Assistant, HL7, Windows XP, Unix, Maestro, SVN.

Confidential, Beaverton, OR

Informatica Developer

Responsibilities:

  • Responsible for a complete life-cycle implementation encompassing business requirement gathering, source system analysis, and building a new data mart to support the reporting needs.
  • Implemented and governed the corporate naming standards across the project.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts within the fact tables.
  • Used an accumulating snapshot grain to handle multiple date keys and late-arriving fact rows.
  • Performed data profiling to understand the data and how different entities relate to each other.
  • Responsible for design reviews of mappings against the high-level design documentation and the project's standards; tuned Informatica sessions by increasing cache sizes and the target-based commit interval.
  • Performed data migration in which relational data was extracted from different sources (flat files, Excel sheets, XML files), transformed, and loaded into the target Oracle database without any data loss.
  • Used transformations such as Joiner, Stored Procedure, Router, Aggregator, Source Qualifier, Lookup, and Expression.
  • Performed data profiling, cleansing, and scrubbing operations during transformation using the Expression transformation.
  • Created mapplets, workflows, and sessions using sources, targets, and transformations to run the mappings in the desired order.
  • Performed database tuning to improve performance by creating and modifying tablespaces (see the sketch after this list).
  • Performed unit testing and user acceptance testing, and documented the test cases for UAT.
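
As a rough sketch of that tablespace work (the tablespace names, datafile paths, and sizes are hypothetical):

    #!/bin/ksh
    # add_mart_tablespace.sh -- illustrative sketch only.
    sqlplus -s "/ as sysdba" <<'SQL'
    -- Dedicated, locally managed tablespace for the data mart's fact tables
    CREATE TABLESPACE mart_fact_ts
      DATAFILE '/u02/oradata/DWPROD/mart_fact01.dbf' SIZE 4G
      AUTOEXTEND ON NEXT 512M MAXSIZE 16G
      EXTENT MANAGEMENT LOCAL SEGMENT SPACE MANAGEMENT AUTO;

    -- Let an existing datafile grow instead of failing loads with ORA-01653
    ALTER DATABASE DATAFILE '/u02/oradata/DWPROD/mart_dim01.dbf'
      AUTOEXTEND ON NEXT 256M MAXSIZE 8G;
    EXIT
    SQL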

Environment: Informatica Power Center 8.6.1/8.1.1, SQL Server 2008/2005, Oracle, Business Objects, UNIX shell Scripting, Erwin, Toad.

Confidential

Informatica Developer

Responsibilities:

  • Identified the exact business requirements by interacting with the business analysts and other management through JAD sessions.
  • Followed Agile methodology throughout the data design process.
  • Designed conceptual models for the flow of data between different systems.
  • Added enhancements to the data model following a Star schema and the Ralph Kimball methodology.
  • Applied data profiling techniques to analyze the content, quality, and structure of data from different sources and make initial assessments.
  • Provided data cleansing techniques usable across multiple systems, including billing, customer service centers, and e-channels.
  • Applied source-to-target mapping and generated a mapping matrix for the transformations.
  • Provided sophisticated data management capabilities to ensure consistency and integrity of data under demanding legislation such as SOX compliance.
  • Performed customer profiling using a number of different classifications, helping the organization target relevant customers with product and service offers, retain existing customers, and add new ones.
  • Managed the metadata controlling the flow of data to different systems, which helped the organization control fraud and supported detection of subscription fraud by building customer behavior profiles.
  • Debugged mappings in Informatica using the Debugger wizard to observe the flow of data under different test cases for different types of data.
  • Implemented pipeline partitioning (hash key, key range, round-robin, and pass-through) to improve session performance.
  • Extensively used Power Center capabilities such as file lists, pmcmd, target load order, constraint-based loading, and concurrent lookup caches.
  • Created UNIX scripts for migrating data between the source and target databases (see the sketch after this list).
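
A minimal sketch of such a migration script, with hypothetical connect strings, table names, and paths: the source table is unloaded to a pipe-delimited file with SQL*Plus, then bulk-loaded into the target with SQL*Loader.

    #!/bin/ksh
    # migrate_customers.sh -- illustrative sketch only; credentials are expected
    # in the environment.

    EXTRACT=/data/migrate/customers.dat

    # Unload the source table to a pipe-delimited flat file.
    sqlplus -s src_user/"$SRC_PASS"@SRCDB <<SQL
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 400 TRIMSPOOL ON
    SPOOL $EXTRACT
    SELECT customer_id || '|' || customer_name || '|' || city FROM customers;
    SPOOL OFF
    EXIT
    SQL

    # Control file telling SQL*Loader how to parse the extract.
    cat > /data/migrate/customers.ctl <<'CTL'
    LOAD DATA
    INFILE '/data/migrate/customers.dat'
    APPEND INTO TABLE customers
    FIELDS TERMINATED BY '|'
    (customer_id, customer_name, city)
    CTL

    # Bulk-load into the target database; stop the job on any error.
    sqlldr tgt_user/"$TGT_PASS"@TGTDB control=/data/migrate/customers.ctl \
           log=/data/migrate/customers.log bad=/data/migrate/customers.bad || exit 1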

Environment: Informatica 8.1.1, Oracle 9.2, TOAD, UNIX, Windows XP.
