ETL - Informatica Consultant Resume
Detroit, Michigan
SUMMARY
- IT professional with 8+ years of experience in the development and implementation of data warehousing with Informatica Power Center, OLTP, and OLAP, using data extraction, data transformation, data loading, and data analysis.
- Over 6 years of experience in BI, with extensive use of Informatica server and client tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Extensive experience in the design and development of ETL in Informatica 9.1/9.5.
- Experience in Informatica Metadata Manager (MDM), Power Exchange, Change Data Capture, and Enterprise Data Quality.
- Involved in the full life cycle of the Operational Data Store (ODS) and data warehouse (DWH).
- Extensively used ETL to load data from Oracle 10g/11g, MS SQL Server 2008/2012, flat files, XML files, and Teradata.
- Worked with Waterfall and Agile/Scrum methodologies.
- Experience in data profiling using frequency reports in IIR (Informatica Identity Resolution).
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions by identifying performance bottlenecks and applying techniques such as partitioning and pushdown optimization.
- Expertise in working with various sources such as Oracle 11g/10g/9i/8.x, SQL Server 2008/2005, DB2 8.0/7.0, UDB, Netezza, Teradata, flat files, XML, COBOL, and mainframe.
- Over 4 years of IT experience in planning, installing, engineering, and administering Red Hat Linux.
- Good understanding of database and data warehousing concepts (OLTP and OLAP).
- Designed, tested, and deployed plans using Informatica Data Quality (IDQ) 8.5.
- Experience in large-scale data integration and data warehouse solutions with databases exceeding 10 TB.
- Practical understanding of Star Schema and Snowflake Schema methodologies using the data modeling tool Erwin 4.0/4.2.
- Experienced in data validation and data reconciliation.
- Knowledge of writing, testing, and implementing stored procedures, functions, and triggers using Oracle PL/SQL and T-SQL, and of loading the Teradata data warehouse using BTEQ, MultiLoad, and FastLoad scripts.
- Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
- Used Power Exchange to integrate the sources like Mainframe VSAM, DB2 and XML files.
- Created dashboard-style layouts using sheet objects such as list boxes, multi boxes, sliders, current selections boxes, buttons, charts, text objects, and bookmarks; also generated various complex reports using Business Objects 5.1/6.1 (Supervisor, Designer, and WebIntelligence 2.5/2.6) and MicroStrategy.
- Good understanding of failure and recovery strategies.
- Expertise in all phases of testing, including Unit Testing, Functional Testing, System Testing, Regression Testing, End-to-End Testing, Usability Testing, Load/Volume Testing, Performance Testing, and User Acceptance Testing.
TECHNICAL SKILLS
Data Warehousing ETL: Informatica Power Center 6.x/7.x/8.x/9.x, Power Exchange, Informatica Data Quality, Informatica Metadata Manager
RDBMS: Oracle 11g/9i/8i, MS SQL Server 2008/2005, Teradata 14
Database Languages: SQL, PL/SQL, Unix, Shell Scripting
Data Modeling Tools: Erwin 4.1, SQL Data Modeler, Microsoft Visio
Development Process Models: Waterfall, Agile, Spiral, Prototype Model
Office Package: MS Office 2013/2010/2007/XP/2003/2000
Environment: Unix, Windows XP, Sun Solaris
Programming Languages: C
Scheduler Tools: Autosys, BMC Control-M V8.0
Reporting tools: Business Objects
Database Utilities: Toad, Data Mirror, PL/SQL Developer
PROFESSIONAL EXPERIENCE
Confidential, Detroit, Michigan
ETL - Informatica Consultant
Responsibilities:
- Analyze existing DB2 stored procedures and translate them to Teradata stored procedures, BTEQ scripts, and TPT scripts (a brief sketch of this conversion pattern follows this list).
- Analyze existing Informatica code built against DB2 and convert the mappings and sessions to the Teradata database.
- Extensive experience in installation, configuration, and administration of Red Hat Enterprise Linux 5.x and 6.x on physical and virtual machines.
- Expert in building Red Hat Linux physical and virtual servers and migrating or upgrading Linux servers from one OS release to another.
- Involved in a POC to evaluate the feasibility and effort of implementing Informatica MDM as opposed to utilizing the existing NextGate EMPI and Provider Registry big data solution.
- Designed, developed, and implemented optimal ETL solutions for automation of client information transformation and movement, using Talend.
- Developed technical Best practices for ETL related activities, including client data movement, quality, and cleansing using Talend.
- Worked extensively with dates in PL/SQL; dates are a relatively complex scalar data type in both SQL and PL/SQL.
- Build fault-tolerant, self-healing, adaptive, and highly accurate ETL platforms.
- Tune the performance of queries running over billions of rows of data in an Oracle environment.
- Provide data engineering expertise to multiple teams across the organization.
- Good experience in bulk data integration and transformation, real-time data integration and replication, and data quality and governance using Oracle Data Integrator (ODI).
- Experience in implementation and maintenance of VMware, DNS, DHCP, NIS, NFS and SMTP.
- Develop framework for the Data Loads, Audits and Controls.
- Communicate report changes and work closely with the reporting team.
- Participate in the JAD meetings and document the requirement and solutions.
- Write UNIX scripts to check data and partition values in Hadoop tables.
- Performance-tune complex stored procedures.
- Utilize Informatica Data Quality (IDQ) for data profiling, matching and removing duplicate data, and fixing bad data and NULL values.
- Work with Informatica IDQ (Data Analyst, Developer) and various data profiling techniques to cleanse and match/remove duplicate data, in conjunction with IBM InfoSphere.
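Below is a minimal sketch of the kind of DB2-to-Teradata stored procedure conversion referenced at the top of this list. The procedure, table, and column names (dw.load_orders, stg.stg_orders, dw.dw_orders) are hypothetical, and the body is illustrative rather than actual client code.

```sql
-- Hypothetical Teradata rewrite of a DB2 upsert routine.
-- All object names are illustrative only.
REPLACE PROCEDURE dw.load_orders()
BEGIN
    -- MERGE covers the insert/update logic that the original DB2 procedure
    -- handled with separate UPDATE and INSERT statements;
    -- order_id is assumed to be the target table's primary index.
    MERGE INTO dw.dw_orders tgt
    USING stg.stg_orders src
        ON (tgt.order_id = src.order_id)
    WHEN MATCHED THEN UPDATE SET
        order_status = src.order_status,
        updated_ts   = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
        (order_id, order_status, created_ts, updated_ts)
    VALUES
        (src.order_id, src.order_status, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP);
END;
```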
Environment: Informatica Power Center 9.5/9.6, Oracle 11g, Teradata 14, IDQ, SQL Server, DB2, Control-M, Hadoop (Hive, Pig, HUE), Business Objects, UNIX, JIRA.
Confidential, Atlanta, GA.
Sr. Informatica Consultant / ETL Designer
Responsibilities:
- Developed complex Informatica mappings to load the data from various sources using different transformations like Source qualifier, Connected and Unconnected Lookup, Expression, Aggregator, Joiner, Filter, Normalizer, Rank and Router Transformations.
- Created the Test Plans, Test Cases for the project.
- Worked with Informatica Power Center tools such as Source Analyzer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Developed Informatica mappings and tuned them for better performance.
- Developed complex mappings and mapplets using the Informatica Designer to integrate data from varied sources like Teradata, Oracle, and flat files and load it into the target.
- Designed and developed data warehouse solutions, integrated new files and data from other applications into the enterprise data warehouse, and utilized Teradata for reporting, OLAP, and history.
- Developed Procedures and Functions in PL/SQL.
- Used Stored Procedure transformations to invoke Oracle PL/SQL procedures.
- Designed and Developed ETL logic for implementing CDC by tracking the changes in critical fields required by the user.
- Extensively used Informatica to load data from Flat files, Oracle database.
- Extensively performed Data Masking for preserving the referential integrity of the user data.
- Performed Data Encryption on user data and client data for maintaining consistency and security.
- Responsible for Performance Tuning at the Mapping Level and Session level.
- Worked with SQL Override in the Source Qualifier and Lookup transformation.
- Extensively worked with both Connected and Unconnected Lookup Transformations.
- Load balancing of ETL processes, database performance tuning and capacity monitoring.
- Used UNIX scripts to create parameter files and for real-time applications.
- Developed shell scripts.
- Extensively involved in testing the system end to end to ensure the quality of the adjustments made to accommodate the source system upgrade.
- Worked with many existing Informatica mappings to produce correct output.
- Prepared detailed design documentation for the production support department to use as a reference guide for future production runs before the code was migrated.
- Prepared a unit test plan and thorough unit test documentation, along with unit test cases for the developed code.
- Prepared SQL scripts for validation of the ETL business logics involved.
- Prepared the SQL scripts and validated the data shown in the reports.
- Responsible for scheduling the Test status calls with users and the project management team.
- Prepared the test sign off documents.
- Created detailed system defect records to keep the project team informed of status throughout the process.
- Extensively worked with various lookup caches like Static Cache, Dynamic Cache and Persistent Cache.
- Used Update Strategy with DD_INSERT and DD_UPDATE to insert and update data when implementing Slowly Changing Dimension logic.
- Developed Re-Usable Transformations and Re-Usable Mapplets.
- Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs (an equivalent SQL sketch of the Type 2 logic follows this list).
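As referenced above, here is a rough SQL equivalent of the Type 2 SCD pattern built in the mappings, where DD_UPDATE expires the current dimension row and DD_INSERT adds the new version. The table and column names (dim_customer, stg_customer, address) are hypothetical, and the logic is a sketch rather than the project's actual code.

```sql
-- Step 1 (DD_UPDATE path): expire the current row when a tracked attribute changes
UPDATE dim_customer d
   SET current_flag = 'N',
       effective_end_dt = CURRENT_DATE
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);

-- Step 2 (DD_INSERT path): insert a new current version for brand-new customers
-- and for customers whose row was just expired in Step 1
INSERT INTO dim_customer
    (customer_id, address, effective_start_dt, effective_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
  LEFT JOIN dim_customer d
    ON d.customer_id = s.customer_id
   AND d.current_flag = 'Y'
 WHERE d.customer_id IS NULL;   -- no current row: either new or expired above
```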
Environment: Informatica Power Center 9.1.0/8.6.1, Informatica Data Quality (IDQ) 9.1, Flat Files, Mainframe Files, Oracle 11i, Netezza, Quest Toad Central 9.1, Unix Shell Scripting, SQL Server 2005, SQL Server 2008, Salesforce.com, Web Services.
Confidential, Atlanta, GA
ETL - Informatica Developer
Responsibilities:
- Worked as Informatica lead for ETL projects to design and develop Informatica mappings.
- Worked with Informatica IDQ (Data Analyst, Developer) and various data profiling techniques to cleanse and match/remove duplicate data.
- Worked with cleanse, parse, standardization, validation, and scorecard transformations.
- Involved in analyzing, defining, and documenting data requirements by interacting with the client and Salesforce team for the Salesforce objects.
- Worked with Informatica Power Exchange as well as Informatica Cloud to load data into Salesforce.com.
- Worked on Informatica Cloud to create source/target SFDC connections and to monitor and synchronize data in Salesforce.com.
- Worked on SFDC session log error files to look into the errors and debug the issue.
- Created and edited custom objects and custom fields in Salesforce and checked the field level Securities.
- Created web service mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
- Worked extensively with Netezza scripts to load data from flat files into the Netezza database.
- Used NZSQL scripts and NZLOAD commands to load the data (a minimal NZSQL sketch follows this list).
- Involved in all phases of the SDLC, i.e., design, code, test, and deploy ETL components of the data warehouse and integrated data mart.
- Extensively worked with Teradata database using BTEQ scripts.
- Worked with the FastLoad, MultiLoad, and TPump utilities to load data into Teradata.
- Created Informatica mappings using various transformations such as XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router in the Informatica Designer.
- Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.
- Used UNIX scripts for file management as well as in FTP process.
- Worked closely with DBAs, application, database, and ETL developers, and change control management to migrate developed mappings to PROD.
- Provided production support for the Informatica processes, troubleshooting and debugging any errors.
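A minimal NZSQL sketch of the flat-file-to-Netezza load pattern mentioned above, using Netezza's transient external table syntax. The file path, table name, and options (stg_sales, /data/inbound/sales_extract.dat) are hypothetical and would normally be driven by the NZLOAD/NZSQL scripts rather than hard-coded.

```sql
-- Hypothetical Netezza external-table load: stream a delimited flat file
-- straight into a staging table (names and paths are illustrative only)
INSERT INTO stg_sales
SELECT *
  FROM EXTERNAL '/data/inbound/sales_extract.dat'
       SAMEAS stg_sales      -- take the column layout from the target table
 USING (
     DELIMITER '|'           -- pipe-delimited extract
     SKIPROWS 1              -- skip the header record
     MAXERRORS 10            -- tolerate a few bad rows before aborting
     LOGDIR '/data/logs'
 );
```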
Environment: Informatica Power Center 9.1.0/8.6.1, Informatica Data Quality (IDQ) 9.1, Flat Files, Mainframe Files, Oracle 11i, Netezza, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, SQL Server 2008, Salesforce.com, Web Services.
Confidential, St. Louis, MO
ETL Developer
Responsibilities:
- Collaborated with business analysts on requirements gathering, business analysis, and the design of the Enterprise Data Warehouse.
- Involved in requirement analysis, ETL design, and development for extracting data from heterogeneous source systems such as MS SQL Server, Oracle, DB2, and flat files and loading it into the staging area and the Data Warehouse star schema.
- Involved in massive data cleansing prior to data staging.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Extensively used the Informatica client tools Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, and Workflow Manager.
- Designed and developed ETL routines using Informatica Power Center; within the mappings, extensively used Lookups, Aggregators, Rank transformations, Mapplets, connected and unconnected stored procedures/functions/lookups, SQL overrides and source filters in Lookups and Source Qualifiers, and Routers to manage data flow into multiple targets.
- Involved in data migration from one environment to another using the Informatica ETL tool, including data extraction, data cleansing, and data staging of operational sources.
- Developed complex mappings with shared objects, reusable transformations, and mapplets using mapping/mapplet parameters and variables.
- Implemented incremental load logic based on workflow variables such as run ID and the last workflow start and end dates (an SQL sketch of this pattern follows this list).
- Configured workflows with Email tasks to send mail with the session log on session failure and on target failed rows.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
- Developed Stored Procedures and used them in Stored Procedure transformation for data processing and have used data migration tools.
- Created sequential/concurrent sessions and batches for the data loading process and used pre- and post-session SQL.
- Scheduled Informatica workflows to run at regular intervals.
- Used SQL tools like Query Analyzer and TOAD to run SQL queries and validate the data.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
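An illustrative SQL override of the kind this incremental load logic typically drives, where $$LAST_RUN_TS and $$RUN_END_TS stand in for the workflow/mapping parameters mentioned above; the source table and column names (src_orders, last_updated_ts) are hypothetical.

```sql
-- Hypothetical incremental extract: pull only rows changed since the last run.
-- $$LAST_RUN_TS / $$RUN_END_TS are placeholder parameters; src_orders is illustrative.
SELECT order_id,
       customer_id,
       order_amount,
       last_updated_ts
  FROM src_orders
 WHERE last_updated_ts >  TO_TIMESTAMP('$$LAST_RUN_TS', 'YYYY-MM-DD HH24:MI:SS')
   AND last_updated_ts <= TO_TIMESTAMP('$$RUN_END_TS',  'YYYY-MM-DD HH24:MI:SS');
```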
Environment: Informatica Power Center 8.x, Oracle 11g, DB2, Microsoft Visio 2007, MS SQL Server 2008, Windows XP, UNIX.
Confidential, Plano, TX
ETL Developer
Responsibilities:
- Involved in the full development lifecycle from requirements gathering through development and support using Informatica Power Center Repository Manager, Designer, Server Manager, Workflow Manager and Workflow Monitor.
- Designed and developed complex mappings, mapplets, and reusable transformations using various transformations such as Source Qualifier, unconnected and connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, XML Parser/Generator, and more.
- Involved in writing various UNIX shell scripts to automate the scheduled queue process.
- Experience in integrating various data source definitions such as SQL Server, Oracle, and flat files.
- Involved in unit testing and system testing of the individual components.
- Analyzed existing system and developed business documentation on changes required.
- Performed detail data analysis of both data elements requested by the business and data source and documenting data source definition, source-to-target mapping, and logical structures for the data warehouse/data mart.
- Involved in the design, development, and administration of an Oracle data mart and the distribution of global data from the data warehouse.
- Good working experience in writing SQL and PL/SQL scripts, including views and materialized views.
- Experience in writing, testing, and implementing triggers, cursors, procedures, and functions at the database level using PL/SQL (a brief PL/SQL sketch follows this list).
- Experience in debugging and performance tuning of sources, targets, mappings, and sessions.
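A brief PL/SQL sketch of the kind of database-level routine described above; the procedure and table names (archive_old_orders, orders, orders_archive) are hypothetical and the body is only illustrative.

```sql
-- Hypothetical PL/SQL procedure: archive closed orders older than a cutoff date.
CREATE OR REPLACE PROCEDURE archive_old_orders (p_cutoff_date IN DATE) AS
BEGIN
    -- copy qualifying rows into the archive table
    INSERT INTO orders_archive (order_id, customer_id, order_amount, closed_date)
    SELECT order_id, customer_id, order_amount, closed_date
      FROM orders
     WHERE closed_date < p_cutoff_date;

    -- remove the archived rows from the active table
    DELETE FROM orders
     WHERE closed_date < p_cutoff_date;

    COMMIT;
END archive_old_orders;
/
```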
Environment: Informatica Power Center 8.x, UNIX, Core FTP, Putty, Oracle 11g, TOAD, Microsoft Visio 2007, MS SQL Server 2000/2005, Windows XP, UNIX.
Confidential
ETL - Informatica Developer
Responsibilities:
- Responsible for running and monitoring Informatica workflows.
- Responsible for issue analysis.
- Interacting with the Onsite team and Client for resolution of issues.
- Involved in understanding the requirements.
- Involved in development of mappings in Extraction and loading phases.
- Involved in implementing complex business rules using Informatica mappings.
- Participated in various Source code reviews.
- Worked on Informatica Power Center tools: Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
- Created the mappings using transformations such as Source Qualifier, Expression, Lookup, Router, Filter, and Update Strategy.
Environment: Informatica 7.1, Oracle 8.x, SQL, UNIX, Oracle tools (Export, Import, and SQL*Loader)