Informatica Consultant Resume
Denver, CO
SUMMARY
- Over 8 years of experience in data warehouse and business intelligence projects using Informatica as the ETL tool.
- Extensive experience with Informatica tools including Designer, Workflow Manager, Repository Manager, Monitor, Admin Console, Informatica Developer, Informatica Analyst, Informatica Data Quality, and Data Explorer.
- Extensively used ETL methodologies for data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica PowerCenter 9.x/8.x/7.x.
- Very good experience working with Informatica AddressDoctor.
- Knowledge of full life cycle development for building a data warehouse.
- Good understanding of relational database management systems such as Oracle and SQL Server, and ERP systems such as SAP. Extensive data integration work using Informatica for the extraction, transformation, and loading of data from various database source systems, SAP systems, and flat files.
- Worked on extracting data from SAP using Informatica PowerExchange ABAP technologies in PowerCenter mappings.
- Good experience working with the data quality tool Informatica Developer.
- Fair knowledge of the Cognos reporting tool.
- Strong knowledge of Data Warehousing concepts and Dimensional modeling.
- Good experience working in UNIX environments; hands-on expertise with shell scripts and scheduling workflows in UNIX.
- Good at data integration, data analysis, data profiling, data management, troubleshooting, reverse engineering, and performance tuning at various levels of the ETL process.
- Good experience handling large data volumes, complex business logic, and aggressive deadlines.
- Worked through all project development life cycle phases: analysis, design, coding and unit testing, performance tuning, production deployment, and maintenance.
- Strong communication skills and good experience communicating with customers.
- Have exposure to Open Source DW tools like Kettle, Pentaho Reporting, and Jasper Reports.
- Good experience with the offshore/onsite delivery model.
TECHNICAL SKILLS
ETL Tools: Informatica 8.1, 8.6.1, 9.1.0
Data Quality Tools: Informatica Developer (IDQ), Informatica Analyst.
Extraction Tech: PowerExchange for SAP (ABAP).
Reporting Tools: Cognos 8 Report Studio.
Databases: Oracle 9i, 10g, SQL Server.
Open Source Tools: Kettle, Pentaho Reporting, Jasper Reports and BIRT.
Operating Systems: Windows XP, UNIX, Sun Solaris.
Other Programming Skills: Java, HTML, C++.
Scheduling Tools: Maestro, Tidal scheduling.
Others: CVS, PVCS, StarTeam, HP Quality Center, SharePoint, TOAD, and Eclipse.
PROFESSIONAL EXPERIENCE
Confidential, Denver, CO
Informatica Consultant
Responsibilities:
- Met with the Confidential team to understand data quality issues and gathered all the rules that need to be applied to the data to maintain data quality.
- Designed the data quality rules and implemented them as reusable mapplets in Informatica Developer (IDQ) that can be used in any data quality mapping.
- Exported all the rules as mapplets into Informatica PowerCenter.
- Plugged in the rules that apply to a specific table and built the mapping for that table; developed several such mappings to implement all the data quality rules.
- Developed data synchronization mappings that compare data from the source system to the target system, and built a system to store detailed information about discrepancies between source and target, which helps users analyze data quality issues.
- Built a data quality framework to execute around 300 custom rules iteratively.
- Loaded all the rules into a metadata-driven table that controls the execution of the rules one after the other.
- Performed data profiling on business-critical tables using the Informatica Analyst service 9.1.0; the profiled data helps users analyze patterns in the data and define additional rules.
- Developed a web service that consumes addresses and cleanses the address data using AddressDoctor; it is designed to be used by any application across the organization.
- Used Google's geocoding service to obtain geocoding information for each address.
- Scheduled all the jobs using the PowerCenter scheduler in Workflow Manager.
- Supported all migration activity from development to QA to production.
- Supported and maintained all data quality rules deployed in production.
- Documented mapping specifications and test plans for all developed code, along with knowledge-transition documents and plans.
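The metadata-driven rule framework described above can be sketched roughly as follows. This is a minimal illustration, not the actual project code: it assumes a control table that stores each data quality rule as a SQL query plus an execution order, with a driver that runs the rules one after the other. All table, column, and rule names are hypothetical.

```python
import sqlite3

# Hypothetical control-table layout: each data-quality rule is a SQL query
# returning the rows that VIOLATE the rule, plus an execution order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT, age INTEGER);
    INSERT INTO customers VALUES (1, 'a@x.com', 34), (2, NULL, -5);

    CREATE TABLE dq_rules (
        rule_name  TEXT,
        rule_sql   TEXT,
        exec_order INTEGER
    );
    INSERT INTO dq_rules VALUES
        ('email_not_null', 'SELECT id FROM customers WHERE email IS NULL', 1),
        ('age_positive',   'SELECT id FROM customers WHERE age < 0',       2);
""")

def run_rules(conn):
    """Execute each rule in exec_order and collect the violating row ids."""
    results = {}
    rules = conn.execute(
        "SELECT rule_name, rule_sql FROM dq_rules ORDER BY exec_order"
    ).fetchall()
    for name, sql in rules:
        results[name] = [row[0] for row in conn.execute(sql)]
    return results

print(run_rules(conn))
# prints {'email_not_null': [2], 'age_positive': [2]}
```

Adding or reordering rules then only requires changing rows in the control table, not redeploying code, which is the point of driving execution from metadata.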
Environment: Informatica Developer 9.1.0, Informatica PowerCenter 9.1.0, Informatica Analyst, Informatica AddressDoctor, Oracle 10g, SQL Server, UNIX, PL/SQL Developer, StarTeam.
Confidential, Piscataway, NJ
Informatica Consultant
Responsibilities:
- Studied the existing data warehousing system and gathered requirements for extending the data warehouse tables, such as adding new columns to critical DW tables.
- Created a separate design for implementing the new columns into the existing tables, as these columns are added for only one source system, INTERACT.
- Implemented the design as mappings separate from the actual mappings that load the data warehouse, and tested the developed code using HP Quality Center.
- Proposed a new design for the reconciliation process for the critical data warehouse tables against several source systems.
- Implemented several mappings to complete the reconciliation process using Informatica PowerCenter.
- Created complex PL/SQL stored procedures to track data mart status and update the information in the database.
- Scheduled all the reconciliation jobs in the Tidal scheduler and the extensions to the existing tables in the Maestro scheduler.
- Developed UNIX scripts to track data mart completion status based on the generated touch files, and created Cognos reports to send the mart status to business users.
- Developed several documents for the project as demanded by the compliance requirements of Confidential & Confidential Inc.
- Deployed all project jobs to production; monitored and supported the jobs during execution.
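The core idea of the reconciliation process above, comparing critical warehouse tables against their source systems and recording discrepancy details, can be sketched like this. The data, keys, and field names are invented for the example; the real work was done in PowerCenter mappings, not Python.

```python
# Toy source and target extracts, keyed by a hypothetical business key.
source = {
    101: {"name": "Alice", "balance": 250.0},
    102: {"name": "Bob",   "balance": 90.0},
    103: {"name": "Cara",  "balance": 40.0},
}
target = {
    101: {"name": "Alice", "balance": 250.0},
    102: {"name": "Bob",   "balance": 85.0},   # value drifted in target
}                                              # key 103 missing in target

def reconcile(source, target):
    """Return discrepancy records: keys missing in target and field mismatches."""
    issues = []
    for key, src_row in source.items():
        tgt_row = target.get(key)
        if tgt_row is None:
            issues.append({"key": key, "issue": "missing_in_target"})
            continue
        for field, src_val in src_row.items():
            if tgt_row.get(field) != src_val:
                issues.append({"key": key, "issue": "mismatch",
                               "field": field, "source": src_val,
                               "target": tgt_row.get(field)})
    return issues

for issue in reconcile(source, target):
    print(issue)
```

Persisting these discrepancy records (rather than just a pass/fail flag) is what lets users analyze where and why the source and target diverge.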
Environment: Informatica PowerCenter 8.6, UNIX, Oracle, Cognos Report Studio, Informatica PowerConnect, Pro*C, Mainframes, SAP, TOAD, PVCS, Maestro scheduler, Tidal scheduler, HP Quality Center.
Confidential
Informatica Consultant
Responsibilities:
- Involved in developing the ETL layer of the project using Informatica PowerCenter.
- Involved in designing the staging database into which all the SAP data is loaded.
- Implemented several Informatica ABAP mappings to extract data from SAP and load it into the staging layer, from where the data is picked up into the DDW layer.
- Involved in performance tuning of ABAP extract mappings involving complex join conditions and tables with large data volumes; successfully tuned the ABAP mappings that extract data from SAP through Informatica.
- Developed several mappings to load data from the staging layer to the DDW layer; the DDW layer is a complete image of the SAP data maintained in an Oracle database.
- Involved in developing the mappings that load data from the DDW layer to the Conform layer, where all the conformed dimensions and facts reside.
- Applied all the business transformations to the data at the DDW layer before loading it into the conformed schema.
- Involved extensively in design and coding for all phases of the project.
- Worked with transformations such as Application Source Qualifier, Lookup, Filter, Router, Update Strategy, Joiner, Expression, Aggregator, Sequence Generator, and Source Qualifier.
- Extensively used SQL queries to verify data from the source system against the target system while testing the mappings.
- Developed several documents for the project as demanded by the compliance requirements of Confidential & Confidential Inc.
- Worked closely with the database administrator to investigate performance issues caused by tables with large data volumes.
Environment: Informatica PowerCenter 8.1, UNIX, SAP ECC, ABAP extracts, Oracle 10g, Informatica PowerConnect, TOAD, PVCS, Maestro scheduler, Tidal scheduler, HP Quality Center.
Confidential
ETL Developer
Responsibilities:
- Participated in meetings with business users to understand the requirements for building the Reject subject area, and built the required data model, including summary tables.
- Gathered IT requirements from the business users, prepared the systems design and architecture, and built test cases to ensure a successful product delivery that met the deadline.
- Involved in creating the logical and physical data models of the database tables required for the Reject subject area; designed the data warehouse tables, defined their granularity, and defined the update strategy.
- Used Informatica as the ETL tool to move data from source systems to target systems.
- Developed a wide variety of Informatica mappings, from simple to extremely complex, to satisfy the requirements.
- Implemented Type 1 and Type 2 Slowly Changing Dimensions to overwrite or preserve record history, reflecting the business requirements.
- Developed PL/SQL scripts per the IT requirements.
- Worked with transformations such as Lookup, Filter, Router, Update Strategy, Joiner, Expression, Aggregator, Sequence Generator, and Source Qualifier.
- Developed mappings, sessions, and workflows using Informatica PowerCenter Designer and Workflow Manager.
- Involved in performance tuning of Informatica sessions, mappings, and Source Qualifier queries.
- Created documentation for the mappings and workflows developed for the ETL loads.
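The Type 2 Slowly Changing Dimension behavior mentioned above (preserving history by versioning rows) can be sketched as follows. This is an illustrative toy, with made-up column names, of the logic the Update Strategy transformation would apply: when a tracked attribute changes, the current row is expired and a new current version is inserted.

```python
from datetime import date

# A tiny dimension table with one current row; column names are hypothetical.
dim = [
    {"cust_id": 7, "city": "Denver", "eff_from": date(2020, 1, 1),
     "eff_to": None, "current": True},
]

def apply_scd2(dim, cust_id, new_city, as_of):
    """Type 2 SCD: expire the current row if the attribute changed,
    then append a new current version, preserving history."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"]:
            if row["city"] == new_city:
                return dim  # no change, nothing to do
            row["eff_to"] = as_of       # close out the old version
            row["current"] = False
    dim.append({"cust_id": cust_id, "city": new_city,
                "eff_from": as_of, "eff_to": None, "current": True})
    return dim

apply_scd2(dim, 7, "Boulder", date(2023, 6, 1))
for row in dim:
    print(row)
# Two versions remain: the expired Denver row and the current Boulder row.
```

A Type 1 dimension would instead overwrite `city` in place, keeping a single row and no history, which is the trade-off the two SCD types represent.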
Environment: Informatica PowerCenter 7.1, Oracle 9i, UNIX, Cognos, CVS, TOAD, and Windows XP.
Confidential
ETL Developer
Responsibilities:
- Participated in meetings with business users to understand the various loading aspects required for the data warehouse.
- Gathered IT requirements from the business users, prepared the systems design and architecture, and built test cases to ensure a successful product delivery that met the deadline.
- Used Informatica as the ETL tool to move data from source systems to target systems.
- Developed a wide variety of Informatica mappings, from simple to extremely complex, to satisfy the requirements.
- Implemented Type 1 and Type 2 Slowly Changing Dimensions to overwrite or preserve record history, reflecting the business requirements.
- Developed PL/SQL scripts per the IT requirements.
- Worked with transformations such as Lookup, Filter, Router, Update Strategy, Joiner, Expression, Aggregator, Sequence Generator, and Source Qualifier.
- Developed mappings, sessions, and workflows using Informatica PowerCenter Designer and Workflow Manager.
- Involved in performance tuning of Informatica sessions, mappings, and Source Qualifier queries.
- Created documentation for the mappings and workflows developed for the ETL loads.
Environment: Informatica PowerCenter 7.1, Oracle 9i, UNIX, Cognos, CVS, TOAD, and Windows XP.