
Informatica Power Center/Data Quality Lead Resume


Harrisburg

PROFESSIONAL SUMMARY:

  • 10+ years of IT experience in the analysis, design, development, testing, implementation, and support of Data Quality, Data Integration, and Data Migration projects using Informatica Power Center and Informatica Data Quality (IDQ).
  • Experienced in working with various relational databases, including Oracle, DB2, Netezza, SQL Server, and Teradata.
  • Experienced in a mainframe-to-Oracle migration project using Power Center and IDQ.
  • Extensive experience in ETL (Extraction, Transformation and Loading) of data using Informatica Power Center from heterogeneous sources such as flat files (fixed-width and delimited), XML files, and relational databases.
  • Extensive experience in the Banking and Finance, Transportation, Insurance, and Retirement business domains.
  • Extensive knowledge of the SDLC (Software Development Life Cycle); executed projects in both Waterfall and Agile methodologies.
  • Strong understanding of Dimensional Modeling, Star and Snowflake schemas, OLAP, and data warehousing concepts.
  • Worked closely with data stewards to gather data quality project requirements.
  • Experienced in profiling data sources and creating scorecards to monitor data quality.
  • Created DQ mappings to validate customer addresses using the Address Validator transformation.
  • Created DQ mappings to handle duplicate, bad, and golden records using IDQ transformations.
  • Experienced in implementing standardization and cleansing activities using IDQ transformations.
  • Experienced in using the Parser, Labeler, Key Generator, Standardizer, Address Validator, Match, Consolidation, and Bad/Duplicate Record Exception data quality transformations.
  • Deployed IDQ objects to integrate with Power Center.
  • Experienced in understanding project scope, dependencies, and risks, and in providing estimations.
  • Experienced in coordinating and leading an onsite-offshore development model.
  • Coordinated, guided, and led teams effectively through technology migrations from DB2 to Netezza and from Informatica 8.6 to 9.1.
  • Experienced with command utilities such as NZLOAD, NZSQL, DB2 LOAD, DB2 EXPORT, Teradata FastExport, and SQL*Loader to query and populate DB2, Netezza, Teradata, and Oracle databases (see the bulk-load sketch after this list).
  • Extensively involved in solution approaches and design walk-throughs with Portfolio Architects and the Technical Delivery Director.
  • Experienced in setting up shared Informatica infrastructure on UNIX/Windows platforms in coordination with the infrastructure partner team.
  • Coordinated and led interactions across Portfolio Architects, Enterprise Data Modeling Architects, DBAs, and Business teams to finalize business requirements, ETL architecture, and data models.
  • Extensive knowledge of RDBMS concepts and relational and dimensional data modeling.
  • Extensively involved in writing UNIX shell scripts for source file validation, file upload/download to/from remote servers, source/target file archival/purge requirements, and kicking off ETL batch jobs from UNIX with the PMCMD command (a minimal sketch follows this list).
  • Experienced in scheduling ETL jobs using Control-M.
  • Independently performed complex troubleshooting and root-cause analysis and provided solutions to fix defects.
  • Experienced in writing SQL queries with OLAP (window) functions to improve query performance (see the nzsql example after this list).
  • Experienced in creating and using stored procedures, functions, views, and materialized views.
  • Expertise in analyzing source data systems to better understand business requirements and to design an optimal data model and ETL design.
  • Developed technical specifications for the ETL process flow, High-Level and Low-Level Design documents, and Source-to-Target mapping documents.
  • Designed and developed Audit, Control, and Balancing processes.
  • Created complex mappings using transformations such as Source Qualifier, Expression, Aggregator, Joiner, Lookup, Normalizer, SQL, Union, Update Strategy, Stored Procedure, Filter, Router, XML Generator and Parser, Java, HTTP, and Web Services Consumer.
  • Strong experience in handling Slowly Changing Dimensions.
  • Experience with Informatica advanced techniques: pushdown optimization and session partitioning.
  • Extensively involved in creating reusable objects to standardize Informatica code and bring down development and testing effort.
  • Expertise in developing flexible ETL code, using parameterization to simplify maintenance and future enhancements.
  • Extensively involved in optimization and tuning of Informatica mappings and sessions by identifying and eliminating bottlenecks at the source, transformation, and target layers.
  • Involved end to end, from project engagement through production warranty support.
  • Prepared project support handover documents and transitioned to the Support team to ensure smooth execution of the project in production.
  • Generated warning and error reports and shared them with business users for further analysis.
  • Supported and guided testing and business users during SIT and UAT, and obtained sign-off from the SIT and UAT teams.
  • Extensively involved in generating automated alerts to business users via IMR and email notifications per business requirements.
  • Extensively involved in migrating Informatica objects to Test and Production environments by creating deployment groups.
  • Able to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities; flexible in work schedule; good communication skills.
  • Team player; motivated; able to grasp things quickly, with analytical and problem-solving skills.
  • Strong technical, oral, and written communication skills.
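
For illustration, below is a minimal sketch of the kind of UNIX wrapper script described above: validate the day's source file, archive a copy, then kick off an Informatica workflow with PMCMD. All paths and the domain, integration service, folder, and workflow names are placeholders, not actual project objects.

```sh
#!/bin/ksh
# Illustrative wrapper: validate the source file, archive it, then
# start an Informatica workflow with pmcmd.
# Paths, service, domain, folder, and workflow names are placeholders.

SRC_DIR=/data/inbound
ARCH_DIR=/data/archive
SRC_FILE="$SRC_DIR/customer_feed.dat"

# Source file validation: fail fast if the file is missing or empty
if [[ ! -s "$SRC_FILE" ]]; then
    echo "ERROR: source file missing or empty: $SRC_FILE" >&2
    exit 1
fi

# Archive a timestamped copy before processing (purge handled elsewhere)
cp "$SRC_FILE" "$ARCH_DIR/customer_feed.$(date +%Y%m%d%H%M%S).dat"

# Start the workflow and wait; pmcmd returns non-zero on failure
pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FLD_CUSTOMER -wait wf_load_customer
rc=$?
if [[ $rc -ne 0 ]]; then
    echo "ERROR: wf_load_customer failed with return code $rc" >&2
    exit "$rc"
fi
echo "wf_load_customer completed successfully"
```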
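
Similarly, a hedged sketch of the Netezza command-line utilities and an OLAP (window) function query of the kind referenced above; the host, database, credentials, table, and column names are invented for illustration.

```sh
#!/bin/ksh
# Illustrative Netezza command-line usage; host, database, credentials,
# table, and column names are placeholders for this sketch.

# Bulk-load a pipe-delimited flat file into a staging table with nzload
nzload -host nzhost -db EDW -u "$NZ_USER" -pw "$NZ_PWD" \
       -t STG_CUSTOMER -df /data/inbound/customer_feed.dat -delim '|'

# OLAP (window) function via nzsql: keep the latest row per customer
# in a single pass instead of a correlated subquery or self-join
nzsql -host nzhost -d EDW -u "$NZ_USER" -pw "$NZ_PWD" <<'EOF'
SELECT customer_id, load_ts, status
FROM (
    SELECT customer_id, load_ts, status,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY load_ts DESC) AS rn
    FROM STG_CUSTOMER
) latest
WHERE rn = 1;
EOF
```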

TECHNICAL SKILLS:

ETL: Informatica Power Center 9.x/8.x, Informatica Data Quality (IDQ) 9.6, Source Analyzer, Target Designer, Transformation Developer, Mapping Designer, Mapplet Designer, Workflow Monitor, Workflow Manager, Informatica Developer, Analyst Tool

Data modeling tools: Erwin 7.x, Microsoft Visio

Operating Systems: Windows, Linux, UNIX (IBM AIX)

Databases: Oracle, IBM DB2, Netezza, SQL Server, Teradata

Languages: SQL, PL/SQL, C, C++, Unix Shell Scripting

Tools: TOAD, SQL Developer, Aginity Workbench, Rally, HPQC, ServiceNow

PROFESSIONAL EXPERIENCE:

Confidential, Harrisburg

Informatica Power Center/Data Quality Lead

Responsibilities:

  • Worked closely with business teams to understand the business and data quality requirements, and prepared data quality rules and ETL Source-to-Target mapping documents.
  • Interpreted copybooks for IMS DB segments and prepared field start/end position documents for each source.
  • Designed and delivered an Exception and Audit framework to capture exception records and perform record reconciliation.
  • Provided estimations for ETL tasks.
  • Profiled data to understand it and identify anomalies, and reported findings to the business to obtain cleansing rules.
  • Initiated and led the discussions with data modelers to design the data model and to create database tables.
  • Involved in creating technical design documents and design walk-throughs with data architects.
  • Used the SQL*Loader connection to process large volumes of records and bring down job run times (a minimal sketch follows this list).
  • Developed Informatica mappings using Source Qualifier, Expression, Joiner, Lookup, SQL, Router, Union, Normalizer, Java, and Stored Procedure transformations.
  • Provided Informatica guidance to the team in performing their tasks.
  • Generated error and warning reports to help business teams further analyze the data.
  • Tuned Informatica mappings to increase throughput using various performance tuning techniques.
  • Created reusable objects/mapplets to standardize Informatica code and bring down development and testing effort.
  • Created stored procedures, functions, and views as part of the business logic implementation.
  • Reviewed code developed by team members and provided feedback.
  • Involved in test data preparation for the SIT and UAT phases.
  • Deployed the Informatica code to test and production environments.
  • Participated in daily stand-up calls and kick-off meetings at the beginning of each iteration.
  • Participated in Show and Tell and Retrospective meetings.
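
For illustration, a minimal sketch of a SQL*Loader direct-path load of the kind used above to cut job run times; the control file contents, table, columns, and connect string are placeholders.

```sh
#!/bin/ksh
# Illustrative SQL*Loader direct-path load for high-volume files.
# The control file, table, columns, and connect string are placeholders.

# Generate a minimal control file for a pipe-delimited feed
cat > customer.ctl <<'EOF'
LOAD DATA
INFILE '/data/inbound/customer_feed.dat'
APPEND INTO TABLE STG_CUSTOMER
FIELDS TERMINATED BY '|'
(customer_id, customer_name, city, state)
EOF

# direct=true uses the direct path API, which is what cuts the run time
sqlldr "$ORA_USER/$ORA_PWD@ORADEV" control=customer.ctl \
       log=customer.log bad=customer.bad direct=true
```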

Environment: Informatica Power Center 10.1, Informatica Data Quality 10.1, Oracle 11g, SQL Developer, Windows

Confidential, Columbus

Sr. Informatica Developer

Responsibilities:

  • Worked closely with business teams to understand the requirements and prepared ETL Source-to-Target mapping documents.
  • Initiated and led discussions with data modelers to design the data model and create database tables.
  • Involved in creating technical design documents and design walk-throughs with data architects.
  • Developed Informatica mappings using Source Qualifier, Expression, Joiner, Lookup, SQL, Router, Union, Normalizer, and Stored Procedure transformations.
  • Provided Informatica guidance to the team as and when necessary.
  • Generated error and warning reports to help business teams further analyze the data.
  • Tuned Informatica mappings to increase throughput using various performance tuning techniques.
  • Created reusable objects/mapplets to standardize Informatica code and bring down development and testing effort.
  • Created stored procedures, functions, and views as part of the business logic implementation.
  • Reviewed code developed by team members and provided feedback.
  • Involved in test data preparation for the SIT and UAT phases.
  • Deployed the Informatica code to test and production environments.
  • Provided estimations for ETL tasks.
  • Participated in daily stand-up calls and kick-off meetings at the beginning of each iteration.
  • Participated in Show and Tell and Retrospective meetings.
  • Worked with the Scheduling/Operations teams to schedule the ETL jobs on the ESP tool.

Environment: Informatica Power Center 9.6, Oracle 11g, Teradata 14, Linux, Perl, TOAD, GitHub, PuTTY, Teradata SQL Assistant.

Confidential, Columbus

Sr. Informatica/IDQ Developer

Responsibilities:

  • Worked closely with data stewards to understand the data quality, data integration, profiling rule, and scorecard requirements.
  • Profiled data sets to understand the structure and quality of the contents across different dimensions, and shared the profile results with the data steward to help define business rules that keep data quality at an acceptable level.
  • Created mapplets in IDQ to implement standardization and cleansing rules.
  • Created rules in IDQ to apply to various profiles.
  • Created scorecards to monitor the quality of the target systems periodically.
  • Handled bad/duplicate records and shared them with the data steward using a Human task.
  • Created DQ mappings to identify golden records and validate addresses.
  • Created IDQ objects using the Parser, Labeler, Expression, Standardizer, Key Generator, Match, Consolidation, and Bad/Duplicate Record Exception transformations.
  • Deployed IDQ objects to integrate with Power Center.
  • Created Power Center mappings to further enrich the data by integrating it with various master data sources.
  • Performed various data validation, cleansing, and standardization activities, integrated the data with MDM tables, and transformed it as per key requirements.
  • Generated flat files (fixed-width and delimited) from the database and provided them to external teams.
  • Participated in daily stand-up calls and kick-off meetings at the beginning of each iteration.
  • Participated in Show and Tell and Retrospective sessions for each release.
  • Promoted Informatica code from development to test and production environments.

Environment: Informatica Power Center 9.6, IDQ 9.6, SQL Server 2014, Windows

Confidential, Salt Lake City

Tech Lead/Sr. ETL Developer

Responsibilities:

  • Coordinated and led interactions across Portfolio Architects, Enterprise Data Modeling Architects, DBA, and Business teams to finalize business requirements, ETL architecture, and data modeling, and translated the requirements into technical specifications.
  • Worked on technical documentation, including High-Level Design documents, Low-Level Design documents, and Source-to-Target mapping sheets.
  • Involved in logical and physical data modeling and prepared source data volumetric information with the DBA to plan indexes, tablespaces, hashing, and partition keys.
  • Prepared and reviewed unit test cases, executed them in QC, and documented the results.
  • Managed and guided the team in the design and development of ETL artifacts.
  • Reviewed code developed by team members and provided feedback.
  • Involved in test data preparation for the SIT and UAT phases.
  • Participated in reviews with customers to obtain sign-off on design and mapping documents.
  • Involved in technical support and troubleshooting of the data warehouse application to meet Service Level Agreements (SLAs).
  • Handled CRs (Change Requests) and enhancements for the existing application and followed the change management process.
  • Created Informatica mappings using Source Qualifier, connected/unconnected Lookup, Joiner, Sorter, Aggregator, Union, and Router transformations to extract, transform, and load data into the Staging, ODS, data warehouse, and finally data mart areas to enable reporting capabilities.
  • Created persistent lookup caches to avoid building the same cache multiple times wherever applicable, thereby improving ETL job performance.
  • Created reusable transformations/mapplets and used them across various mappings.
  • Created reusable user-defined functions to perform business validations and used them across various projects within the portfolio.
  • Worked on Slowly Changing Dimensions as per business requirement.
  • Designed and developed the Audit, Balancing, and Control process.
  • Worked on performance tuning of sources, targets, mappings, transformations, and sessions by implementing techniques such as partitioning and pushdown optimization, and by identifying performance bottlenecks from session logs.
  • Implemented Informatica concurrent workflow execution to improve performance.
  • Wrote UNIX Korn shell scripts for file transfer, archiving, and email notifications (a minimal sketch follows this list).
  • Tuned SQL queries in the Source Qualifier and Lookup transformations to cache only the required volume of data, avoiding unwanted data wherever possible to minimize data transfer over the network.
  • Guided and supported teams while Assembly/Integration Test, QA, and UAT were in progress.
  • Performed data validation and error analysis and provided resolutions/feedback to data experts so the data could be corrected.
  • Provided knowledge transfer and prepared KT documents for Production Support for ongoing maintenance of the code.
  • Worked with the Scheduling/Operations teams to schedule the ETL jobs on the Control-M tool.
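
A minimal sketch of the kind of Korn shell utility script described above, covering file transfer, archival/purge, and email notification; the host names, paths, and mailing address are placeholders.

```sh
#!/bin/ksh
# Illustrative ksh utility: pull a file from a remote server, manage
# the archive, and send an email notification.
# Host names, paths, and the mailing address are placeholders.

REMOTE=etluser@remotehost
LANDING=/data/landing
ARCHIVE=/data/archive

# Pull today's extract via sftp in batch mode (commands read from stdin)
sftp -b - "$REMOTE" <<EOF
get /outbound/daily_extract.dat $LANDING/daily_extract.dat
EOF

# Archival/purge: compress copies older than 7 days, delete after 30
find "$ARCHIVE" -name 'daily_extract.*' ! -name '*.gz' -mtime +7 -exec gzip {} \;
find "$ARCHIVE" -name 'daily_extract.*.gz' -mtime +30 -exec rm -f {} \;

# Email notification on arrival
echo "daily_extract.dat landed at $(date)" |
    mailx -s "Daily extract received" etl.support@example.com
```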

Environment: Informatica 9.6, IBM Netezza, Rally, HPQC, UNIX, Aginity Workbench, PuTTY.

Confidential, Phoenix

Sr. ETL Developer

Responsibilities:

  • Involved in envisioning-phase calls with business users, SMEs, and source teams to understand the requirements and identify the sources.
  • Involved in requirements mapping from source to target, design, and ETL process flow diagrams.
  • Engaged various teams, including the data modeling, database, shared infrastructure, and testing teams, to kick-start the development activities.
  • Initiated and led discussions with the Enterprise Data Modeling and DBA teams to design the data model and database.
  • Involved in logical and physical data modeling and prepared source data volumetric information with the DBA to plan indexes, tablespaces, hashing, and partition keys.
  • Prepared test scenarios and test cases in HP Quality Center and was involved in unit testing of mappings, system testing, and user acceptance testing.
  • Appeared before the PRB (Project Review Board) at each phase of the project for a project health check and to obtain approvals from PFAs and sign-off from the Tech Director before commencing the next phase.
  • Performed various data validation and cleansing activities before loading data into the STAGE layer, performed standardization and integration with conformed dimensions in the data warehouse, and transformed data per reporting requirements before loading it into the DATA MART layer.
  • Created mappings using Source Qualifier, Expression, Lookup, Router, Joiner, Aggregator, Union, Update Strategy, SQL, and Java transformations.
  • Implemented and maintained Slowly Changing Dimensions as per business requirements.
  • Tuned SQL queries in the Source Qualifier and Lookup transformations to cache only the required volume of data, avoiding unwanted data wherever possible to minimize data transfer over the network.
  • Used the persistent lookup cache mechanism to share caches across sessions during batch processing, reducing overall system resource usage and job run times.
  • Involved in the historical, daily, and catch-up loads once the project went live.
  • Generated outbound flat files (delimited and fixed-width) and XML files to share with downstream application teams.
  • Exported and imported data using DB2 and Netezza command utilities such as LOAD, IMPORT, NZSQL, and NZLOAD (a minimal sketch follows this list).
  • Wrote complex SQL queries to address data purging requirement.
  • Performed data validation and error analysis and provided resolutions/feedback to data experts so the data could be corrected.
  • Provided knowledge transfer and prepared KT documents for Production Support for ongoing maintenance of the code.
  • Worked with the Scheduling/Operations teams to schedule the ETL jobs on the Control-M tool.
  • Wrote shell scripts to transfer source files from remote servers, validate source files, meet archive/purge requirements, and kick off ETL batch jobs from the UNIX environment.
  • Participated in daily stand-up calls and kick-off meetings at the beginning of each release.
  • Participated in Show and Tell and Retrospective session for each release.
  • Supported in test data preparation for the SIT, UAT and fixed identified defects by the teams.
  • Created RFCs and deployment groups for code migration to the Development and Production environments.
  • Monitored jobs during the warranty phase and handed over to the Support team with a support handover document and KT sessions.
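
For illustration, a hedged sketch of the DB2 command-line export/load pattern referenced above; the database, schema, tables, and file names are invented for this sketch.

```sh
#!/bin/ksh
# Illustrative DB2 command-line export/load; the database, schema,
# tables, and file names are placeholders.

db2 connect to EDWDB user "$DB2_USER" using "$DB2_PWD"

# Export a table to a pipe-delimited flat file
db2 "EXPORT TO /data/outbound/policy.del OF DEL MODIFIED BY COLDEL| \
     SELECT policy_id, holder_name, premium FROM PRD.POLICY"

# Bulk-load the same layout into a staging table
db2 "LOAD FROM /data/inbound/policy.del OF DEL MODIFIED BY COLDEL| \
     INSERT INTO STG.POLICY"

db2 connect reset
```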

Environment: Informatica Power Center 8.6/9.1, IBM DB2, IBM Netezza, HPQC, Rally, UNIX, TOAD, Aginity Workbench, PuTTY

Confidential, Chicago

ETL Developer

Responsibilities:

  • Involved in the development, implementation, and maintenance of the project.
  • Involved in business user meetings to understand the requirements.
  • Involved in low-level design and preparation of source-to-target mapping documents.
  • Involved in logical and physical data modeling.
  • Imported source and target structures from the database into the Informatica project shared folder.
  • Involved in preparing unit and assembly test cases and uploading and executing them in QC.
  • Performed unit testing and assembly testing before code migration to the testing environment.
  • Actively participated in the SIT and UAT phases, supporting the teams in their daily tasks.
  • Involved in the ETL process of extracting data from files and databases and loading it into the target database.
  • Developed various mappings using Source Qualifier, Aggregator, Lookup, Filter, Router, Joiner, Expression, Stored Procedure, Sorter, Union and Sequence Generator transformations.
  • Created complex mappings which involved Slowly Changing Dimensions.
  • Worked extensively with connected Lookup transformations using dynamic cache.
  • Involved in preparing migration and implementation documents and handover documents.
  • Monitored workflows and sessions in Workflow Monitor during the warranty period, and reported and resolved identified issues.
  • Involved in preparing configuration files and parameter files.

Environment: Informatica 9.1, Oracle 11g, HPQC, UNIX, SQL Developer, PuTTY.
