
Lead Informatica Data Quality Developer Resume


Birmingham, AL

SUMMARY:

  • 6+ years of experience in Data warehousing using Informatica as an ETL tool.
  • Exposure to multiple domains, including insurance, banking, retail, and finance.
  • Conversant with all phases of the Software Development Life Cycle (SDLC), including system analysis, design, development, and implementation.
  • Good at Relational Database Management System (RDBMS) concepts.
  • Wide range of experience in software development, project management, data integration, Master Data Management, and quality assurance.
  • Experience in data extraction, transformation, and loading using Informatica PowerCenter 9.6/9.5.1/9.1.1/8.x/7.x.
  • Extensive experience in Master Data Management (MDM) 10.0 and Data Quality (IDQ) 9.6.1.
  • Worked with Repository Manager, Workflow Manager, Workflow Monitor, the Designer tools (Mapping Designer, Transformation Developer, Mapplet Designer, Source Analyzer, and Target Designer), the Admin Console, and data administration.
  • Experience in designing complex mappings using Source Qualifier, Update Strategy, Expression, Sequence Generator, Aggregator, Filter, Joiner, Java, Router, Connected/Unconnected Lookup, Normalizer, Rank, XML, Sorter, Transaction Control, and Union transformations to load different target types from sources such as flat files (delimited and fixed-width text or CSV files), XML, Oracle, IBM mainframe COBOL files, SQL Server, DB2, Teradata, and Netezza.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Designed and developed cleansing and standardization scenarios using the IDQ cleanse adapter.
  • Knowledge of implementing hierarchies, relationship types, packages, and profiles for hierarchy management in MDM Hub implementations.
  • Worked on source data exception handling for data quality, data profiling, data cleansing, and data validation using IDQ.
  • Experience in managing Slowly Changing Dimensions (SCD) Type 1, Type 2 and Type 3, Data Marts, OLAP, OLTP and Change Data Capture (CDC).
  • Worked in Workflow Manager, creating sessions, workflows, and worklets, with reusable tasks (Session, Command, Email) and non-reusable tasks (Decision, Event Wait, Event Raise).
  • Good at using pre-SQL and post-SQL at the session level in Informatica.
  • Strong experience with the Informatica Data Quality (IDQ) tool: created IDQ mapplets for address, telephone, and SSN cleansing, and worked with most IDQ transformations (Standardizer, Parser, Merge, etc.) within mapplets.
  • Experience in exporting objects from IDQ to PowerCenter and using them in PowerCenter mappings and mapplets.
  • Knowledge of data cleansing, data staging, data profiling, error handling, session and workflow log files, and performance optimizations such as Pushdown Optimization (PDO) and session partitioning to troubleshoot bottlenecks at the source, target, mapping, session, and system levels.
  • Worked on different databases like Teradata, Oracle, DB2, Sybase, SQL Server, MS Access.
  • Good knowledge of data warehousing and data modeling using ERwin: staging tables, stored procedures, functions, cursors, dimension tables, fact tables, surrogate keys, primary keys, foreign keys, star schema, snowflake schema, triggers, and normalization/denormalization.
  • Experienced in performance tuning like creating indexes at database level.
  • Experience in Scheduling Tools like Autosys, Control-M, and Workload manager.
  • Worked with versioning tools such as ClearCase, SVN, and PC-based version control.
  • Good knowledge of Teradata utilities such as MultiLoad, FastLoad, TPump, FastExport, and BTEQ scripts.
  • Good Knowledge of writing PL/SQL procedures.
  • Experience in using Oracle Development tools such as Tool for Oracle Application Development (TOAD).
  • Good Knowledge on Oracle utilities like SQL Loader.
  • Hands-on experience writing simple and complex SQL queries using subqueries and multiple table joins (left, right, and inner).
  • Experience in UNIX Shell Scripting and batch scripting for parsing files.
  • Experience in using the Informatica command-line utilities pmcmd and pmrep to execute workflows (a minimal sketch follows this list).
  • Good verbal and written communication skills; hardworking, self-motivated, able to work independently or cooperatively in a team, eager to learn, and quick to grasp new concepts.
  • Good experience in ETL technical documentation.
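
A minimal sketch of the pmcmd usage referenced above; the domain, service, credentials, folder, and workflow names are all hypothetical placeholders:

#!/bin/sh
# Start a PowerCenter workflow via pmcmd and propagate its exit status.
# All names below are hypothetical placeholders.
: "${INFA_PWD:?set INFA_PWD in the environment}"  # avoid hard-coding passwords

pmcmd startworkflow \
  -sv IS_ETL_PROD -d Domain_ETL \
  -u etl_user -p "$INFA_PWD" \
  -f SALES_DW -wait wf_load_customer_dim
rc=$?

# A non-zero exit status means the workflow failed; surface it to the scheduler.
if [ $rc -ne 0 ]; then
  echo "wf_load_customer_dim failed with exit code $rc" >&2
fi
exit $rc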

TECHNICAL SKILLS:

O/S: Windows NT/XP/98/2000, Unix, Linux

ETL Tools: Informatica PowerCenter 9.6/9.0.1/8.6/7.1.1/6.x, Informatica Data Quality 8.6, Informatica Data Explorer 8.6, Informatica Developer 9.0.1, Informatica Analyst 9.0.1, Informatica PowerExchange 8.6, Informatica MDM Hub Console 9.x, Informatica Mapping Architect for Visio 9.0.1, Informatica 9.0.1 Administration, PowerCenter Web Services, Informatica Cloud, SSIS, Pentaho

Databases, Languages & Tools: Oracle 11g/10g/9i/8i, SQL, PL/SQL, SQL Server 2008 R2/2012, MS Access, InfoSphere, Teradata, BTEQ, UNIX shell scripting, GoldenGate, Oracle Streams

Tools and Utilities: Toad, SQL Developer, PuTTY, Samba, PVCS, SVN, UC4, Autosys, HP Quality Center 11, IBM RTC, Axway, WLM, UDO, SAP

Methodologies: Star Schema, Snowflake Schema

Reporting Tools: Cognos, Business Objects

PROFESSIONAL EXPERIENCE:

Confidential, Birmingham, AL

Lead Informatica Data Quality Developer

Responsibilities:

  • Interacted with business analysts to understand the requirements and translated them into appropriate technical requirement documents.
  • Coordinated with the client team on a daily basis for issues and status updates.
  • Conducted technical design presentations for the client and obtained sign-off.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Designed mappings using transformations such as Lookup (connected, unconnected, dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank, and Aggregator in the PowerCenter Designer.
  • Used Informatica Data Quality (IDQ) as the data cleansing tool, creating IDQ mapplets to correct the data before loading it into the target tables.
  • Created IDQ mapplets for address, telephone, and SSN cleansing and used them as mapplets in PowerCenter.
  • Wrote complex Teradata queries to transform data at the database end for use as SQL overrides in Source Qualifier transformations; also wrote stored procedures as required.
  • Created profiles on the Source table to find the Data anomalies using IDQ.
  • Analyzed the different data sources supplying the contract and billing data, including Oracle, flat files (delimited and fixed-width text), and XML files; analyzed the OLTP sources to understand their relationships and loaded the data into the Teradata warehouse.
  • Worked on Slowly Changing Dimension (SCD) Type 1 and Type 2 to maintain the full history of customers.
  • Responsible for performance optimization: wrote SQL overrides instead of extra transformations, placed active transformations such as Filter as early as possible in the mapping, and selected sorted input when using Aggregator or Joiner transformations.
  • Implemented performance tuning at all levels like source level, target level, mapping level and session level.
  • Created and configured workflows, worklets and sessions using Informatica Workflow Manager.
  • Created command tasks at the workflow level to gather flat files of the same structure for indirect file loading and, at the end of the run, to move all processed flat files to an archive directory (a sketch follows this list).
  • Used workflow level variables for the reusability of the code.
  • Used mapping parameters and variables to facilitate reusability of the code.
  • Worked with SQL sub queries and joins between multiple tables.
  • Worked on designing the error-handling process in ETL.
  • Created reusable tasks at workflow level.
  • Created reusable transformations at mapping level in power center designer.
  • Worked with version control tool using SVN.
  • Created procedures and packages based on BA requirements.
  • Worked on simple and complex queries based on business requirements.
  • Created Autosys JIL scripts based on data-load requirements.
  • Created DML scripts based on requirements.
  • Scheduled jobs using Autosys.
  • Worked with Oracle tools such as Toad.
  • Worked with Teradata utilities such as FastLoad and TPump (see the FastLoad sketch after the environment line below).
  • Coordinated system, integration, and UAT testing with the other teams involved in the project and reviewed the test strategy.
  • Worked closely with the MicroStrategy reporting team and helped them source the data needed for reports.
  • Performed code promotion from development to production.
  • Worked within an agile methodology.
  • Prepared an ETL technical document maintaining the naming standards.
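
A sketch of the indirect file loading command tasks referenced above; directory paths and file patterns are hypothetical placeholders:

#!/bin/sh
# Pre-/post-session command-task logic for indirect file loading.
# Paths and patterns are hypothetical placeholders.
SRC_DIR=/data/inbound/billing
ARC_DIR=/data/archive/billing
LIST_FILE=/data/inbound/billing_filelist.txt

# Pre-session: build the file list the session reads as an indirect source.
ls "$SRC_DIR"/billing_*.dat > "$LIST_FILE"

# Post-session: move the processed flat files to the archive directory.
for f in "$SRC_DIR"/billing_*.dat; do
  [ -e "$f" ] && mv "$f" "$ARC_DIR"/
done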

Environment: Informatica PowerCenter 9.6, Data Quality 9.6, Oracle 11g, Cognos, Netezza, Web Services, Teradata v13, Teradata utilities, Windows Server 2012.
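
A sketch of a Teradata FastLoad step like those referenced above; the host, credentials, file, and table names are hypothetical, and FastLoad requires an empty target table:

#!/bin/sh
# Bulk-load a comma-delimited flat file into an empty Teradata stage table.
# Host, credentials, and object names are hypothetical placeholders.
fastload <<'FL'
LOGON tdprod/etl_user,etl_pwd;
DATABASE stage_db;
SET RECORD VARTEXT ",";
BEGIN LOADING stage_db.billing_stg
  ERRORFILES stage_db.billing_err1, stage_db.billing_err2;
DEFINE account_id (VARCHAR(18)),
       bill_date  (VARCHAR(10)),
       amount     (VARCHAR(12))
  FILE = billing_feed.csv;
INSERT INTO stage_db.billing_stg (account_id, bill_date, amount)
  VALUES (:account_id, :bill_date, :amount);
END LOADING;
LOGOFF;
FL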

Confidential, Jefferson City, MO

Informatica Developer

Responsibilities:

  • Worked with business analysts to understand the requirements of the project.
  • Prepared design documents for customer review.
  • Handled team management and administration, including resource planning and administrative activities, in compliance with Oracle policies and procedures.
  • Coordinated with the Informatica administrator to set up the environment and to move objects between repositories and folders.
  • Worked on Informatica Power Center tools - Source Analyzer, Target designer, Mapping Designer, Workflow Manager, Workflow monitor, Mapplet Designer and Transformation Developer.
  • Profiled the data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).
  • Created the complex mappings, sessions and the workflows as per the Functional and Technical Specifications.
  • Extracted data from flat files (delimited and fixed-width CSV or text files), XML files, Oracle 11g, SQL Server, IBM mainframe, and Sybase, and loaded it into the Oracle data warehouse.
  • Worked extensively with transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Aggregator, XML, Java, Update Strategy, Lookup, Normalizer, and Stored Procedure.
  • Participated in business analysis and design discussions with customers and internal SMEs.
  • Implemented Slowly Changing Dimensions methodology to keep track of historical data.
  • Implemented Change Data Capture (CDC).
  • Designed complex mappings using connected, unconnected, and dynamic Lookups on source- and target-side tables, along with Filter, Expression, Aggregator, Normalizer, Joiner, and Router transformations to populate target tables.
  • Independently managed an implementation track/module to ensure timely, defect-free delivery of the solution.
  • Applied relational and dimensional data modeling techniques to design data models using ERwin.
  • Used Informatica Pushdown Optimization (PDO) for performance tuning, pushing transformation logic down to the target database to exploit the database's native performance.
  • Worked with the Ralph Kimball data warehousing methodology using a snowflake schema.
  • Worked with different kinds of Code migration like XML import and XML export.
  • Worked with PL/SQL procedures, functions, cursors, and triggers (a minimal sketch follows this list).
  • Worked on ETL design, development, administration, source-to-target mappings, data warehouse transformations, on-call support, troubleshooting, and documentation.
  • Implemented mapping parameters and variables for the reusability of the code.
  • Involved in Performance Tuning at various levels including Target, Source, Mapping and Session for large data files.
  • Worked on Oracle utilities, scheduler tools and versioning tools.
  • Worked within a waterfall methodology.
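
A minimal sketch of the PL/SQL work referenced above; the connection string, table, and column names are hypothetical, and the actual logic varied by requirement:

#!/bin/sh
# Create and run a simple PL/SQL procedure from a shell wrapper.
# $ORA_CONN (user/password@tns) and all object names are hypothetical.
sqlplus -s "$ORA_CONN" <<'SQL'
CREATE OR REPLACE PROCEDURE upd_customer_status (p_cutoff IN DATE) AS
  -- Cursor over active customers with no activity since the cutoff.
  CURSOR c_cust IS
    SELECT customer_id FROM customer_dim
    WHERE  last_txn_date < p_cutoff AND status = 'ACTIVE'
    FOR UPDATE OF status;
BEGIN
  FOR r IN c_cust LOOP
    UPDATE customer_dim SET status = 'INACTIVE'
    WHERE CURRENT OF c_cust;
  END LOOP;
  COMMIT;
END;
/
EXEC upd_customer_status(ADD_MONTHS(SYSDATE, -24));
EXIT;
SQL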

Environment: Informatica Multidomain MDM 9.1.0, IDQ, Informatica PowerCenter 9.1, Oracle 11g, SQL Server 2008, Sybase, XML, Linux, ERwin, shell scripts, SQL*Loader, SQL Assistant, Autosys, PC-based version control.

Confidential, North Quincy, MA

ETL/Informatica Developer

Responsibilities:
  • Worked with business analysts to understand the requirements of the project.
  • Prepared design documents for customer review.
  • Handled team management and administration, including resource planning and administrative activities, in compliance with Oracle policies and procedures.
  • Coordinated with the Informatica administrator to set up the environment and to move objects between repositories and folders.
  • Worked on Informatica Power Center tools - Source Analyzer, Target designer, Mapping Designer, Workflow Manager, Workflow monitor, Mapplet Designer and Transformation Developer.
  • Created the complex mappings, sessions and the workflows as per the Functional and Technical Specifications.
  • Extracted data from flat files (delimited and fixed-width CSV or text files), XML files, Oracle 11g, SQL Server, IBM mainframe, and Sybase, and loaded it into the Oracle data warehouse.
  • Worked extensively with transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Aggregator, XML, Java, Update Strategy, Lookup, Normalizer, and Stored Procedure.
  • Participated in business analysis and design discussions with customers and internal SMEs.
  • Implemented the Slowly Changing Dimensions methodology to keep track of historical data (an SCD Type 2 sketch follows this list).
  • Implemented Change Data Capture (CDC).
  • Designed complex mappings using connected, unconnected, and dynamic Lookups on source- and target-side tables, along with Filter, Expression, Aggregator, Normalizer, Joiner, and Router transformations to populate target tables.
  • Independently managed an implementation track/module to ensure timely, defect-free delivery of the solution.
  • Applied relational and dimensional data modeling techniques to design data models using ERwin.
  • Used Informatica Pushdown Optimization (PDO) for performance tuning, pushing transformation logic down to the target database to exploit the database's native performance.
  • Worked with the Ralph Kimball data warehousing methodology using a snowflake schema.
  • Worked with different kinds of Code migration like XML import and XML export.
  • Worked with PL/SQL procedures, functions, cursors, and triggers.
  • Worked on ETL design, development, administration, source-to-target mappings, data warehouse transformations, on-call support, troubleshooting, and documentation.
  • Implemented mapping parameters and variables for the reusability of the code.
  • Involved in Performance Tuning at various levels including Target, Source, Mapping and Session for large data files.
  • Worked on Oracle utilities, scheduler tools and versioning tools.
  • Worked within a waterfall methodology.
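
The Slowly Changing Dimension work above was implemented in Informatica mappings; the SQL below is only an illustrative sketch of SCD Type 2 expiry-and-insert logic, with all table, column, and sequence names hypothetical:

#!/bin/sh
# Illustrative SCD Type 2 logic in plain SQL; object names are hypothetical.
sqlplus -s "$ORA_CONN" <<'SQL'
-- Step 1: close out current rows whose tracked attributes changed in staging.
UPDATE customer_dim d
SET    d.eff_end_date = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND EXISTS (SELECT 1 FROM customer_stg s
            WHERE s.customer_id = d.customer_id
            AND   s.address    <> d.address);

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO customer_dim
  (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT customer_key_seq.NEXTVAL, s.customer_id, s.address,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1 FROM customer_dim d
                   WHERE d.customer_id = s.customer_id
                   AND   d.current_flag = 'Y');
COMMIT;
EXIT;
SQL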

Environment: Informatica PowerCenter 8.6.1, Oracle 11g, SQL Server 2008, Sybase, XML, Linux, ERwin, shell scripts, SQL*Loader, SQL Assistant, Autosys.

Confidential, New York

ETL/Informatica Developer

Responsibilities:

  • Participated in collecting user requirements.
  • Interacted with business representatives for requirement analysis and to define business and functional specifications.
  • Involved in drafting Software Requirement Specification for the project.
  • Coordinated with the Informatica admin to set up the environment and to move objects between repositories and folders.
  • Worked on Informatica Power Center tools - Source Analyzer, Target designer, Mapping Designer, Workflow Manager, Mapplet Designer and Transformation Developer.
  • Designed and created mappings, sessions, and workflows, and scheduled the workflows as per the functional and technical specifications.
  • Extracted data from Oracle 10g, Netezza, XML, flat files, and DB2, and loaded it into the Oracle warehouse.
  • Worked on batch scripting.
  • Designed mappings using connected/Unconnected Lookups, Filter, Expression, Aggregator, Update strategy, Joiner, Normalizer, Transaction Control, Source Qualifier, Union, Rank and Router transformations for populating target tables as per business requirements.
  • Configured index and data caches for caching transformations such as Rank, Lookup, Joiner, and Aggregator.
  • Worked with Change Data Capture (CDC).
  • Worked on Data staging.
  • Worked with shortcuts across shared and non-shared folders.
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Target Database.
  • Worked with scheduling and versioning tools.
  • Worked on Oracle utilities such as SQL*Loader (a sketch follows this list).
  • Analyzed session log files to resolve errors in mappings and managed session configuration.
  • Created, configured, scheduled, and monitored sessions and workflows, run on demand or at set times, using the Informatica PowerCenter Workflow Manager.
  • Worked on performance tuning at various levels including Target, Source, Mapping and Session for large data files.
  • Used mapping variables and parameters for the reusability of the code.
  • Maintained naming and warehouse standards for future application development, and created functional and technical specification documents.
  • Involved in designing snowflake schemas, star schemas, dimension tables, and fact tables.
  • Worked with surrogate keys, primary keys, foreign keys, triggers, and normalization/denormalization.
  • Worked within an agile methodology.
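
A sketch of a SQL*Loader load like the Oracle-utility work referenced above; the file layout, table, and connection string are hypothetical placeholders:

#!/bin/sh
# Load a comma-delimited flat file into a staging table with SQL*Loader.
# $ORA_CONN and all file/table/column names are hypothetical placeholders.
cat > billing.ctl <<'CTL'
LOAD DATA
INFILE 'billing_feed.csv'
APPEND INTO TABLE billing_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(account_id, bill_date DATE 'YYYY-MM-DD', amount, status)
CTL

sqlldr userid="$ORA_CONN" control=billing.ctl log=billing.log bad=billing.bad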

Environment: Informatica PowerCenter 8.6.0, Oracle 10g, DB2, PL/SQL, SQL*Loader, Toad.
