Senior Informatica Idq Developer Resume
West Chester, PA
SUMMARY
- 10+ years of experience in development, maintenance, support and enhancement using Informatica PowerCenter, Teradata, Informatica Data Quality (IDQ), Informatica Big Data Management (BDM), Oracle, Hadoop Hortonworks, and Informatica Data Validation Option (DVO).
- Have a clear understanding of Data Warehousing concepts with emphasis on ETL and life cycle development.
- Extensive experience in Extraction, Transformation and Loading of data into Enterprise Data Warehouses using Informatica PowerCenter 9.x, 8.x, 7.x.
- Strong experience in designing Workflows and Mappings, creating Sessions, and scheduling them using Informatica PowerCenter 9.x, 8.x, 7.x.
- Experience working with Teradata utilities such as BTEQ, FastExport and FastLoad to export and load data to/from different source systems, including flat files.
- Varied experience in designing and developing complex mappings using transformations such as Source Qualifier, Joiner, Aggregator, Router, Filter, Expression, Lookup, Sequence Generator, Update Strategy, and Normalizer.
- Experience in Debugging sessions and mappings, Performance Tuning of the Sessions and mappings, implementing the complex business rules, optimizing the mappings.
- Proficiency in data warehousing techniques for Data cleansing, Slowly Changing Dimension phenomenon, Surrogate key assignment.
- Extensive experience in Informatica Data Quality (IDQ).
- Implemented Exception Handling Mappings covering Data Quality, Data Profiling, data cleansing and data validation using IDQ 10.x/9.x.
- Experience in implementing Data Quality rules on Hive data sources using Informatica Big Data Edition 10.x.
- Experience in extraction of data from various Heterogeneous sources (Relational database, Flat Files, SAP Source Systems, XML Files) to load into Data Warehouse/Datamart targets.
- Experience in Implementing the Informatica MDM (Master Data Management) process for Customer and Product Master Data Implementation.
- Knowledge of mainframe technologies (COBOL, JCL, SQL, DB2, VSAM, FILE-AID, TSO/ISPF, CA7 (scheduler), ENDEVOR, XPEDITER, SPUFI, IBM utilities).
- Proficient in understanding business processes/requirements and translating them into technical requirements.
- Worked with DBMS like Oracle 9i/10g/11g, MS SQL Server.
- Experience in Database development skills using Oracle PL/SQL to write stored procedures, functions, packages and triggers.
- Extensive development, support and maintenance experience working in all phases of the Software Development Life Cycle (SDLC) especially in Data Warehousing and Business Intelligence.
- Prepare/Maintain documentation on all aspects of ETL processes, definitions and mappings to support knowledge transfer to other team members.
- Excellent interpersonal and communication skills, customer skills, documentation skills and desire to work in dynamic and challenging environments.
- Work exceptionally well in a team or individual environment (self-starter).
- Ability to learn new technologies and tools quickly.
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, Teradata, Informatica Data Quality (IDQ) 10.x/9.x, Informatica Big Data Edition 10.x, PowerExchange, Informatica MDM, Informatica DVO, Address Doctor
Databases: Teradata, Oracle 9i/10g/11g, DB2, SQL Server 2005/2008, Hive
Reporting Tools: Cognos, OBIEE
Data Access Tools: Teradata, TOAD, Oracle SQL Developer
Programming Languages: COBOL, JCL, SQL, PL/SQL, Unix Shell scripting
Version Control Tools: Subversion, SharePoint, GitHub
Scheduling Tools: Autosys, Robot, Informatica Job Scheduler, CA7, UC4
Operating Systems: MS DOS, Win 98/2000/XP/7/8, UNIX, Linux
Other Tools: WinSCP, Putty, JIRA, HP ALM, HP Quality Center
PROFESSIONAL EXPERIENCE
Confidential, West Chester, PA
Senior Informatica IDQ Developer
Responsibilities:
- Worked with business analysts and users to understand and analyze business requirements and translate them into Data Quality rules.
- Implemented Data Quality rules against Teradata, Oracle, SQL Server and Hive data sources.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
- Created reference tables, applications and workflows, and deployed them to the Data Integration Service for workflow execution.
- Developed reusable rules and mapplets for use across multiple mappings.
- Worked on Sorter, Filter, Expression, Consolidation, Match, and Address validator transformations.
- Worked on parameterization in mappings and workflows.
- Used UC4 as the job scheduler to run the deployed applications and their workflows on recurring schedules.
- Worked closely with the QA team to fix issues, completed tasks on schedule, and promoted code to production.
- Managed Production deployment tasks like creating SMOP in Deployment manager tool and coordinated with Deployment Team.
- Worked closely with Operations team to troubleshoot on Production issues and fix them on time with less Business impact.
- Sent daily Data Quality status reports to business users for the implemented rules.
Environment: Informatica Data Quality (IDQ) 9.6.1, Informatica Data Quality (IDQ) 10.1.1, Teradata 15.10, Oracle 11g, UC4, Windows, WinSCP, Toad, Big Data Edition 10.1.0, Hadoop Hortonworks, Hive 2.4
Confidential, Charlotte, NC
Senior Developer
Responsibilities:
- Analyzed business requirements, performed Impact Analysis, created technical design specifications, developed code, performed Code Deployment and provided production support.
- Developed various transformations such as Source Qualifier, Update Strategy, Lookup, Expression and Sequence Generator for loading data into target tables.
- Implemented the Change data capture (CDC) using the Informatica Power Exchange.
- Improved performance using Explain Plan, creating appropriate indexes, optimizing queries, and utilizing tablespaces and partitioning schemes.
- Scheduled the Workflows using the Job Scheduler Skybot.
- Responsible for identifying the missed records in different stages from source to target and resolving the issues.
- Extensively worked in the performance tuning for mappings and ETL procedures at both mapping and session level.
- Worked closely with database administrators and application development team(s) on the design and implementation of the database.
- Used Teradata BTEQ, FastLoad and FastExport utilities to build ETL logic and load data.
- Worked with the Informatica Data Quality (IDQ) toolkit for analysis and data cleansing.
- Worked on Exception Handling Mappings for Data Quality, Data Profiling, Data cleansing and data validation by using IDQ.
- Installed and configured Informatica MDM Hub Console, Hub Store, Cleanse and Match Server, Address Doctor, and Informatica PowerCenter applications.
- Created Landing tables, Staging tables, BO tables, Mappings, transformations for the MDM development.
- Set trust configuration and validation rules; created match rules and merge settings for the MDM process.
Environment: Informatica PowerCenter 9.1/9.5, Informatica PowerExchange, Teradata 15.0, Informatica Data Quality (IDQ) 9.5, Informatica DVO 9.5, Toad, Oracle 11g, Autosys, PL/SQL (Stored Procedures, Triggers, Packages), Informatica MDM, Address Doctor
Confidential
Senior Developer
Responsibilities:
- Created various transformations for loading data into the target database, e.g. Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator and Sequence Generator.
- Involved in Informatica administrative work such as creating Informatica folders, repositories and managing folder permissions.
- Analyzed source data and gathered requirements from the business users.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Coordinated with source system owners; monitored day-to-day ETL progress and performed maintenance.
- Extensively used various tasks like Session task, Event Wait task, Decision task, Email task, Command task.
- Performed migration, maintenance and support for Informatica interfaces/projects.
- Extensively involved in performance tuning on both the Informatica and database sides.
- Involved in Optimizing the Performance by eliminating Target, Source, Mapping, and Session bottlenecks.
- Created reference mappings by implementing SCD (Type 1 and Type 2).
- Extensively involved in writing SQL queries, creating PL/SQL functions, procedures and packages.
- Prepared technical specifications to develop Informatica ETL mappings to load data into various tables conforming to the business rules.
- Created mapping documentation for all the existing mappings and tuning of these mappings.
- Used Mapping Parameters and Variables, Reusable Transformations.
- Involved in creating Pre and Post Session SQL statements.
- Scheduled the Workflows using the Job Scheduler Autosys.
- Used Subversion to maintain different versions of the Informatica code and documents.
- Worked with the Informatica Data Quality (IDQ) toolkit for analysis and data cleansing.
- Worked on Exception Handling Mappings for Data Quality, Data Profiling, Data cleansing and data validation by using IDQ.
Environment: Informatica PowerCenter 8.6, PowerExchange, Subversion, Oracle 10g, SQL, HP Quality Center, Skybot, Unix Shell Scripting, Windows NT, Toad, Flat Files, Address Doctor, Informatica Data Quality (IDQ)
Confidential
Informatica Developer
Responsibilities:
- Involved in the design, development and implementation of the Enterprise Data Warehouse (EDW) process and data marts.
- Worked on the analysis, design and development to ensure the successful implementation of the data loading processes.
- Set up relational connections and connection parameters for sources and targets, including source and target paths.
- Involved in analysis and performance of mappings/sessions. Increased performance by tuning the transformations and discussing database issues with the DBA.
- Worked with Data Extraction from Relational DBMS and Flat Files.
- Developed Mappings using Active Transformations such as Aggregator, Filter, Joiner, Router, Sorter, and Union Transformations.
- Also used Passive Transformations such as Expression, Lookup, and Sequence Generator in various mappings.
- Created and used reusable Transformations for easier maintenance of the data.
- Integrated data cleansing processes within the Mappings. Performed debugging of Mappings using the Debugger for failed Mappings.
- Created Workflows and Sessions to carry out loading process into the target.
- Created and used Workflows using different Tasks like Session, Command and Timer.
- Worked with Workflow and Session Log files to troubleshoot the encountered errors.
Environment: Informatica PowerCenter 8.1, Oracle 10g, Teradata, TOAD, Unix shell scripting
Confidential
Informatica Developer
Responsibilities:
- Analyzed sources and targets, transformed and mapped the data, and loaded it into targets using Informatica.
- Worked with Data Extraction from Relational DBMS and Flat Files.
- Developed Mappings using Active Transformations such as Aggregator, Filter, Joiner, Router, Sorter, and Union Transformations.
- Also used Passive Transformations such as Expression, Lookup, and Sequence Generator in various mappings.
- Created and used reusable Transformations for easier maintenance of the data.
- Integrated data cleansing processes within the Mappings. Performed debugging of Mappings using the Debugger for failed Mappings.
- Tested and maintained Informatica mappings.
- Involved in Integration test cycles.
- Created tables, views, synonyms and sequences. Used SQL Loader for data loading.
- Optimized queries using rule-based and cost-based optimizer approaches.
Environment: Informatica Power Center 7.2, Oracle 9i/10g, SQL/PLSQL, TOAD and UNIX.