
Informatica IDQ Lead Resume


Santa Ana, CA

SUMMARY

  • Informatica Power Center/IDQ Developer and Data Warehouse Consultant with over 9 years of IT experience in the design, development, and implementation of a Core ERP conversion project from SAP to Oracle EBS, using an ETL toolset of Informatica Power Center, IDQ, Oracle and SQL Server databases, UNIX, and scheduling tools such as UC4.
  • Industry/Domain Knowledge - Oil & Gas, Cloud Computing, Financial Banking, Healthcare, Payroll Applications, and Human Resource Development domains.
  • Experienced in migrating code from the Dev repository to the Stage repository and ultimately to Production. Well versed in writing technical/functional mapping specification documents along with unit test protocols for future development.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the monitoring capabilities of IDQ 9.6.1.
  • Experienced in data profiling, analysis, standardization, cleansing, integration, scorecarding, and handling of reference data from various source systems using the Informatica Data Quality (IDQ) toolkit. Worked with Address Doctor and distance algorithms (Bigram, Jaro, Edit, Hamming, and Reverse distance) in IDQ.
  • Oracle/SQL development experience in SQL, PL/SQL, procedures, functions, and packages for building a framework of pre- and post-load processes, including but not limited to index rebuilds, database statistics collection, partition creation, table and partition truncates, data de-duplication, and data purging (an illustrative sketch follows this list).
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Experienced in the use of Agile Methodology, including Test-Driven Development and Scrum with tight deadlines on each Sprint.
  • Technical documentation experience in writing Technical specifications using the available Functional specifications, such as Conversion Execution Protocol (Source to Target Mappings), ETL Design document, Unit Test Protocols, Deployment Checklist, and Production support run book. Good experience in providing duration estimates for the WBS activities involving ETL effort.
  • Team player working within and across cross-functional teams with solid hands-on experience in advanced reports development and Microsoft SQL development.
  • Excellent communication and interpersonal skills, with strong analytical ability to solve problems.
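
A minimal sketch of the kind of pre- and post-load PL/SQL framework procedure described above; the table, index, and procedure names here are hypothetical placeholders rather than code from any specific engagement.

```sql
-- Hypothetical pre-load step: truncate only the partition being reloaded.
CREATE OR REPLACE PROCEDURE pre_load_prep (
    p_table_name     IN VARCHAR2,
    p_partition_name IN VARCHAR2
) AS
BEGIN
    EXECUTE IMMEDIATE 'ALTER TABLE ' || p_table_name ||
                      ' TRUNCATE PARTITION ' || p_partition_name;
END pre_load_prep;
/

-- Hypothetical post-load step: rebuild unusable indexes and refresh stats.
CREATE OR REPLACE PROCEDURE post_load_finish (
    p_table_name IN VARCHAR2
) AS
BEGIN
    FOR idx IN (SELECT index_name
                  FROM user_indexes
                 WHERE table_name = UPPER(p_table_name)
                   AND status = 'UNUSABLE') LOOP
        EXECUTE IMMEDIATE 'ALTER INDEX ' || idx.index_name || ' REBUILD';
    END LOOP;

    DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => UPPER(p_table_name));
END post_load_finish;
/
```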

TECHNICAL SKILLS

ETL Tools: Informatica (Developer/Power Exchange/Power Center 9.6.1, 9.1, 8.6, 8.5, 8.1.1)

RDBMS: Oracle 11g/10g/9i, Teradata 12, Microsoft SQL Server 2008 R2, SQL Server Management Studio, PL/SQL, Sybase.

Operating Systems: UNIX (HP-UX, Sun Solaris, Linux), Windows NT/95/98/2000, AIX

Data Modeling: Erwin r7.1/7.2, ER Studio V8.0.1

Other Tools: Toad for Oracle 12.0/11.2, Teradata SQL Assistant, WinSQL, Siebel and Visual Basic 5.0

Reporting Tools: OBIEE 11g, MSRA 02.07.0000, Business Objects XI R2, Crystal Reports 9, 10, and XI

PROFESSIONAL EXPERIENCE

Informatica IDQ Lead

Confidential, Santa Ana, CA

Responsibilities:

  • Responsible for designing, testing, and deploying the data quality procedures on the source data, which resides in a SQL Server database.
  • Worked on Agile methodology (Scrum), participated in daily/weekly team meetings, guided two groups of six developers in Informatica Power Center/Data Quality (IDQ), peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.
  • Designed the solution for the data quality and data cleansing tracks of the MDM project.
  • Gathered and documented MDM application, conversion, and integration requirements.
  • Defined and designed the data acquisition, transformation, and data cleansing approach for the MDM implementation.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
  • Performed column-level profiling of the source data (largely insurance/finance data pertaining to policies and borrowers) using the IDQ Analyst tool to help identify possible cleansing areas in the data.
  • Used the IDQ 9.6.1 Developer tool to complete initial data profiling and to match and remove duplicate data.
  • Built Logical Data Objects (LDOs) and developed various mappings, mapplets, and rules in Informatica Data Quality (IDQ) based on requirements to profile, validate, and cleanse the data. Performed column, primary key, and foreign key profiling using IDQ 9.6.1.
  • Identified and eliminated duplicate records using the IDQ 9.6.1 match components Edit Distance, Jaro Distance, and Mixed Field Matcher; this enables a single view of the customer and helps control mailing-list costs by preventing duplicate mail pieces (a simplified SQL analogue of the de-duplication step follows this list).
  • Carried out assigned data quality tasks using various IDQ transformations such as Parser, Labeler, Standardizer, Match, and Consolidation.
  • Created mapplets when designing the data quality rules so that commonly used rules, such as upper-casing or trimming blank spaces, could be reused.
  • Made use of Logical Data Objects (LDOs), which act as virtual mappings that allow filters to be applied and can be reused across multiple profiles with the LDO as the source object.
  • Used reference tables to standardize data and to serve as lookup tables instead of hard-coding values in the code.
  • Worked in a fast-paced Agile/Scrum environment, delivering useful results in short iterations and embracing change even late in the development cycle.
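
The fuzzy matching itself (Edit/Jaro distance, Mixed Field Matcher) was done inside IDQ; the T-SQL below is only a simplified analogue of the final de-duplication step against the SQL Server source, with hypothetical table and column names.

```sql
-- Keep the most recently updated record per (borrower, address) match key and
-- delete the rest. stg_borrower and its columns are placeholder names.
WITH ranked AS (
    SELECT borrower_name,
           address_line1,
           last_updated,
           ROW_NUMBER() OVER (
               PARTITION BY UPPER(LTRIM(RTRIM(borrower_name))),
                            UPPER(LTRIM(RTRIM(address_line1)))
               ORDER BY last_updated DESC
           ) AS rn
    FROM   stg_borrower
)
DELETE FROM ranked
WHERE  rn > 1;
```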

Environment: Informatica Developer Client 9.6.1, SQL Server 2012, SQL/T-SQL, Flat files, XML

Data Integration Lead

Confidential, Houston, TX

Responsibilities:

  • Involved in the conversion of data from SAP R/3 and other legacy Source systems to Oracle EBS using Informatica Power Center 9.1.
  • Played role as a BA and worked extensively with subject matter experts (SMEs) and Business Users in understanding and documenting their requirements. Reviewed functional specifications and confirmed that all business requirements are addressed. Maintained a centralized repository of all documents with version control on SharePoint.
  • Worked with the Informatica Data Quality 9.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.1.
  • Utilized IDQ 9.1 to complete initial data profiling and to match and remove duplicate data.
  • Used algorithms such as Bigram Distance, Edit Distance, Jaro Distance, Reverse Distance, and Hamming Distance to determine the threshold values for identifying and eliminating duplicate datasets and for validating, profiling, and cleansing the data. Created/modified reference tables for valid data using the IDQ Analyst tool.
  • Used Address Doctor to validate addresses and performed exception handling, reporting, and system monitoring. Created rules as mapplets, Logical Data Objects (LDOs), and workflows, deployed the workflows as applications to run them, and tuned the mappings for better performance.
  • Carried out assigned data quality tasks using various IDQ transformations such as Parser, Match, Labeler, Address Validator, and Standardizer.
  • Performed ETL activity along with functional analysis of wide variety of ERP data (RICEW Objects) involving Accounting to Close (ATC), Human Resource (HTR), Projects to Close (PTC) and Procurement to Pay (PTP).
  • Designed and developed mappings, transformations, sessions, workflows, and ETL batch jobs to load data from source to target using Informatica and Oracle PL/SQL.
  • Created SAP ABAP test documents and worked closely with SAP ABAP developers to promote the Informatica SAP code to the SAP QA4, QAS, and Production environments ahead of the CAB meetings.
  • Worked with SAP R/3 adaptors to connect to SAP system using Informatica.
  • Generated various ABAP code/documents that were promoted to Test and Production environments in SAP ERP system.
  • Developed a Conceptual model using Erwin based on requirements analysis.
  • Developed normalized Logical and Physical database models to design OLTP system for insurance applications.
  • Documented source-to-target mappings in the form of technical specifications (Conversion Execution Protocol) and developed unit test plans.
  • Unit tested processes to ensure proper functioning and collaborated with other team members during the system and integration testing phases to ensure the ETL code worked correctly as part of the overall framework.
  • Migrated Informatica code from Development to QA and Production using the Informatica migration tool.
  • Created Informatica mappings using various transformations like SAP BAPI/RFC, Source Qualifier, Expression, Lookup, Aggregator, Stored Procedure, Update Strategy, Joiner, Filter and Router.
  • Adhered to Informatica SDLC, policies, procedures, and industry best practices with respect to system design, architecture, naming, and coding standards.
  • Worked closely with business application teams, business analysts, data architects and database administrators to ensure the ETL solution meets the Business requirements.
  • Helped business users reconcile the source data (SAP) with the converted data (Oracle EBS).
  • Made performance-improvement changes to the existing Informatica code and tuned queries using explain plans (a minimal example follows this list). Well versed in database partitioning and in improving Informatica mapping performance through partitioning.
  • Scheduled the Informatica workflows using the UC4 scheduler and troubleshot any related issues.
  • Developed shell scripts for job automation that generate a log file for every job.
  • Involved in writing UNIX shell scripts to run and schedule batch jobs.
  • Assigned tasks to the offshore team and validated/tested their work.
  • Developed project plans with the manager and team members to ensure ETL solutions were executed and delivered to plan with good quality, and mentored junior members of the team.
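
A minimal example of the explain-plan check referenced above; the query and staging table names are hypothetical.

```sql
-- Capture and display the execution plan for a load query before changing
-- indexes or partitioning. Table names are placeholders.
EXPLAIN PLAN FOR
SELECT h.invoice_num, SUM(l.line_amount)
FROM   ap_invoice_headers_stg h
JOIN   ap_invoice_lines_stg   l ON l.invoice_id = h.invoice_id
GROUP  BY h.invoice_num;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```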

Environment: Informatica Power Center, Informatica Developer 9.1/9.0.1, Oracle 11g, OBIEE 11g, SAP R/3, UC4, Erwin r7.1/7.2, ER Studio V8.0.1

Data Warehouse Developer

Confidential, Redmond, WA

Responsibilities:

  • Analyzed business process workflows and developed ETL procedures to move data from source to target systems.
  • Extensive use of Informatica Power Center tools Designer, Workflow Manager, Workflow Monitor, Debugger and Reusable transformations.
  • Followed Agile methodology, including test-driven development and Scrum, with tight deadlines on each sprint.
  • Extracted data from flat files, SQL Server, and source systems such as Azure databases, Office 365, and MS Sales, and loaded it into a Microsoft SQL Server database (data warehouse) using Informatica Power Center as the ETL tool.
  • Developed complex T-SQL queries and designed SSIS packages (SQL 2008) to load the data into warehouse.
  • Designed SSIS packages using several transformations to perform data profiling, data cleansing, and data transformation.
  • Migrated ETL objects from the development server to the test servers.
  • Implemented Slowly Changing Dimension (SCD) type 2 to maintain historical data, utilizing Lookup and Update Strategy transformations to look up values from different tables and update the slowly changing dimensions (a simplified SQL sketch of this logic follows the list).
  • Built efficient SSIS packages for processing fact and dimension tables with complex transforms and type 1 and type 2 changes.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update Strategy
  • Involved in creating, scheduling and running a number of sessions in complex workflows.
  • Review and participate in team meetings involving Revenue STM mappings for Fact and Dimension Tables pertaining to the Microsoft Finance data being available in the Cloud platform.
  • Involved in migrating the data from Informatica 8.6.1 to Informatica 9.1.0
  • Used UC4 scheduler to schedule and automate the workflows to load on daily/weekly/monthly basis.
  • Created stored procedures using PL/SQL and implemented them through the Stored Procedure transformation for maintenance tables.
  • Used Mapping Wizards in creating type2 slowly changing dimensions tables to facilitate maintenance of history.
  • Installed patches and hotfixes in the Informatica integration environment.
  • Performed Informatica admin work, such as installing EBF patches and upgrading from version 9.0.1 to 9.1.
  • Coordinated with Informatica Support on internal issues.
  • Extensively used version control while making changes in Microsoft SharePoint, Visual Studio, Informatica Power Center, and other tools.
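
The SCD type 2 logic above was implemented in Informatica mappings (Lookup plus Update Strategy); the T-SQL below only sketches the equivalent expire-and-insert pattern, with hypothetical table and column names.

```sql
-- 1. Expire the current dimension rows whose tracked attributes changed in staging.
UPDATE d
SET    d.is_current = 0,
       d.effective_end_date = GETDATE()
FROM   dim_customer d
JOIN   stg_customer s ON s.customer_nk = d.customer_nk
WHERE  d.is_current = 1
  AND  (s.customer_name <> d.customer_name OR s.region <> d.region);

-- 2. Insert a new current version for customers with no current row
--    (both brand-new customers and the rows expired in step 1).
INSERT INTO dim_customer (customer_nk, customer_name, region,
                          effective_start_date, effective_end_date, is_current)
SELECT s.customer_nk, s.customer_name, s.region, GETDATE(), NULL, 1
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_nk = s.customer_nk AND d.is_current = 1
WHERE  d.customer_nk IS NULL;
```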

Environment: Informatica Power Center 9.1/9.0.1, SSIS 2008 R2, CA Erwin Model Navigator 7.3.11, SQL Server 2008 R2, SQL/T-SQL, Microsoft OLAP SQL Server, MSRA 02.07.0000, Microsoft Visual Studio 2010, Microsoft SharePoint 2010, Flat files, XML

Data Warehouse Consultant

Confidential, Bensalem, PA

Responsibilities:

  • Responsible for gathering the user requirements and working with business analysts to acquire the functional and technical specifications.
  • Analyzed the source data coming from Oracle ERP system to create the Source to Target Data Mapping.
  • Designed and developed SSIS packages, stored procedures, configuration files, tables, views, and functions, and implemented best practices to maintain optimal performance.
  • Utilized SSIS (SQL Server Integration Services) to create a Data Warehouse for reporting.
  • Used SQL Server 2005 Integration Services (SSIS) transformations in the data flow of a package to aggregate, merge, distribute, and modify data.
  • Interacted with the Data Architect in Creating and modifying Logical and Physical data model.
  • Worked with DBAs to optimize the major SQL queries as part of performance tuning.
  • Extensively developed UNIX Shell scripts to transfer and archive account files.
  • Developed UNIX Shell Scripts and SQLs to get data from Oracle tables before executing Informatica workflows.
  • Maintained versions of code and supporting documents in Clear Case (UNIX).
  • Responsible for identifying the bottlenecks and tuning the performance of the Informatica mappings/sessions.
  • Used parallel processing capabilities, session partitioning, and target table partitioning utilities (an illustrative partitioned target-table DDL follows this list).
  • Designed and developed mapping using various Informatica transformations like Source Qualifier, Sequence Generator, Expression, Lookup, Aggregator, Router, Rank, Filter, Update Strategy and Stored Procedure.
  • Worked with Tidal Enterprise Scheduler to schedule jobs and batch processes.
  • Used Informatica Repository Manager to create Repositories and Users and to give permissions to users.
  • Used Autosys and Informatica Scheduler to schedule jobs for the files and other sources to be extracted and load to target EDW on a daily/weekly/monthly basis.
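
An illustrative range-partitioned Oracle target table of the kind loaded with partitioned sessions; the table name, columns, and partition boundaries are placeholders.

```sql
CREATE TABLE fct_account_txn (
    txn_id      NUMBER        NOT NULL,
    account_id  NUMBER        NOT NULL,
    txn_date    DATE          NOT NULL,
    txn_amount  NUMBER(18,2)
)
PARTITION BY RANGE (txn_date) (
    PARTITION p_2009_q1 VALUES LESS THAN (DATE '2009-04-01'),
    PARTITION p_2009_q2 VALUES LESS THAN (DATE '2009-07-01'),
    PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);
```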

Environment: Informatica Power Center 8.6.1/8.5, SSIS 2005, UNIX shell script, Oracle 11g, SQL Server 2005/2008, Flat files, Autosys, Erwin 4.0, TOAD 8.6.

Informatica Developer

Confidential, Plainfield, NJ

Responsibilities:

  • Involved in the design and development of the data warehouse environment; liaised with business users and technical teams, gathering requirement specification documents and presenting and identifying data sources, targets, and report generation.
  • Designed mappings using Informatica Designer, which populated the data into the target Star Schema on the Oracle instance.
  • Set up the data mart on the Oracle database by running the SQL scripts from the Erwin designer.
  • Translated business processes into Informatica mappings for building data marts.
  • Involved in the migration process across the Development, Test, and Production environments.
  • Wrote stored procedures and triggers in the Oracle database for managing consistency and referential integrity across the data mart (an illustrative trigger sketch follows this list).
  • Tuned the mappings for optimum performance, dependencies, and batch design.
  • Scheduled and ran the extraction and load processes and monitored sessions using Informatica Workflow Manager.
  • Used Workflow Manager for creating, validating, testing, and running the sequential and concurrent sessions, scheduling them to run at specified times, and reading data from different sources and writing it to the target databases.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
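
An illustrative sketch of the kind of referential-integrity trigger mentioned above; the table, column, and trigger names are hypothetical.

```sql
-- Reject fact rows that reference a customer key missing from the dimension.
CREATE OR REPLACE TRIGGER trg_fct_sales_check_cust
BEFORE INSERT OR UPDATE OF customer_key ON fct_sales
FOR EACH ROW
DECLARE
    v_cnt NUMBER;
BEGIN
    SELECT COUNT(*)
      INTO v_cnt
      FROM dim_customer
     WHERE customer_key = :NEW.customer_key;

    IF v_cnt = 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
            'customer_key ' || :NEW.customer_key || ' not found in dim_customer');
    END IF;
END;
/
```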

Environment: Erwin, Informatica Power Center 8.1, Cognos Impromptu, Transformer, MS SQL Server, Oracle 9i/10g, SQL, PL/SQL, SQL*PLUS, SQL*Loader, TOAD, Import/Export Utilities, Shell Scripts.

Data Warehouse Associate

Confidential, Skillman, NJ

Responsibilities:

  • Analysis of Source requirements in existing OLTP systems and Identification of required Dimensions and Facts from the Database.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
  • Parsed high-level design specs into simple ETL coding and mapping standards. Designed the mapping document, which serves as a guideline for ETL coding.
  • Extensively used Informatica to load data from Flat Files to Oracle and then to EDW.
  • Created mappings using the transformations like Source qualifier, Aggregator, Expression, lookup, Router, Filter, Update Strategy, Joiner, and Stored procedure transformations
  • Worked with various Informatica Power Center tools - Source Analyzer, Mapping Designer and Workflow Manager.
  • Created reusable transformations and Mapplets and used them in complex mappings.
  • Fine-tuned Transformations and mappings for better performance.
  • Used Informatica Designer to design mappings extracting data from flat files that were delivered to the Decision Support System (DSS).
  • Wrote stored programs (procedures and functions) to perform data transformations and integrated them with the Informatica programs and the existing application.
  • Used workflow Manager for Creating, Validating, Testing and Running the sequential and concurrent Batches and Sessions, and scheduled them to run at a specified time.
  • Performed unit testing and Involved in tuning the Session and Workflows for better performance.

Environment: Informatica Power Center 6.2.1/7.1.1, Workflow Manager, Workflow Monitor, Erwin 4.0/3.5.2, TOAD, PL/SQL, Flat files, Oracle 8i
