
Sr. Informatica Developer / IDQ & MDM Developer Resume


Columbus, Ohio

SUMMARY

  • Around 9 years of IT experience in data warehousing, database design and ETL processes across finance, telecom, manufacturing and health care industries.
  • Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large - scale Data warehouses using Informatica Power Center.
  • Worked extensively on ETL processes using Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality (IDQ) 9.6.1, Informatica MDM and Informatica B2B.
  • Extensively used ETL methodologies for supporting Data Extraction, Transformation and Loading process, in a corporate-wide-ETL solution using Informatica Power Center.
  • Extensively worked with Informatica Designer, Workflow Manager and Workflow Monitor for data loads.
  • Experience working with cloud computing on the Salesforce.com platform.
  • Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer.
  • Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in E-commerce software, Utility, Pharmaceutical, Health Care, Insurance, Financial and Manufacturing industries.
  • Experience in development and maintenance of SQL, PL/SQL, Stored procedures, functions, analytic functions, constraints, indexes and triggers.
  • Experienced in IDQ (9.x, 9.5.1), handling LDOs and PDOs and using transformations such as Standardizer, Labeler, Parser and Address Validator to cleanse and profile incoming data.
  • 8 years of experience using different versions of the Oracle database (11g/10g/9i/8i).
  • Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experience with the UNIX command line and Linux.
  • Proficient with Informatica Data Quality (IDQ) for data cleansing and standardization in the staging area.
  • Experience in creating batch scripts in DOS and Perl Scripting.
  • Experience in ETL development process using Informatica for Data Warehousing, Data migration and Production support.
  • Experience in both Waterfall and Agile SDLC methodologies.
  • Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) - Star/Snowflake schema, SCD, Surrogate keys and Normalization/De-normalization.
  • Data modeling experience in creating Conceptual, Logical and Physical Data Models using ERwin Data Modeler.
  • Experience with TOAD, SQL Developer database tools to query, test, modify, analyze data, create indexes, and compare data from different schemas.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Worked on Slowly Changing Dimensions (SCD's) Types -1, 2 and 3 to keep track of historical data.
  • Knowledge of change data capture tools such as Informatica PowerExchange (formerly PowerConnect) to capture changed data.
  • Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change data capture (CDC).
  • Experience in integration of various data sources like Oracle, DB2, Flat Files and XML Files into ODS and good knowledge on Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
  • Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
  • Hands on experience in MDM development.
  • Involved in the designing of Landing, Staging and Base tables in Informatica MDM.
  • Created MDM mapping and configured match and merge rules to integrate the data received from different sources.
  • Optimized the Solution using various performance-tuning methods (SQL tuning, ETL tuning (i.e. optimal configuration of transformations, Targets, Sources, Mappings and Sessions), Database tuning using Indexes, partitioning, Materialized Views, Procedures and functions).
  • Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
  • Extensive knowledge in all areas of Project Life Cycle Development.
  • Strong analytical, verbal, written and interpersonal skills.

TECHNICAL SKILLS

Databases: Oracle 10g/9i/11i/R12, DB2, MS SQL server 7.0/2000/2005/2008, MS Access 2000/2005, Teradata, MySQL

Languages: Transact-SQL, PL/SQL, HTML, C, C#, Perl, Java

Operating Systems: Windows, Linux, Unix, MS-DOS, Sun Solaris.

OLAP/Reporting Tools: SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), SharePoint MOSS 2007, Business Objects 6.x, Cognos Framework Manager

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x, Informatica PowerExchange, Informatica Data Quality Suite 9.6, Informatica MDM, SQL Server Integration Services (SSIS)

Data Modeling Tools: Microsoft Visio

SQL Server Tools: SQL server Management Studio, SQL server Query Analyzer, SQL server mail service, DBCC, BCP, SQL server profiler

Web Technologies: MS FrontPage, MS Outlook Express, FTP, TCP/IP

Other Tools: Microsoft Office, Visual Basic 6

Scheduling Tools: Tidal, Autosys, Windows Scheduler

Data Quality Tools: Informatica Analyst, Informatica Data Quality, Informatica Developer

MDM Tools: Nextgate, Informatica MDM

PROFESSIONAL EXPERIENCE

Confidential, Columbus, Ohio

Sr Informatica Developer/IDQ & MDM Developer

Responsibilities:

  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, reporting and monitoring.
  • Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
  • Identified and eliminated duplicates in datasets through IDQ components such as Edit Distance, Jaro Distance and Mixed Field Matcher; this enables a single view of customers and helps control mailing-list costs by preventing multiple pieces of mail.
  • Responsible for Unit and Integration testing of Informatica sessions, batches and the target data.
  • Scheduled workflows to pull data from the source databases at weekly intervals to maintain the most current and consolidated data.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.0.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Implemented various SSIS packages having different tasks and transformations and scheduled SSIS packages.
  • Created user variables, property expressions, script tasks in SSIS.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Developed and maintained ETL (Extract, Transform and Load) mappings to extract data from multiple source systems such as Oracle, XML, SQL Server and flat files, and loaded it into Oracle.
  • Performed Informatica MDM design and implementation; Informatica Data Quality (IDQ 9.6.1) was the tool used for data quality measurement.
  • Work on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Oracle.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Experienced with Informatica Power Exchange for Loading/Retrieving data from mainframe systems.
  • Building cubes using Confidential schema workbench to design analysis and dashboards, and designing interactive reports with Confidential report designer.
  • Identifying the existing system of financial planning.
  • Responsible for creating the Product data model in Erwin and importing into Informatica MDM Hub Console.
  • Managed all development and support efforts for the Data Integration/Data Warehouse team.
  • Performance tuning heavy queries and optimizing Informatica MDM jobs.
  • Expertise with integration technologies and processes, enterprise data warehouse implementations and reporting tools, integrating multiple legacy and strategic systems.
  • Developed accurate development estimates and drafted project resource plans.
  • Designing solution using Informatica PIM technology.
  • Designed a user interface for data extraction and performed ETL of legacy data into a SQL Server database.
  • Experience in Active VOS workflow design and development.
  • Developed software data-flow specifications and interfaces between the ERP, website and PIM.
  • Introduced and implemented an agency wide Data governance program.
  • Customize the EPM product based on the requirement and existing files.
  • Developed ETL process using Confidential PDI to extract the data.
  • Designed the source-to-target mappings and was involved in designing the selection criteria document.
  • Wrote BTEQ scripts to transform data; used the Teradata utilities FastLoad, MultiLoad and TPump to load data.
  • Responsible for manually starting and monitoring production jobs based on business users' requests.
  • Investigated production issues and resolved them in a timely manner.
  • Developed Informatica process to replace stored procedure functionalities and provide a time effective and high data quality application to the client.
  • Analyzed business requirements and created ETL logic to extract data from flat files coming from manufacturing sites in different geographic regions and load it into the data warehouse.
  • Prepared ETL Specifications and design documents to help develop mappings.
  • Created Mappings for Historical and Incremental loads.
  • Assessed the Netezza environment for implementation of the ETL solutions.
  • Collaborated with software architects to ensure alignment of the Netezza environment.
  • Used Jenkins for deploying in different environments.
  • Configured and maintained source code repositories in Clearcase, GIT.
  • Worked on staging the data into work tables, cleanse, and load it further downstream into dimensions using Type 1 and Type 2 logic and fact tables which constitute the data warehouse.
  • Worked with PMCMD to interact with the Informatica server from command mode and execute shell scripts.
  • Project based on Agile SDLC methodology with 2 weeks of software product release to the business users.
  • Take part in daily standup and scrum meetings to discuss the project lifecycle, progress and plan accordingly, which is the crux of Agile SDLC.
  • Provide post release/production support.
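
The pmcmd-driven job starts mentioned above can be wrapped in a small shell script. This is a minimal sketch in dry-run mode: the service, domain, folder, workflow and credential names are placeholders, and the assembled command is echoed for review rather than executed against a live Integration Service.

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper for starting an Informatica workflow.
# All names below (INT_SVC, DOMAIN, FOLDER, WF_NAME, credentials)
# are illustrative placeholders, not values from the resume.
INT_SVC="IS_DW_PROD"
DOMAIN="Domain_DW"
FOLDER="DW_LOADS"
WF_NAME="wf_stg_to_dim_load"

# Assemble the pmcmd startworkflow invocation; -wait blocks until
# the workflow finishes so the exit status can be checked.
CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -u pmuser -p pmpass -f $FOLDER -wait $WF_NAME"

# In production this would run $CMD and branch on its exit status;
# here we only print it (dry run).
echo "$CMD"
```

In a real deployment the echo would be replaced by executing the command and testing `$?` before kicking off downstream jobs.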

Environment: Informatica Power Center 10.0, IDQ 9.6.1, Informatica MDM, Oracle Database 11g, SQL server, Toad for Oracle, Unix Shell scripts, Teradata.

Confidential, Denver, CO

Sr Informatica Developer/IDQ & MDM Dev

Responsibilities:

  • Developed mappings, reusable objects, transformations and mapplets using the Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 9.6.
  • Worked with the Informatica Data Quality 9.5.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, scorecards, reporting and monitoring.
  • Extensively used DQ transformations such as Address Validator, Exception, Parser and Standardizer.
  • Solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Designed and developed Complex mappings like Slowly Changing Dimensions Type 2 (Time Stamping) in the mapping designer to maintain full history of transactions.
  • Involved in Data Loading Sequence and Populated Data into Staging Area and Warehouse with Business Rules.
  • Integrated IDQ mappings as cleanse functions in Informatica MDM through IDQ web service applications using IDQ cleanse adapters. Extracted source definitions from various relational sources such as Oracle, XML and flat files.
  • Extensively used ETL to load Flat files, XML files, Oracle and legacy data as sources and Oracle, Flat files as targets.
  • Created Sessions and managed the Workflows using various tasks like Command, Decision, Event wait, counter, Event raise, Email using Workflow Manager.
  • Extensively used the Informatica Debugger for debugging the mappings.
  • Created Mappings to load data using various transformations like Source Qualifier, Sorter, Lookup, Expression, Router, Joiner, Filter, Update Strategy and Aggregator transformations.
  • Worked specifically with the Normalizer transformation by converting the incoming fixed-width files to COBOL copybooks and using the Normalizer transformation to normalize the data.
  • Worked with Lookup Dynamic caches and Sequence Generator cache.
  • Created Reusable Transformations and Mapplets to use in Multiple Mappings and also worked with shortcuts for various informatica repository objects.
  • Identified and eliminated duplicates in datasets through IDQ components.
  • Used Teradata utilities like FastLoad, MultiLoad and Teradata SQL Assistant.
  • Worked on Master Data Management (MDM), Hub Configurations (SIF), Data Director, extract, transform, cleansing, loading the data onto the tables.
  • Design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages in MDM. Integrated IDQ process in MDM.
  • Loaded data files coming from external vendors onto the Teradata EDW using the MultiLoad and FastLoad utilities.
  • Worked on implementation of Profiling, Score Card, Classifier models, Probabilistic models, Human Task and Exception record management as part of the IDQ process.
  • Hands-on with Informatica MDM and proficient with its various stage and base database objects.
  • Worked with Informatica PowerExchange to pull changed data in the form of condense files and load it into Teradata tables using TPump import.
  • Created parameter files with Global variables.
  • Extensively worked with Korn-Shell scripts for parsing and moving files and even for re-creating parameter files in post-session command tasks.
  • Profile files and shell scripts were used for recreation of dynamic parameter files.
  • Scheduling of Informatica workflows using Tidal Scheduler.
  • Migration of Informatica code from DEV to TEST environments in Informatica by creating deployment groups, folders, applying labels, creating queries in the Informatica Repository Manager.
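
The parameter-file recreation done in the post-session shell scripts above can be sketched as follows. The folder, workflow, session and parameter names are illustrative placeholders; the script simply regenerates a PowerCenter-style parameter file with a global section and a per-session section before the next run.

```shell
#!/bin/sh
# Sketch: recreate an Informatica parameter file dynamically.
# Folder (DW_LOADS), workflow (wf_daily_load), session and $$ parameter
# names are placeholder examples, not values from the resume.
PARM_FILE="wf_daily_load.parm"
LOAD_DATE=$(date +%Y-%m-%d)

# Unquoted heredoc so $LOAD_DATE expands; \$\$ emits a literal $$,
# the prefix PowerCenter uses for mapping parameters/variables.
cat > "$PARM_FILE" <<EOF
[Global]
\$\$LOAD_DATE=$LOAD_DATE

[DW_LOADS.WF:wf_daily_load.ST:s_m_stg_customer]
\$\$SRC_DIR=/data/inbound
\$\$TGT_SCHEMA=DW_STG
EOF

echo "Recreated $PARM_FILE"
```

A post-session command task would call a script like this so the next workflow run picks up a fresh load date without manual edits.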

Environment: Informatica PowerCenter 9.6.1/9.5.1, Informatica IDQ 9.5.1, Informatica MDM, IDD, UNIX, Teradata 13.0, Shell Scripts, SQL Server

Confidential, Atlanta, GA

Sr Informatica Developer

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica PowerCenter.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
  • Developed Informatica ETL mappings to load data into the staging area; extracted data from mainframe files and databases and loaded it into the Oracle 11g target database.
  • Created workflows and worklets for Informatica mappings.
  • Worked on Master Data Management (MDM), Hub Configurations (SIF), Data Director, extract, transform, cleansing, loading the data onto the tables.
  • Design, develop, test and review & optimize Informatica MDM and Informatica IDD Applications.
  • Involved inmatch/merge and match rules to check the effectiveness of MDM process on data.
  • Worked on SQL coding to override the generated SQL queries in Informatica.
  • Involve in Unit testing for the validity of the data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions; experienced with partitioned tables and automating partition drop/create in the Oracle database.
  • Involve in migrating the ETL application from development environment to testing environment.
  • Perform data validation in the target tables using complex SQLs to make sure all the modules are integrated correctly.
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
  • Perform Data Conversion/Data migration using Informatica PowerCenter.
  • Involve in performance tuning for better data migration process.
  • Analyze Session log files to resolve error in mapping and identified bottlenecks and tuned them for optimal performance.
  • Create UNIX shell scripts for Informatica pre/post session operations.
  • Automated the jobs using CA7 Scheduler.
  • Worked on Direct Connect process to transfer the files between servers.
  • Document and present the production/support documents for the components developed, when handing-over the application to the production support team.
  • Worked with XML targets for the data coming from SQL server source.
  • Performed query tuning and used SQL query overrides in the Source Qualifier transformation to pull only historical data no earlier than a given date, i.e. change data capture (CDC).
  • Parameterized the whole process by using the parameter file for the variables.
  • Imported an XSD file to create the XML target and created the hierarchical-relationship and normalized views.
  • Implemented the logic by using HTTP transformation to query the web server.
  • Configure and setup a secure FTP connection to the vendor using the Informatica Managed File transfer software.
  • Created complex Shell scripts for various set of actions that would automate the process of executing the actions like validating the presence of indicator files.
  • Pushing the compressed and encrypted xml files and flat files generated to the external vendor using MFT.
  • Involved in Unit testing and system integration testing (SIT) of the projects.
  • Assist the team members with the mappings developed as part of knowledge transfer.
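
The indicator-file validation described above can be sketched as a small pre-load check. The directory and file names are placeholders, and the script simulates the vendor dropping the indicator file so the happy path can be shown end to end.

```shell
#!/bin/sh
# Sketch: validate the presence of an indicator (trigger) file before
# starting the load. IND_DIR and the file name are placeholder values.
IND_DIR="${IND_DIR:-/tmp/ind_demo}"
IND_FILE="$IND_DIR/customer_feed.done"

mkdir -p "$IND_DIR"
: > "$IND_FILE"   # simulate the vendor dropping the indicator file

if [ -f "$IND_FILE" ]; then
    echo "Indicator present - safe to start the load"
    STATUS=0
else
    echo "Indicator missing - aborting the load" >&2
    STATUS=1
fi
```

In a scheduled job, a non-zero STATUS would stop the downstream workflow from being started.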

Environment: Informatica PowerCenter 8.6.1/8.1.1, Informatica IDQ 9.5.1, MDM 9.5.1, Windows Server 2008, MS SQL Server 2005, Batch Scripting, Perl Scripting, XML Targets, Flat Files, Tidal 5.3.1, UNIX, IDQ (IDE)

Confidential, Auburn Hills, MI

ETL Developer/Analyst

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at source qualifier level to avoid Informatica joiners and Look-ups to improve the performance as the volume of the data was heavy.
  • Responsible for creating workflows; created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data with PowerCenter from source systems such as flat files into staging tables and on into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
  • Analyzed the current system and programs and prepared gap analysis documents.
  • Experience in performance tuning and optimization of SQL statements using SQL trace.
  • Involved in Unit, System integration, User Acceptance Testing of Mapping.
  • Supported the process steps in the development, test and production environments.

Environment: Informatica Power Center 8.1.4/7.1.4, Oracle 10g/9i, TOAD, Business Objects 6.5/XI R2, UNIX, ClearCase

Confidential

ETL Developer

Responsibilities:

  • Development, testing, implementation and training of users.
  • Prepared ETL Specifications and design documents to help develop mappings.
  • Created mappings for Historical and Incremental Loads.
  • Used Version Control to check in and checkout versions of objects.
  • Tuned Source, Target, Mappings, Transformations and Sessions for better performance.
  • Supported daily loads and worked with business users to handle rejected data.
  • Prepared and maintained mapping specification documentation.
  • Used the Debugger to test mappings and fix bugs.
  • Used Mapping Variables and Mapping Parameters to fulfill the business requirements.
  • Implemented Type I and Type II Slowly Changing Dimensions to maintain historical information in dimension tables.
  • Involved in preparing technical and functional specifications.
  • Performance analysis and tuning of SQL statements of projects.
  • Imported data from different sources to Oracle tables using Oracle SQL*Loader
  • Wrote Oracle PL/SQL, stored procedures and triggers to populate data.
  • Involved in writing complex SQL statements for reports.
  • Worked closely with client to research and resolve user testing issues and bugs.
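
The SQL*Loader imports mentioned above typically revolve around a control file. This is a minimal sketch: the table, feed file and column names are illustrative placeholders, and the sqlldr invocation is echoed (dry run) rather than run against a database.

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a comma-delimited feed.
# Table (stg_customers), feed file and columns are placeholder examples.
CTL_FILE="customers.ctl"

# Quoted heredoc so nothing is expanded; this is literal control-file syntax.
cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id, cust_name, created_dt DATE 'YYYY-MM-DD')
EOF

# In production this would be executed with real credentials;
# echoed here so the command can be reviewed.
echo "sqlldr userid=\$ORA_USER/\$ORA_PWD control=$CTL_FILE log=customers.log"
```

APPEND keeps existing rows; TRUNCATE or REPLACE could be used instead when the staging table should be reloaded from scratch.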

Environment: Informatica PowerCenter, ETL, UNIX, PL/SQL, TOAD, Oracle 8i, SQL, SQL*Loader
