
Sr. Informatica/IDQ Developer Resume


Oklahoma City, OK

SUMMARY

  • 8+ years of extensive experience with Informatica Power Center in all phases of analysis, design, development, implementation, and support of data warehousing applications using Informatica Power Center 10.x/9.x/8.x/7.x, Power Exchange, IDQ, Informatica Developer, IDE, MDM, SSIS, OBIEE, QlikView, etc.
  • Extensive experience using tools and technologies such as Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, DB2 UDB, Sybase, Teradata 13/12/V2R5/V2R4, Netezza, MS Access, SQL, PL/SQL, SQL*Plus, Sun Solaris 2.x, Erwin 4.0/3.5, SQL*Loader, TOAD, stored procedures, and triggers.
  • Strong data modeling experience using Star/Snowflake schemas, re-engineering, dimensional data modeling, fact & dimension tables, and physical & logical data modeling.
  • Experienced in all stages of the software development lifecycle (Waterfall and Agile models) for building a data warehouse.
  • Involved in troubleshooting data warehouse bottlenecks and performance tuning: session and mapping tuning, session partitioning, and implementing pushdown optimization.
  • Extensively worked on various transformations like Lookup, Joiner, Router, Rank, Sorter, Aggregator, Expression, etc.
  • Strong understanding of data warehousing, data architecture, data modeling, data analysis, SDLC methods, and GUI applications.
  • Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
  • Experience in creating pre-session and post-session scripts to ensure timely, accurate processing and balancing of job runs (a minimal sketch follows this summary).
  • Experience in transforming and validating data using SSIS transformations like Conditional Split, Lookup, Merge Join, Sort, and Derived Column for unstructured and redundant data.
  • Hands-on experience in creating jobs and alerts and setting up the SQL Server mail agent for SSIS packages.
  • Expertise in installing and managing Informatica MDM, Metadata Manager, Informatica Data Quality (IDQ), and Informatica Power Center.
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 10.
  • Hands-on experience with Informatica Data Explorer (IDE) / Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
  • Experience in Informatica MDM Installation, Upgrade, Bugfix and Migration of data.
  • Experience with MDM Hub configurations - Data modeling & Data Mappings, Data validation, cleansing, Match and Merge rules.
  • Experience with Teradata utilities like FastLoad, FastExport, MultiLoad, TPump, and TPT, and in creating Basic Teradata Query (BTEQ) scripts.
  • Developed and supported the Extraction, Transformation and Load process (ETL) for a data warehouse from various data sources using DataStage Designer.
  • Developed DataStage parallel jobs using various processing stages like Join, Connector, Copy, Merge, Aggregator, Filter, Lookup, Transformer, Modify, Funnel, Change Capture, Row Generator, and Column Generator stages.
  • Experience in integration of various data sources like SQL Server, Oracle, Teradata, Flat files, DB2.
  • Experience in complex PL/SQL packages, functions, cursors, triggers, views, materialized views, T-SQL, DTS.
  • Experience in creating UNIX shell scripts and Perl scripts.
  • Knowledge in design and development of Business Intelligence reports using the BI tools Business Objects and Cognos, and knowledge of MicroStrategy.
  • Experienced in dealing with outsourced technical resources and coordinating global development efforts.
  • Excellent communication and interpersonal skills. Committed and Motivated team player with good Analytical skills.
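
A minimal sketch of the post-session balancing check mentioned above, written in Korn shell; the table name, count file, log path, and connection variables are illustrative assumptions, and the actual reconciliation rules vary by project.

```sh
#!/bin/ksh
# post_session_balance.ksh -- hypothetical post-session balancing check:
# compares the target row count against the count captured by the pre-session
# extract step and fails the job on a mismatch.

SRC_COUNT_FILE=/etl/counts/src_count.dat     # written by the pre-session script
TGT_TABLE=EDW.CUSTOMER_DIM                   # illustrative target table
LOG=/etl/logs/balance_$(date +%Y%m%d).log

SRC_COUNT=$(cat "$SRC_COUNT_FILE")

# Silent SQL*Plus call; credentials are taken from the job environment.
TGT_COUNT=$(sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" <<EOF
set heading off feedback off pagesize 0
select count(*) from $TGT_TABLE where load_date = trunc(sysdate);
exit;
EOF
)
TGT_COUNT=$(echo "$TGT_COUNT" | tr -d '[:space:]')

echo "$(date): source=$SRC_COUNT target=$TGT_COUNT" >> "$LOG"

if [ "$SRC_COUNT" -ne "$TGT_COUNT" ]; then
    echo "Balancing failed for $TGT_TABLE" >> "$LOG"
    exit 1    # non-zero exit makes the post-session command task fail the workflow
fi
exit 0
```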

TECHNICAL SKILLS

ETL Tools: Informatica Power Center, Power Exchange 10.1/9.6/9.5/8.6/7.1, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), Informatica MDM, Informatica Data Services (IDS), SSIS, DataStage, Salesforce, etc.

Databases: Oracle 11g/10g, MS SQL Server 2008/2005/2000, MS Access, IBM DB2, Teradata V12, Siebel CRM, PeopleSoft.

Reporting Tools: QlikView, OBIEE, Tableau, MicroStrategy, Oracle Analytics, Business Objects XI R2/XI 3.3, etc.

DB Tools: Oracle 11g/10g/9i, SQL Server 2008/2005, IBM DB2, Teradata 13.1/V2R5/V2R6, Sybase, MS Access.

Languages: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash, Perl, Python.

Operating Systems: UNIX, Windows 7/Vista/Server 2003/XP/2000/9x/NT/DOS.

Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.

Scheduling Tools: Control-M, CA7 Schedule, CA ESP Workstation, Autosys, Informatica Scheduler.

PROFESSIONAL EXPERIENCE

Confidential, Oklahoma City, OK

Sr. Informatica/IDQ Developer

Responsibilities:

  • Involved in Business Analysis and Requirements collection.
  • Developed complex mappings by efficiently using various transformations, Mapplets, mapping parameters/variables, and Mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL, and Web Service transformations.
  • Involved in extracting addresses from multiple heterogeneous sources like flat files, Oracle, SAS, and SQL Server.
  • Extensively used Power Center/Power Mart capabilities such as target override and connected, unconnected, and persistent lookups.
  • Used Informatica MDM 10.1 tool to manage Master data of EDW.
  • Extracted consolidated golden records from MDM base objects and loaded into downstream applications.
  • Validated data quality and business rules using the mapping document and FSD to maintain data integrity.
  • Extensively worked on UNIX shell scripts for server health-check monitoring such as repository backup, CPU/disk space utilization, Informatica server monitoring, and UNIX file system maintenance/cleanup, and on scripts using Informatica command line utilities (see the sketch after this list).
  • Worked with team to convert Trillium process into Informatica IDQ objects.
  • Extensively involved in ETL testing; created unit test plans and integration test plans to test the mappings, and created test data. Used debugging tools to resolve problems and created reference tables to standardize data.
  • Experience in writing SQL test cases for Data quality validation.
  • Experienced in extracting, transforming, and loading data from various data sources like Excel, flat files, and Oracle to SQL Server databases using SQL Server Integration Services (SSIS).
  • Performed all SDLC phases related to extract, transform, and load (ETL) processes using SQL Server Integration Services (SSIS) and T-SQL stored procedures within a SQL Server 2012 environment.
  • Resolved many issues by applying EBFs and patches for issues related to Informatica load failures.
  • Migrated Informatica mappings/sessions/workflows and UNIX scripts to QA, UAT, and Prod environments using the Harvest and SVN tools.
  • Configured Connection Manager files for SSIS packages to dynamically execute on Quality Analysis server and Production server. Performed Full load & Incremental load with several Data flow tasks and Control Flow Tasks using SSIS.
  • Worked on Master Data Management (MDM), Hub Configurations (SIF), Data Director, extract, transform, cleansing, loading the data onto the tables.
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
  • Experience in end to end Data quality testing and support in enterprise warehouse environment.
  • Experience in maintaining Data Quality, Data consistency and Data accuracy for Data Quality projects.
  • Provided production support to schedule and execute production batch jobs and analyzed log files in Informatica 9.1 Integration servers.
  • Involved in daily status call with onsite Project Managers, DQ developers to update the test status and defects.
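
A condensed sketch of the health-check script referenced in the UNIX shell bullet above; the domain, service, and repository names, paths, threshold, and mail address are assumptions, and pmcmd/pmrep options can vary slightly between Power Center versions.

```sh
#!/bin/ksh
# infa_health_check.ksh -- hypothetical nightly health check: pings the
# Integration Service, checks disk utilization, and backs up the repository.

DOMAIN=Domain_ETL            # assumed domain name
INT_SVC=IS_ETL_PROD          # assumed Integration Service name
REPO=REP_ETL_PROD            # assumed repository name
BACKUP_DIR=/infa/backups
THRESHOLD=85                 # % disk usage that triggers an alert

# 1. Ping the Integration Service; alert if it is unreachable.
pmcmd pingservice -sv "$INT_SVC" -d "$DOMAIN" || \
    echo "Integration Service $INT_SVC not responding" | \
    mailx -s "INFA ALERT" etl_support@example.com

# 2. Check disk utilization on the Informatica file system.
USED=$(df -k /infa | awk 'NR==2 {gsub("%",""); print $5}')
if [ "$USED" -gt "$THRESHOLD" ]; then
    echo "/infa is ${USED}% full" | mailx -s "INFA DISK ALERT" etl_support@example.com
fi

# 3. Repository backup via pmrep (credentials come from the environment).
pmrep connect -r "$REPO" -d "$DOMAIN" -n "$INFA_USER" -x "$INFA_PWD"
pmrep backup -o "$BACKUP_DIR/${REPO}_$(date +%Y%m%d).rep" -f
```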

Environment: Informatica Power Center 10.0/9.6, Power Exchange, Informatica Data Quality 9.6.1, IDQ (IDE), Informatica MDM 10.1, UNIX, Windows NT/2000/XP, Windows Server, SQL Server 2008, SSIS, OBIEE, DB2, Control tool, SVN, CA ESP Workstation Scheduler.

Confidential, New Jersey, NJ

Sr. Informatica/IDQ MDM Developer

Responsibilities:

  • Developed the ETL components as well as Oracle procedures, functions, and triggers (a sketch follows this list).
  • Configured Audit Trails and Delta Detection in Mapping Settings to load RAW and PRL tables.
  • Created Stage, Load, Match, and Merge jobs to load the source system data into staging with the help of mappings and to load the base objects.
  • Defined Trust and validation rules for the base tables.
  • Coordinated with the business team to help them understand Match & Merge and incorporated their requirements.
  • Created IDD/HM/SIF queries, packages (PUT & Display), and search queries (custom queries).
  • Created match rule sets for the base objects by defining the match path components, match columns, and rules.
  • Extensively tested the Address Doctor files and updated them with the new monthly release files from the Informatica Address Doctor site.
  • Experienced in creating Tableau Desktop reports, data blending, dual axis, and publishing in server by providing the respective permissions, adjusting the report specifications in higher environments as per business users.
  • Responsible for creating Logical Data Objects (LDO) and profiling multiple sources in Informatica Developer and Analyst.
  • Experienced in creating IDQ mappings using Labeler, Standardizer, Address Validator transformations with Informatica Developer and migrated to Informatica Power Center.
  • Responsible for using Data Integration Hub (DIH) to create topics and applications to publish and subscribe data.
  • Used Informatica Data Director (IDD), the data governance application for the Informatica MDM Hub, to enable business users to effectively create, manage, consume, and monitor master data.
  • Developed numerous mappings using various transformations including Address Doctor, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge, Parser, etc.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level by implementing Push Down Optimization, Session level Partition, Ordering the data in Lookup/Join queries.
  • Created CA7 jobs to schedule the ETL jobs using Endeavor.
  • Reviewed scope and solution documents with the business team on an immediate basis.
  • Sought sign-offs and approvals on scope and solution documents; all documents had to be signed by the IT and business management teams.
  • Kept the solution and approach transparent to the business and IT management teams; no solution was implemented without approval from business and IT management.
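
A small sketch of how an Oracle ETL procedure of the kind mentioned in the first bullet can be invoked from the batch layer; the package, procedure, parameter, log path, and mail address are hypothetical.

```sh
#!/bin/ksh
# run_stage_proc.ksh -- hypothetical wrapper that executes an Oracle ETL procedure
# and fails the calling job if the procedure raises an error.

LOG=/etl/logs/stage_proc_$(date +%Y%m%d_%H%M%S).log

sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" > "$LOG" <<'EOF'
whenever sqlerror exit sql.sqlcode
set serveroutput on
begin
   -- hypothetical procedure that moves the day's batch from RAW into staging
   stg_pkg.load_customer_stage(p_batch_id => to_number(to_char(sysdate, 'YYYYMMDD')));
   dbms_output.put_line('load_customer_stage completed');
end;
/
exit;
EOF

RC=$?
if [ $RC -ne 0 ]; then
    echo "Oracle procedure failed, see $LOG" | \
        mailx -s "ETL STAGE LOAD FAILED" etl_support@example.com
    exit $RC
fi
```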

Environment: Informatica Power Center 9.6.1/9.5, Information Server (DataStage) 8.5 (Designer, Manager, Director, Administrator), Informatica Data Quality 9.6.1, Informatica MDM 9.6, Oracle 10g, MS Access, Windows NT/2000/XP, SQL Server 2008, UNIX Shell Scripts.

Confidential, East Norriton, PA

Sr. ETL/Teradata Developer

Responsibilities:

  • Documented high and low-level design document specifications for source-target mapping, based on the transformation rules.
  • Documented technical requirements for ETL process and Design documents for each source. Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration.
  • Experience with handling incremental changes in the source systems to update the staging area and data warehouse, respectively.
  • Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
  • Interacted with the Business Personnel to analyze the business requirements and transform the business requirements into the technical requirements.
  • Prepared technical specifications for the development of Informatica (ETL) process to load data into various target tables
  • Develop logical and physical data models that capture current state/future state data elements and data flows using Erwin.
  • Used Erwin to reverse engineer and refine business data models
  • Administered and worked with various Informatica client tools like Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager, Workflow Manager and Workflow Monitor.
  • Managed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to customers in sales and marketing areas.
  • Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator etc.
  • Designed and Developed several mapplets and worklets for reusability.
  • Implemented CDC using Informatica Power Exchange.
  • Implemented weekly error tracking and correction process using Informatica.
  • Implemented audit process to ensure Data warehouse is matching with the source systems in all reporting perspectives.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Developed Teradata BTEQ scripts to load data from Teradata staging to the enterprise data warehouse (see the BTEQ sketch after this list).
  • Extensively used Stored Procedures, Functions and Packages using PL/SQL.
  • Worked on Teradata Global temporary and volatile tables.
  • Worked with Teradata DBA team to create secondary indexes required for performance tuning of Data mart loads and reports.
  • Created Maestro schedules/jobs for automation of the ETL load process.
  • Involved in Unit testing, User Acceptance testing to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
  • Conducted meetings for every deployment to make sure the job schedules and dependencies were developed in such a way that we did not miss the SLA on a day-to-day basis.
  • Actively involved in production support and transferred knowledge to the other team members. Created Business Objects functionality from the analytical database to the transactional data warehouse and created reports using Business Objects full client and Web Intelligence.
  • Extensively worked with Teradata in data Extraction, Transformation and loading from source to target system using BTEQ, Fast Load, and Multi Load.
  • Work on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Oracle.
  • Involved in migration projects to migrate data from data warehouses on Oracle/DB2/XML and migrated those to Teradata.
  • Involved in extracting, transforming, loading, and testing data from XML files, flat files, Oracle, and DB2 using DataStage jobs.
  • Used DataStage Manager for importing metadata from the repository, creating new job categories, and creating new data elements.
  • Involved in writing shell scripts for running DataStage jobs using CSV, Excel, and text (hierarchy) files.
  • Performed the Unit testing for jobs developed to ensure that it meets the requirements.
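
A trimmed-down sketch of the staging-to-EDW BTEQ pattern referenced above, run from a shell wrapper; database, table, and column names are illustrative, and the real scripts also handled restartability and error tables.

```sh
#!/bin/ksh
# load_edw_from_stage.ksh -- hypothetical BTEQ load from Teradata staging to the EDW
# using a volatile work table; names are illustrative only.

bteq <<EOF
.LOGON tdprod/${TD_USER},${TD_PWD};

/* Isolate the current batch in a volatile table */
CREATE VOLATILE TABLE vt_cust_delta AS (
    SELECT cust_id, cust_name, addr_line1, load_dt
    FROM   stg_db.stg_customer
    WHERE  load_dt = CURRENT_DATE
) WITH DATA PRIMARY INDEX (cust_id) ON COMMIT PRESERVE ROWS;

/* Update existing EDW rows, then insert the new ones */
UPDATE tgt
FROM edw_db.customer_dim tgt, vt_cust_delta src
SET cust_name  = src.cust_name,
    addr_line1 = src.addr_line1
WHERE tgt.cust_id = src.cust_id;

INSERT INTO edw_db.customer_dim (cust_id, cust_name, addr_line1, load_dt)
SELECT src.cust_id, src.cust_name, src.addr_line1, src.load_dt
FROM   vt_cust_delta src
WHERE  NOT EXISTS (SELECT 1 FROM edw_db.customer_dim t WHERE t.cust_id = src.cust_id);

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
```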

Environment: Informatica Power Center 9.0.1/8.6, Informatica Data Quality 8.6.1, IBM DataStage 8.5/8.1 - DataStage Designer, DataStage Director, DataStage Manager, DataStage Administrator, QualityStage, Oracle, PL/SQL, UNIX Shell Programming, Erwin, DB2.

Confidential, Omaha, NE

Sr. Informatica Developer

Responsibilities:

  • Designed the mapping document, which is a guideline for ETL coding; standards for naming conventions and best practices were followed in mapping development.
  • Extensively used various Informatica tasks like Decision, Command, Event Wait, Event Raise, Assignment, Timer, Control, Link, and Email.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, lookups, Stored procedure and data flow management into multiple targets using router transformations.
  • Optimized the SQL override to filter unwanted data and improve session performance.
  • Used Debugger to track the path of the source data & also to check the errors in mapping.
  • Prepared a unit testing document covering field-to-field validations and source-to-target counts.
  • Scheduled workflows comprising different sessions for their respective mappings in order to load data into the Oracle database.
  • Handled slowly changing dimensions of Type 1 and Type 2 to populate current and historical data to dimension and fact tables in the data warehouse (see the sketch after this list).
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Conducted Data Analysis, helped Business Leads in understanding and designing new reports.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
  • Automated job processing and established automatic email notification to the concerned persons.
  • Created an automated process to monitor the space in the data directory using Perl and Korn shell.
  • Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects like SCD Type 2 date chaining and duplicate cleanup.
  • Handled creation, modifications, and documentation of Oracle Packages, Procedures, Functions, and Indexes.
  • Assisted the other ETL Developers in resolving complex scenarios.
  • Involved in promoting the folders from Development to Test and Test to UAT and UAT to Production Environment.
  • Involved in different Team review, Time estimation and UAT meetings.
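
For illustration, the Type 2 expire-and-insert logic referenced above is sketched below in plain SQL run through SQL*Plus; in the project this logic lived in Informatica mappings (lookup plus update strategy), and the dimension, staging, and sequence names here are hypothetical.

```sh
#!/bin/ksh
# scd2_example.ksh -- illustrative only: the SCD Type 2 expire/insert pattern
# expressed in plain SQL; table, column, and sequence names are hypothetical.

sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" <<'EOF'
whenever sqlerror exit sql.sqlcode

-- 1. Expire the current dimension row when a tracked attribute has changed.
update customer_dim d
set    d.eff_end_dt  = trunc(sysdate) - 1,
       d.current_flag = 'N'
where  d.current_flag = 'Y'
and exists (select 1
            from   customer_stg s
            where  s.cust_id = d.cust_id
            and   (s.addr_line1 <> d.addr_line1 or s.cust_segment <> d.cust_segment));

-- 2. Insert the new version of changed rows plus brand-new customers.
insert into customer_dim (cust_key, cust_id, addr_line1, cust_segment,
                          eff_start_dt, eff_end_dt, current_flag)
select customer_dim_seq.nextval, s.cust_id, s.addr_line1, s.cust_segment,
       trunc(sysdate), date '9999-12-31', 'Y'
from   customer_stg s
where  not exists (select 1
                   from   customer_dim d
                   where  d.cust_id = s.cust_id
                   and    d.current_flag = 'Y');

commit;
exit;
EOF
```

Type 1 attributes would simply be overwritten in place rather than expiring the current row.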

Environment: Informatica Power Center 8.6, Oracle 11g, Toad, Autosys, PL/SQL, SQL*Plus, XML, Unix Shell Scripting.

Confidential, Atlanta, GA

Informatica Developer

Responsibilities:

  • Responsible for requirement definition and analysis in support of Data Warehousing efforts.
  • Developed ETL mappings, transformations using Informatica Power Center 8.6.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc.
  • Developed data Mappings between source systems and target system using Mapping Designer.
  • Responsible for Test and Production issue resolutions.
  • Developed shared folder architecture with reusable Mapplets and Transformations.
  • Extensively worked with the Debugger for handling the data errors in the mapping designer.
  • Created events and various tasks in the workflows using Workflow Manager.
  • Responsible for tuning ETL procedures to optimize load and query Performance.
  • Set up batches and sessions to schedule the loads at the required frequency using Informatica Workflow Manager and an external scheduler.
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
  • Took part in Informatica administration; migrated development mappings as well as hotfixes to the production environment.
  • Involved in writing shell scripts for file transfers and file renaming, and several other database scripts to be executed from UNIX (a sketch follows this list).
  • Migrated Informatica objects using deployment groups.
  • Troubleshot issues in TEST and PROD; performed impact analysis and fixed the issues.
  • Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.
  • Developed Unit test cases and Unit test plans to verify the data loading process and Used UNIX scripts for automating processes.
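
A minimal sketch of the kind of file-renaming/transfer script mentioned above; the directories, file pattern, and downstream host are assumptions.

```sh
#!/bin/ksh
# archive_flat_files.ksh -- hypothetical file-handling script: renames processed
# flat files with a timestamp, archives and compresses them, then pushes them on.

IN_DIR=/etl/inbound
ARCH_DIR=/etl/archive
STAMP=$(date +%Y%m%d_%H%M%S)

for f in "$IN_DIR"/*.dat; do
    [ -e "$f" ] || continue                      # nothing to do if no files landed
    base=$(basename "$f" .dat)
    mv "$f" "$ARCH_DIR/${base}_${STAMP}.dat"     # rename with a load timestamp
    gzip "$ARCH_DIR/${base}_${STAMP}.dat"        # compress to save space
done

# Optionally push the compressed files to a downstream host (hostname illustrative).
scp "$ARCH_DIR"/*_"${STAMP}".dat.gz etl@downstream-host:/landing/ 2>> /etl/logs/transfer.log
```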

Environment: Informatica Power Center 9.1/8.6, Power Exchange, Teradata, Access, UNIX, SQL Server 2008, SQL Assistant, DB2.

Confidential

Datawarehouse Developer

Responsibilities:

  • Involved in the design, development, and implementation of the Enterprise Data Warehousing (EDW) process.
  • Provided data warehouse expertise including data modeling, Extract, Transform and Load (ETL) analysis, design and development.
  • Hands-on Experience in working with Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets to extract, transform and load data.
  • Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Worked with various Active and Passive transformations like Source Qualifier, Sorter, Aggregator, Filter, Union, and Router Transformations, Sequence Generator and Update Strategy Transformations.
  • Handled versioning and dependencies in Informatica.
  • Developed schedules to automate the update processes and Informatica sessions and batches.
  • Resolving technical and design issues.
  • Developed data transformation processes, and maintained and updated loading processes.
  • Developed and implemented UNIX shell scripts for the start and stop procedures of the sessions (a sketch follows this list).
  • Used UNIX shell scripts to run the batches.
  • Developed standards and procedures to support quality development and testing of data warehouse processes.
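
A small sketch of a start/stop wrapper of the kind described above, built on the standard pmcmd commands; the service, domain, and credential variables are assumed to be exported by the batch environment, and exact options differ slightly across Power Center versions.

```sh
#!/bin/ksh
# wf_control.ksh -- hypothetical start/stop wrapper around pmcmd.
# Usage: wf_control.ksh start|stop <folder> <workflow>

ACTION=$1; FOLDER=$2; WORKFLOW=$3

case "$ACTION" in
  start)
      # -wait blocks until the workflow finishes so the scheduler sees the real exit code
      pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
            -u "$INFA_USER" -p "$INFA_PWD" -f "$FOLDER" -wait "$WORKFLOW"
      ;;
  stop)
      pmcmd stopworkflow -sv "$INT_SVC" -d "$DOMAIN" \
            -u "$INFA_USER" -p "$INFA_PWD" -f "$FOLDER" "$WORKFLOW"
      ;;
  *)
      echo "Usage: $0 start|stop <folder> <workflow>" >&2
      exit 2
      ;;
esac
exit $?
```

The external scheduler (Autosys in this environment) calls the wrapper so that workflow success or failure is reflected in the batch job's exit code.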

Environment: Informatica Power Center 8.5, Windows NT/2000/XP, SQL Server, Autosys, UNIX shell scripting.
