
ETL IDQ Developer Resume

Eagan, MN

SUMMARY

  • Astute IT professional with 8+ years of specialized experience as an ETL developer across HRMS, Financial, Banking, and Retail domains, following the full SDLC process of analysis, design, coding, testing, and deployment.
  • Extensive experience with the ETL tool Informatica 9.x/8.x in designing and developing complex Mappings, Mapplets, Transformations, Workflows, and Worklets, configuring and installing the Informatica PowerCenter tool, and scheduling workflows and sessions.
  • 7+ years of extensive database experience using Oracle 11gR2/11g/10gR2/10g/9i, DB2 9.x/8.x, MS Access 2007, SQL Server 2012/2008R2/2008, and MariaDB, with strong experience on database interfaces and utilities such as Oracle SQL Developer, PL/SQL Developer, TOAD, SQL*Plus, Oracle SQL*Loader, Import/Export, Data Pump, SecureFX, Secure CRT, WinSCP & PuTTY.
  • Well experienced with IBM AIX UNIX V7.x/V6.x/V5.x and Red Hat Linux (RHEL) 6.x/5.x operating systems.
  • Involved in requirement gathering, data analysis, ETL design and development, and preparation of technical design documents.
  • Involved in logical and physical data modeling of the data warehouse using the Erwin 9.x/7.x data modeling tool.
  • Good Understanding of Data warehouse concepts and principles - Star Schema, Snowflake and Operational Data store (ODS).
  • Expert-level mastery in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables & legacy system files.
  • Extensive experience in designing and developing complex mappings using varied transformation logic such as Connected and Unconnected Lookups, Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and Sorter.
  • Extracted data from multiple operational sources for loading the staging area, Data Warehouse, and data marts using CDC and SCD (Type 1/Type 2) loads.
  • Solid experience in writing SQL queries, Oracle PL/SQL stored procedures, functions, cursors, indexes, and triggers, as well as SQR, and in query optimization.
  • Set up the data mart on an Oracle database by running the SQL scripts generated from the ERWIN designer.
  • Extensive experience with Informatica IDQ 10.x/9.x for data profiling, data enrichment, and standardization.
  • Involved in analysis, design, development, customizations, implementation and support of software applications in Master Data Management (MDM) 9.x.
  • Handling all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Well experienced with UNIX shell scripting, crontab & Perl.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Extensive experience in performance tuning of targets, sources, transformations, mappings, and sessions, coordinating with DBAs and UNIX administrators for ETL tuning.
  • Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, Session partitioning, Load strategies, commit intervals and transformation tuning.
  • Extensive knowledge of SAP and of extracting data from legacy systems into SAP.
  • Knowledge on enterprise wide business management systems such as SAP ERP and PeopleSoft HRMS System.
  • Good experience working with scheduling tools such as Autosys 11.x, with knowledge of Control-M and Tidal.
  • Expertise in converting HRMS data from legacy system to a new PeopleSoft environment.
  • Excellent Administration experience, have handled the conversion projects using Informatica.
  • Knowledge of PeopleSoft 9.x/8.x, PeopleTools 8.x, Data Mover, Component Interface, and SQR 6.x.
  • Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
  • Knowledge on Data Processing, Data Cleansing, Data Validation and Data Archiving Process.
  • Involved in 24/7 production support.
  • Demonstrated ability to quickly grasp new concepts, both technical and business related and utilize as needed.
  • Extensive experience in offshore coordination, task allocation, and client reporting.
  • Excellent Communication and interpersonal skills.
  • Flexible and versatile in adapting to new environments; hardworking and self-motivated.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.x/8.x (Designer, Workflow Manager, Workflow Monitor, Informatica Admin Console & PowerConnect), Informatica IDQ 10.x/9.6.x/9.0.x/8.x, Informatica MDM 9.x, OLAP & OLTP.

Dimensional Data Modeling: Star Schema Modeling, Snowflake Modeling, SCD Type 1, SCD Type 2 & Type 3, Fact and Dimension Tables, Physical and Logical Data Modeling, ERWIN Data Modeling tool 9.x/7.x.

RDBMS: Oracle 11gR2/11g/10gR2/10g/9i, DB2 9.x/8.x, SQL Server 2012/2008R2/2008/2005, MS Access2007.

Languages: SQR, SQL, PL/SQL, Shell Scripting, Crontab & Perl.

Scheduling: Automic (UC4) 9.x, Autosys 11.x, Control-M & Tidal.

ERP Tools/Data Migration Tools: PeopleSoft Application Designer, Data Mover, Component Interface, Quest Central, Toad, SecureFX, Secure CRT & PL/SQL Developer.

Reporting: SSRS, Crystal Reports & Cognos 11.x.

Environments: IBM AIX UNIX V7.x/V6.x/V5.x, Red Hat Linux (RHEL) 6.x/5.x & Windows 7/Vista/XP.

PROFESSIONAL EXPERIENCE

Confidential, Eagan, MN

ETL IDQ Developer

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the MDM HUB.
  • Involved in various knowledge transfers to understand the business and application programs and document them for internal team referencing.
  • Design database table structures for transactional and reference data sources.
  • Involved in analysis of source, target systems and data modeling sessions, to design and build/append appropriate data mart models to support the reporting needs of applications.
  • Extracted data from heterogeneous sources like DB2, Oracle, SQL Server, Fixed Width and Delimited Flat Files and transformed into a harmonized data store.
  • Worked with the IDQ and MDM IDD tools for address profiling, data enrichment, and standardization for countries such as the US, Korea, Japan, and India.
  • Involved in address profiling and data quality analysis of customer-supplied data to find anomalies and highlight the effectiveness of the tool (IDQ) for that purpose. Also involved in resolving AddressDoctor and AddressDoctor mapping issues.
  • Designed and developed Mappings in IDQ and MDM for loading MDM HUB.
  • Extensively used IDQ Transformations like Address Validator, Aggregator, Association, Case converter, Comparison, Consolidation, Custom Data, Decision, Expression, Filter, Java, Joiner, Labeler, Lookup, Match, Merge Transformations, Parser, Sorter, SQL Transformation, Standardizer, Union, Weighted Average Transformations to design and develop mappings.
  • Created mappings & mapplets for data cleansing in IDQ.
  • Developed reusable transformations and mapplets.
  • Created IDQ mapplets to do address validation on customer data by using various transformations.
  • Used the Address Validator transformation for validating customer addresses from various countries.
  • Involved in cleansing and extraction of data, defined the quality process using IDQ, and created dictionaries in Informatica Data Quality (IDQ) that were used to cleanse and standardize data.
  • Extensively used IDQ, which helped in standardization and address validation for various countries.
  • Worked on Oracle, SQL Server, and MariaDB connections with Informatica.
  • Used Match and Consolidator transformation in IDQ, which helped in reducing the duplicates.
  • Created PowerCenter mappings to load data directly from EDW into the PREMDM databases, and from templates into the PREMDM DB2 databases.
  • Worked on Data Cleansing, Data Analysis and Monitoring Using Data Quality.
  • Extracted data from different source systems via the ETL tool into Oracle tables, transformed it into the landing tables, and created base objects and staging tables.
  • Created mappings to move data from landing tables to staging tables and from staging tables to base objects, applying Trust and Validation rules.
  • Created jobs and implemented cleanse functions and the match and merge process.
  • Involved in match merging process by using exact match and fuzzy match logic types.
  • Extensively used Transformations like Source Qualifier, Expression, Filter, Aggregator, joiner, Look up, Sequence Generator, Router, Sorter and Stored Procedures, Java Transformations to design and develop mappings.
  • Handling all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Involved in fixing of invalid Mappings using debugger and fixed the bugs.
  • Experience with Performance tuning, testing of Stored Procedures and Functions, testing of Informatica Sessions, Batches and the Target Data.
  • Developed slowly changing dimension (SCD) mappings for loading Dimensions and Facts.
  • Conducted and participated in process improvement discussions, recommending possible outcomes focused on production application stability and enhancements.
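The duplicate-reduction step described in the bullets above can be sketched, outside IDQ, as a simple key-based dedup over a delimited flat file. The file name and id|name|email column layout are hypothetical; IDQ's Match and Consolidation transformations additionally handle fuzzy matching and survivorship, which this sketch does not attempt.

```shell
#!/bin/sh
# Exact-key deduplication on a pipe-delimited customer extract
# (hypothetical layout: id|name|email).

cat > customers.txt <<'EOF'
1|John Smith|john@example.com
2|Jon Smith|john@example.com
3|Mary Jones|mary@example.com
EOF

# keep only the first record seen for each email key (field 3)
awk -F'|' '!seen[$3]++' customers.txt > customers_dedup.txt

wc -l < customers_dedup.txt   # two unique keys remain
```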

Environment & Tools Used: Informatica PowerCenter 9.5.x, IDQ 10.x/9.6.x, Informatica MDM 9.0.x, Oracle 11gR2/11g, SQL Server 2012/2008 R2, SSMS 2012/2008 R2, MariaDB, IBM DB2 9.x, Toad 3.8 Data Point, Erwin 9.x, RHEL 6.x, IBM AIX UNIX V7.x, Automic (UC4) 9.x, SSRS & BI, Hadoop 1.3.x, WinSCP/Secure CRT 7.2.x, MPutty and Quality Center (ALM) 12.x.

Confidential, Minneapolis, MN

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in Requirement gathering and data analysis of source, target and dependent systems.
  • Involved in Logical and Physical data modeling of the data warehouse.
  • Extensively developed Data Warehouse tables for various real-time scenarios that would arise in an OLTP system.
  • Involved in reviewing the Data Mapping Specification Document to create and execute detailed mappings. The data mapping specification document specifies what data will be extracted, transformed, and loaded into the Data Warehouse.
  • ETL design, development & testing of Informatica processes, mappings & workflows.
  • Responsible for developing the ETL process based on the Data Mapping Specification Document by writing complex SQL queries.
  • Developed complex Informatica mappings to transform and load the data from various source systems and developed shell scripts.
  • Implemented various Informatica transformations such as Sorter, Filter, Router, Expression, Aggregator, Rank, Update Strategy, Joiner, Lookup, Sequence Generator, Transaction Control, and Source Qualifier.
  • Implemented slowly changing dimension conversions (SCD Type 1 & 2).
  • Managing ETL operations: Job-stream definition and management, parameters, scheduling, monitoring, communication and alerting.
  • Maintained sources, target definitions and all Informatica objects using Informatica Repository Manager.
  • Extensively used IDQ, which helped in debugging and reduced development time.
  • Used Match and Consolidator transformation in IDQ which helped in reducing the duplicates.
  • Created several sessions and workflows in IDQ, which were then deployed in PowerCenter.
  • Using the IDQ Analyst tool, performed and validated exception handling and reporting.
  • Design of ETL audit, balance and control framework and Error handling framework of the project.
  • Performed implementation and unit testing of the build, pre-production and post-implementation activities, planned the production deployment, and verified mapping functionality.
  • Performance Tuning of production mappings and Informatica jobs and worked on Error Handling.
  • Responsible for migrating code to various environments.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Involved in report generation from data marts and provided assistance to downstream reporting teams and business users.
  • Compared and validated Reports Generation with Query Outputs using SQL Scripts, SSRS & Crystal Reports.
  • Applied relevant software engineering processes (reviews, testing, defect management, configuration management, and release management) along with service delivery and service support processes.
  • Preparation of test plans, support SIT & UAT testing cycles and resolving defects.
  • Onsite-offshore coordination and scheduling status calls about development & testing effort.
  • Participated in defect review meetings with the team members and improve data quality.
  • Preparation of Weekly status reports, Monthly and Quarterly reports for the management.
  • Transitioned knowledge to new resources and performed assessments.
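The audit, balance and control framework mentioned above typically reduces to row-count reconciliation between source and target after each load. A minimal shell sketch, with hard-coded counts standing in for the real SQL COUNT(*) queries against the source and target schemas:

```shell
#!/bin/sh
# ETL balance check: compare source and target row counts after a load
# and fail the job on a mismatch so the scheduler can alert.

SRC_COUNT=1048576   # stand-in for the source extract count query
TGT_COUNT=1048576   # stand-in for the target table count query

if [ "$SRC_COUNT" -eq "$TGT_COUNT" ]; then
    echo "BALANCED: $SRC_COUNT rows" > balance_check.out
else
    echo "MISMATCH: source=$SRC_COUNT target=$TGT_COUNT" > balance_check.out
    exit 1
fi
```

Wiring the non-zero exit status into the scheduler is what turns this from a report into a control: a mismatched load halts downstream jobs instead of silently propagating bad data.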

Environment & Tools Used: Informatica 9.5.x, IDQ 9.6.x, Oracle 11gR2/11g, SQL Server 2012/2008 R2, SSMS 2012/2008 R2, IBM DB2 9.x, Oracle SQL Developer 3.2, Erwin 9.x, RHEL 6.x, IBM AIX UNIX V7.x, Autosys 11.x, SSRS & Crystal Reports, Crontab, WinSCP/Secure CRT 7.2.x, Quality Center 11.x, Microsoft Visio 2013 and Windows 10.

Confidential, Chicago, IL

ETL Informatica Developer

Responsibilities:

  • Gathering business, functional requirements and data analysis of source, target and dependent systems.
  • Involved in designing ETL Logical and Physical data modeling.
  • Designed the ETL processes and prepared technical design documents.
  • Prepared the Data Mapping Document, ETL standards document, and deployment document, and was involved in Informatica processes, mappings & workflows.
  • Developed Informatica mappings and PL/SQL procedures.
  • Developed various mappings using the Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, XML Source Qualifier, SCD Type 1, SCD Type 2 & SCD Type 3 conversions, and Sequence Generator transformations.
  • Created complex mappings to extract data from different flat files, involving Slowly Changing Dimensions to implement business logic and capture records deleted in the source systems.
  • Developed SQL scripts to load history and incremental data.
  • Designed and developed generic scripts which support requirement changes with minimal design changes and minimal coding.
  • Understood the components of a data quality plan in IDQ and made informed choices between source data cleansing and target data cleansing.
  • Established the rules for certifying the quality of data during development in IDQ.
  • Designed and developed UNIX shell scripts for automation.
  • Scheduling Informatica jobs using Autosys.
  • Involved in Report generation out of the data marts and provided assistance to downstream reporting teams and business users.
  • Compared and validated Reports Generation with Cognos.
  • Creating incident management tickets for coordinating with Informatica admins, DBAs in setting up project environment and creation of database objects.
  • Responsible for best practices like naming conventions, Performance tuning and Error Handling.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level andSession Level.
  • Troubleshot P1 tickets following SLAs and prepared RCAs for production issues.
  • Involved in unit testing, pre-production activities and post-implementation shake out activities etc. of the project.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Worked with Session Logs and Workflow Logs for Error handling and troubleshooting.
  • Involved in System Integration testing and supported UAT/QA cycles of the project.
  • Performance tuning of production mappings and review of test cases and test results.
  • Providing support and coordination with Informatica admin in promoting or migrating the code, in maintaining the web services hub.
  • Responsible for migrating code from dev environment to QA and QA to production environment.
  • Prepared migration plan document to move the mappings from development to testing and then to production repositories.
  • Provided technical support to People Safe front-end application users on ad hoc technical queries.
  • Identification of jobs and holding job streams that get impacted due to planned or unplanned weekly maintenance activities.
  • Involved in allocating and coordinating offshore development, testing effort.
  • Preparation of test plans, support SIT & UAT testing.
  • Plan Production deployment and coordinate the same.
  • Attended to Production support issues and Involved in supporting jobs in production environment.
  • Prepared the daily/weekly/monthly reports used for data quality check automation (top table usage) and the reports sent to the clients (issue log/report and production box comparison report).
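The Autosys-scheduled jobs above are usually driven through a thin wrapper script that logs start/finish and returns a non-zero exit code on failure so the scheduler can alert. A minimal sketch, where `run_etl_step` is a placeholder for the real command (for example a pmcmd call):

```shell
#!/bin/sh
# Wrapper of the kind an Autosys (or cron) job would invoke: run one ETL
# step, log it with timestamps, and propagate failure via the exit code.

LOG=etl_job.log
: > "$LOG"                         # fresh log for this run

run_etl_step() {
    echo "loading staging tables"  # placeholder for the real work
}

echo "$(date '+%Y-%m-%d %H:%M:%S') job start" >> "$LOG"
if run_etl_step >> "$LOG" 2>&1; then
    echo "$(date '+%Y-%m-%d %H:%M:%S') job SUCCESS" >> "$LOG"
else
    echo "$(date '+%Y-%m-%d %H:%M:%S') job FAILURE" >> "$LOG"
    exit 1
fi
```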

Environment & Tools Used: Informatica PowerCenter 9.0.x, IDQ 9.0.x, Oracle 11gR2/11g, Toad 9.7.x, SQL Server 2008 R2, IBM DB2 8.x, SQL Developer 2.x, Erwin 7.x, RHEL 5.x/6.x, AIX V6.x, PL/SQL, Autosys 11.x, Cognos 10.x, Crontab, Secure CRT 7.0.x/WinSCP, Quality Center 11.x & QTP 10.x.

Confidential, Charlotte, NC

ETL Developer

Responsibilities:

  • Requirement gathering & data analysis of source, target systems and creation of low level design for Informatica mappings, workflows.
  • Communication with client personnel regarding business requirements and ongoing changes.
  • Develop technical specification based on Business requirements and obtain sign-off.
  • Handled various maintenance requests & production problems for the Data Warehouse, Learning & Recruiting applications.
  • Involved in Informatica PowerCenter Client 8.6.x installation and configuration.
  • Implemented mappings in Informatica PowerCenter with PowerExchange tables and relational tables as sources, loading data into relational target tables using real-time change data capture.
  • Implemented various Informatica transformations such as Sorter, Filter, Router, Expression, Aggregator, Rank, Update Strategy, Joiner, Lookup, Sequence Generator, Transaction Control, and Source Qualifier.
  • Development of UNIX shell scripts to run workflows, to maintain dependency between real time workflows and batch workflows.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation to populate the desired target records.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Created shell scripts for Autosys jobs.
  • Creation and execution of unit test cases.
  • Involved in development (Informatica, SQR, UNIX, SQL*Loader & Import/Export) and unit testing.
  • Involved in migration requests for the SIT/UAT/Production environments.
  • Involved in production support - performance tuning, troubleshooting problems, root cause analysis of production issues.
  • Interacting with client in status meetings, functional discussions & production issues.
  • Involved in UAT release and support. Interacting with business users to address UAT issues.
  • Peer review of LLDs, mappings, workflows, sessions etc.
  • Pre-production release activities like preparing transmittal, review check lists etc.
  • Supported in the database upgrade from 10g to 11g.
  • Involved in allocating and coordinating offshore development, testing effort.
  • Supported SIT testing & coordinated with UAT testers.
  • Perform Post Implementation support.
  • Involved in the review of deliverables and coordinating tasks.
  • 24X7 production support for Data warehouse, Learning & recruiting applications.
  • Knowledge transition to production support team and induction to new joiners.
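The SQL*Loader work mentioned above centers on a control file describing the incoming flat file and target table. A sketch that generates one for a comma-delimited staging load; the table and column names are hypothetical, and the actual `sqlldr userid=... control=stg_employee.ctl` run is omitted since it needs a live Oracle client:

```shell
#!/bin/sh
# Generate a SQL*Loader control file for a delimited staging load
# (hypothetical STG_EMPLOYEE table and columns).

cat > stg_employee.ctl <<'EOF'
LOAD DATA
INFILE 'emp_extract.dat'
APPEND
INTO TABLE STG_EMPLOYEE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(EMP_ID, EMP_NAME, DEPT_ID, HIRE_DATE DATE 'YYYY-MM-DD')
EOF
```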

Environment & Tools Used: Informatica 8.6.x, Oracle 11g/10g, Toad 9.7.x, PL/SQL Developer 7.x, SQL Server 2008, SSMS 2008, RHEL 5.x, AIX V6.x/V5.x, Autosys 11.x, PeopleSoft 9.x, SQR 9.x, Crontab, SQL*Loader, PuTTY, WinSCP and Quality Center 9.x.

Confidential

ETL Developer

Responsibilities:

  • Analyzed Business Requirements document and Functional Design document.
  • Involved in analysis, design & development of the project.
  • Understand the client requirements for data transformation.
  • Developed technical specification based on Business requirements.
  • Creation and execution of unit test cases and unit testing of Informatica mappings.
  • Involved in the development of Informatica Mappings for conversion and interfacing.
  • High level and Low level design of Informatica components and Development of mappings, sessions, workflows etc.
  • Implemented various Informatica transformations such as Expression, Filter, Aggregator, Router, Rank, Update Strategy, Joiner, Lookup, Sequence Generator, and Source Qualifier.
  • Used Informatica Workflow Manager to create, execute and Monitor Sessions, Batches and Workflows.
  • Peer review of mappings, workflows, sessions etc. and preparation of review checklists for maintaining the production standards.
  • Worked on SCD Type 1 & Type 2 conversions.
  • Optimized the existing applications at the mapping level, session level and database level for a better performance.
  • Developed shell scripts for scheduling/running ETL jobs using pmcmd command.
  • Archive the data files generated after successful completions of ETL jobs using UNIX shell scripting.
  • Involved in Migration of code to different environments.
  • Preparation of impact analysis based on data analysis of source and target systems.
  • Optimized the mappings using various optimization techniques and debugged existing mappings using the Debugger to test and fix them.
  • Involved in the Performance Tuning of Informatica and databases.
  • Involved in production support - root cause analysis of production issues, bug fixing, and enhancement of mappings.
  • Generated completion messages and status reports using Workflow Manager.
  • Provided business user support in generation of reports and analyzing report data.
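The pmcmd-based scheduling scripts mentioned above follow a common shape: start the workflow, wait for it, and branch on the exit status. A hedged sketch in which pmcmd is stubbed (it requires a live PowerCenter install) and the service, domain, folder, and workflow names are placeholders:

```shell
#!/bin/sh
# Run an Informatica workflow via pmcmd and record the outcome.

pmcmd() {                          # stub standing in for the real binary
    echo "pmcmd $1: requested"
    return 0
}

WORKFLOW=wf_load_sales

if pmcmd startworkflow -sv Int_Svc -d Domain_Dev -u etl_user -p "$INFA_PWD" \
       -f SalesFolder -wait "$WORKFLOW" > pmcmd.out 2>&1; then
    echo "$WORKFLOW completed" >> pmcmd.out
else
    echo "$WORKFLOW failed" >> pmcmd.out
fi
```

With the `-wait` flag, pmcmd blocks until the workflow finishes, so the script's exit branch reflects the actual load outcome rather than just the submission.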

Environment & Tools Used: Informatica PowerCenter 8.6.x, Oracle 11g/10g/9i, SQL Server 2008, MS Access 2007, RHEL 5.x, AIX V5.x, Quest Central 3.x, SecureFX 4.5.x, PuTTY, WinSCP, Quality Center 9.x and Windows XP/Vista.
