
Sr. Informatica ETL/IDQ Developer Resume


Derry Township, PA

SUMMARY

  • 10+ years in the IT industry, with substantial experience providing information management consulting services on Informatica PowerCenter, Informatica Data Quality, Data Integration, and Business Intelligence applications.
  • Developed applications to individual client standards.
  • Submitted weekly status reports to Application Development Managers.
  • Over 10 years of experience in Informatica PowerCenter and 4 years using Informatica Data Quality.
  • Extensive exposure in overall SDLC including requirement gathering, development, testing, debugging, deployment, documentation, production support.
  • Strong experience with Informatica tools using real-time CDC (change data capture) and MD5.
  • Experience in integration of various data sources like Oracle, Teradata, Netezza, Mainframes, SQL server, XML and Flat files.
  • Very strong in Data Warehousing concepts such as Type I, II, and III dimensions, facts, surrogate keys, ODS, staging areas, and cubes; well versed in the Ralph Kimball and Bill Inmon methodologies.
  • Realistic understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling.
  • Superior SQL skills; able to write and interpret complex SQL statements and mentor developers on SQL optimization.
  • Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server and Teradata. Good understanding of Views, Synonyms, Indexes, Partitioning, Database Joins, Stats and Optimization.
  • Experience in developing very complex mappings, reusable transformations, sessions and workflows using Informatica ETL tool to extract data from various sources and load into targets.
  • Experience in tuning and scaling the procedures for better performance by running explain plan and using different approaches like hint and bulk load.
  • Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution times for huge data volumes on company merger projects. Heavily created mapplets, user-defined functions, reusable transformations, and lookups.
  • Technical expertise in designing technical processes by using Internal Modeling & working with Analytical Teams to create design specifications; successfully defined & designed critical ETL processes, Extraction Logic, Job Control Audit Tables, Dynamic Generation of Session Parameter File, File Mover Process, etc.
  • Strong expertise in using the ETL tool Informatica PowerCenter 8.x/9.x/10.x (Designer, Workflow Manager, Repository Manager), Informatica Data Quality (IDQ), and ETL concepts.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Worked with various transformations such as Normalizer, Expression, Rank, Filter, Group, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, SQL, Stored Procedure, Update Strategy, and Source Qualifier.
  • Experienced in using advanced Informatica concepts such as Pushdown Optimization (PDO).
  • Designing and developing Informatica mappings including Type-I, Type-II and Type-III slowly changing dimensions (SCD).
  • Experienced in scheduling Informatica jobs using scheduling tools like DAC, AutoSys and Control-M.
  • Data Modeling: Data modeling knowledge in Dimensional Data modeling, Star Schema, Snow-Flake Schema, FACT and Dimensions tables.
  • Experienced in writing SQL, PL/SQL programming, Stored Procedures, Functions, Triggers, Views and Materialized Views.
  • Developed Informatica mappings and tuned them for better performance.
  • Good hands on experience in writing UNIX shell scripts to process Data Warehouse jobs.
  • Good command of databases such as Oracle 11g/10g/9i/8i, DB2, SQL Server 2008, Netezza, and MS Access 2003.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Excellent interpersonal and communication skills, and experienced in working with senior level managers, business users, and developers across multiple disciplines.
  • Executed software projects for Communications & Media, Manufacturing & Hi-Tech, Banking and financial services.
  • Good communication skills, interpersonal skills, self-motivated, quick learner and team player.
  • Deftly executed multi-resource projects following the onsite-offshore model while serving as a mentor for junior team members.
  • Excellent communication and presentation skills, works well as an integral part of a team, as well as independently, intellectually flexible and adaptive to change.
  • Exposure to Waterfall and Agile methodologies with Scrum.
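The MD5-based change-data-capture approach mentioned above can be illustrated with a minimal sketch. Python stands in for Informatica's MD5() expression here, and the row and column names are hypothetical:

```python
import hashlib

def row_hash(row, columns):
    """Concatenate the tracked column values and hash them, mimicking
    the MD5() expression used in a mapping for change detection."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

# Hypothetical customer rows: the warehouse copy vs. the fresh source extract.
tracked = ["name", "city"]
warehouse = {"id": 1, "name": "Ann", "city": "Derry"}
incoming = {"id": 1, "name": "Ann", "city": "Hershey"}

# A differing hash flags the row as changed, so the load updates it
# instead of comparing every column individually.
changed = row_hash(warehouse, tracked) != row_hash(incoming, tracked)
```

Comparing one hash per row instead of every tracked column keeps the lookup cache small, which is the usual reason this pattern is paired with CDC.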

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.1/9.6/9.1/8.6/8.1, Informatica Data Quality 10.1/9.6.1/9.1, Metadata Manager 9.6.1, Data Mart, OLAP, OLTP and ERWIN 4.x/3.x.

Programming Languages: Unix Shell Scripting, SQL, PL/SQL.

Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin and Oracle Designer.

Methodology: Agile, SCRUM and Waterfall.

Databases & Tools: Oracle 11g/10g/9i, SQL Server 2012/2008/2005, Netezza, SQL*Plus, SQL*Loader and TOAD.

Operating Systems: Windows, UNIX and Linux.

Reporting Tools: OBIEE, Tableau.

Scheduling Tools: UC4, Control-m, Autosys, DAC.

PROFESSIONAL EXPERIENCE

Confidential, Derry Township, PA

Sr. Informatica ETL/IDQ Developer

Responsibilities:

  • Responsible for design and development of multiple projects leveraging Informatica Power Center ETL tool, Data Quality and Tableau reporting tools.
  • Understand the data quality rules defined by the business/functional teams and propose the optimization of these rules if applicable, then design and develop these rules with IDQ, including development and unit testing.
  • Worked with Business Analysts to obtain the Functional Specification Document and prepared the Technical Design Document.
  • Performed data profiling using the Informatica Data Quality Analyst tool.
  • Analyzed heterogeneous sources, applied transformations, and moved data to targets according to business requirements.
  • Provided Golden Set of Records using Grouping, Matching, and Consolidation to Data Steward Team.
  • Designed various mappings and mapplets using transformations such as Key Generator, Match, Labeler, Case Converter, Standardizer, and Address Validator.
  • Responsible for Impact Analysis, upstream/downstream impacts.
  • Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into databases.
  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, and Transformation Developer.
  • Used most transformations, such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, Update Strategy, and Stored Procedure.
  • Queried the Target database using Teradata SQL and BTEQ for validation.
  • Worked extensively with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
  • Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Implemented pipeline partitioning concepts like Hash-key, Round-Robin, Key-Range, Pass Through techniques in mapping transformations.
  • Expertise in Debugging and Performance tuning of targets, sources, mappings and sessions. Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations and mapplets.
  • Scheduling Informatica jobs and implementing dependencies if necessary using Autosys.
  • Responsible for performing SQL query optimization using Hints, Indexes and Explain plan.
  • Managed post production issues and delivered all assignments/projects within specified time lines.
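The grouping, matching, and consolidation flow that delivered a golden set of records to the data steward team can be sketched as follows. This is a hedged illustration in Python rather than IDQ's Match and Consolidation transformations; the match key and survivorship rule are assumptions:

```python
from collections import defaultdict

def golden_records(rows):
    """Group candidate duplicates by a simple match key (normalized
    name + ZIP), then survive the most complete record per group."""
    groups = defaultdict(list)
    for r in rows:
        key = (r["name"].strip().lower(), r["zip"])
        groups[key].append(r)
    # Survivorship rule: keep the record with the most populated fields.
    return [max(g, key=lambda r: sum(1 for v in r.values() if v))
            for g in groups.values()]

rows = [
    {"name": "Jo Smith",  "zip": "17033", "phone": ""},
    {"name": "jo smith ", "zip": "17033", "phone": "555-0100"},
    {"name": "Ana Diaz",  "zip": "65201", "phone": "555-0111"},
]
golden = golden_records(rows)  # two golden records survive
```

In IDQ the match key would typically be fuzzier (e.g., edit-distance matching on names); exact normalized keys keep the sketch short.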

Environment: Informatica Power Center 10.1/9.6.1, Informatica Data Quality 10.1/9.6.1, Metadata Manager 10.1, Power Exchange 9.5, Oracle 11g, IBM DB2, SQL server 2012, Teradata 15, Flat Files, Tableau, Erwin 4.1.2, Toad, SVN, WinScp, HP ALM, UC4, UNIX, Putty, Shell Scripting.

Confidential, Dearborn, MI

Sr. Informatica Developer

Responsibilities:

  • Responsible for design, development and maintenance of Data Marts including Sales, Policy, Customer Reporting and Claims.
  • Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
  • Cleansed data and validated addresses through Address Doctor for the Facets system.
  • Used Case Converter for contact data such as First Name and Last Name.
  • Extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
  • Defined business rules in Informatica Data Quality (IDQ) to evaluate data quality by creating cleanse processes that monitor compliance with standards; identified data quality gaps and assisted in resolving data quality issues.
  • Designed and developed exception handling, data standardization procedures, and quality assurance controls.
  • Exported mappings from IDQ to Informatica PowerCenter.
  • Responsible for Data Warehouse Architecture, ETL and coding standards.
  • Developed Capacity Planning/Architecture/ Strategic Roadmaps/Implementing standards.
  • Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into databases.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Created Mapplets and used them in different Mappings.
  • Developed stored procedure to check source data with warehouse data and if not present, write the records to spool table and used spool table as lookup in transformation.
  • Performed extensive bulk loading into the target using Oracle SQL*Loader.
  • Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
  • Managed scheduling of tasks to run at any time without any operator intervention.
  • Leveraged workflow manager for session management, database connection management and scheduling of jobs.
  • Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
  • Experienced in Debugging and Performance tuning of targets, sources, mappings and sessions. Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations and mapplets.
  • Delivered all the projects/assignments within specified timelines.
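The spool-table pattern above (check source keys against the warehouse, write misses to a spool table, then use the spool as a lookup) can be sketched with SQLite standing in for the Oracle stored procedure. Table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE warehouse (id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE spool (id INTEGER PRIMARY KEY)")
cur.executemany("INSERT INTO warehouse VALUES (?)", [(1,), (2,)])

source_ids = [1, 2, 3, 4]

# Step 1: any source key missing from the warehouse lands in the spool table.
for sid in source_ids:
    cur.execute("SELECT 1 FROM warehouse WHERE id = ?", (sid,))
    if cur.fetchone() is None:
        cur.execute("INSERT OR IGNORE INTO spool VALUES (?)", (sid,))

# Step 2: downstream mappings use the spool table as a lookup of new keys.
new_keys = [r[0] for r in cur.execute("SELECT id FROM spool ORDER BY id")]
print(new_keys)  # [3, 4]
```

In the actual implementation the comparison would run set-based inside the stored procedure rather than row by row; the loop just makes the logic explicit.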

Environment: Informatica Power Center 9.5, Informatica Data Quality 9.5, Oracle 10g, Power Exchange, Flat files, MS SQL server 2008, HP Quality Control, Winscp, Autosys, MS Visio, Shell Scripting, UNIX.

Confidential, Columbia, MO

ETL/Informatica Developer

Responsibilities:

  • Analyzed the existing system, then prepared and presented an Impact Analysis document.
  • Actively participated in the design, development and implementations of the Enterprise Data Warehouse (EDW) process and Data mart.
  • Created several Informatica mappings to populate data into dimension and fact tables.
  • Configured the Repositories.
  • Developed various mappings by using reusable transformations.
  • Executed the workflow using pmcmd command in UNIX.
  • Improved the mapping performance using SQL overrides.
  • Created Mapplets and used them in different mappings.
  • Used Debugger to test the data flow and fix the mappings.
  • Implemented Error data validations using Error handling strategy techniques.
  • Tuning Informatica Mappings and Sessions for optimum performance.
  • Implemented all ETL architecture standards for the newly adopted Teradata platform.
  • Post-Production support to monitor jobs in new environment.
  • Developed mappings, sessions and workflows for the new ETL process followed with Quality standards.
  • Responsible for QA migration and Production Deployment.
  • Used various transformations for data Manipulation.
  • Used most transformations, including Source Qualifier, Aggregator, Filter, Expression, Unconnected and Connected Lookups, Update Strategy, and Normalizer.
  • Developed and implemented the long term IT goals and strategies.
  • Participated in the rotation of 24/7 support.

Environment: Informatica Power Center 9.1, Oracle 10g, Flat files, SQL server 2008, Analytics Server 3.1.2, Business Objects 6.0, Shell / Perl Script, Sun Solaris OS, PL/SQL, Toad 7.0, Erwin 3.5.2

Confidential, Austin, TX

ETL Developer

Responsibilities:

  • Developing Informatica mappings and shell scripts for loading trading and clearing data from various clients.
  • Actively participated in a team in the logical and physical design of the data warehouse.
  • Closely associated with data architect in resolving the data issues.
  • Developed Informatica mappings using various transformations, sessions, and workflows. SQL Server was the target database; sources included flat files, Oracle tables, PeopleSoft, Excel files, and CSV files.
  • Involved in creating stored procedure to support recovery.
  • Responsible for working closely with the Informatica administrator to migrate Source and Target definitions, Mappings, Workflows, and Flat Files from development environment to the production environment.
  • Extensively used the Lookup and Update Strategy Transformations for implementing the Slowly Changing Dimensions.
  • Used different tasks such as Email and Command tasks.
  • Worked with the Informatica Administrator in migrating the mappings, sessions, source/target definitions from the development repository to the production environment.
  • Involved with the DBA in performance tuning of the Informatica sessions and workflows. Created the reusable transformations for better performance.
  • Involved in the mirroring of the staging environment to production.
  • Created and reviewed the Design and Code review Templates.
  • As part of the testing team, conducted unit tests and system tests.
  • Scheduling jobs using Autosys to automate the Informatica Sessions.
  • Optimizing the Autosys batch flow.
  • Developed control files and stored procedures to manipulate and load data into the Oracle database.
  • Optimized queries using SQL Navigator.
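The Lookup + Update Strategy pattern used above for slowly changing dimensions can be sketched for the Type 2 case. Python approximates the mapping logic; the dimension columns (`eff_date`, `end_date`, `current`) and the tracked attribute are assumptions:

```python
from datetime import date

def apply_scd2(dimension, incoming, today=date(2024, 1, 1)):
    """Expire the current row when a tracked attribute changes and
    insert a new current row, mimicking Lookup + Update Strategy."""
    current = {r["key"]: r for r in dimension if r["current"]}
    for row in incoming:
        old = current.get(row["key"])
        if old is None:  # brand-new key: plain insert
            dimension.append({**row, "eff_date": today, "end_date": None,
                              "current": True})
        elif old["city"] != row["city"]:  # tracked attribute changed
            old["end_date"], old["current"] = today, False   # expire old version
            dimension.append({**row, "eff_date": today, "end_date": None,
                              "current": True})              # insert new version
    return dimension

dim = [{"key": 7, "city": "Austin", "eff_date": date(2020, 1, 1),
        "end_date": None, "current": True}]
apply_scd2(dim, [{"key": 7, "city": "Dallas"}])
```

In the mapping, the `current` lookup corresponds to a Lookup transformation on the dimension, and the expire/insert branches correspond to DD_UPDATE and DD_INSERT flags in the Update Strategy.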

Environment: Informatica Power Center 8.6, XML Files, Flat Files, Windows NT, Sun Solaris, Shell Scripts, Oracle 9i, PL/SQL, SQL Loader, SQL Server 2005, Skybot, DB2

Confidential

Data Warehousing Consultant

Responsibilities:

  • Analyzed sources, requirements, and the existing OLTP system, and identified required dimensions and facts from the database.
  • Extracted data from various input sources and loaded it into the Oracle data warehouse.
  • Loaded data from tables into the OLAP application and aggregated it to higher levels for analysis.
  • Created a temporary repository for the already migrated database for system analysis.
  • Wrote batch programs and database triggers at the staging area to populate the warehouse.
  • Created sessions and batches.
  • Successfully implemented Data Mart project from SQL Server/DTS based ETL to Oracle/Informatica.
  • Worked with users and business for Data questions and providing Ad hoc reports.
  • Developed data conversion, integration, loading and verification specifications and design of a Mapping Specification and creating multidimensional models.
  • Created sessions, managed batches, performed performance tuning and initial testing followed by volume testing, and resolved technical issues with consultants and vendors.

Environment: Informatica 7.1, Oracle 9i, PL/SQL, Erwin 3.5.2, JIRA, UNIX.
