Sr. Informatica IDQ/ETL Developer Resume
Bloomington, MN
SUMMARY:
- 6 years of experience in all phases of analysis, design, development, implementation, and support of data warehousing applications using Informatica PowerCenter 9.x/8.x/7.x and Informatica Data Quality (IDQ).
- Good experience across the full Software Development Life Cycle (SDLC), including business requirements gathering and analysis, system study, application design, development, testing, implementation, maintenance, and documentation.
- Experience working with Oracle 11g/10g, PL/SQL, and performance tuning.
- Good working knowledge of both Agile and Waterfall software development methodologies.
- Experienced in SQL and PL/SQL programming: stored procedures, packages, functions, triggers, views, and materialized views.
- Good understanding of SOA (data integration) architecture; defined best practices to improve data integration and data quality across the organization.
- Worked in the financial and investment domains, with a proven ability to handle large volumes of confidential data.
- Experience writing daily batch jobs and complex UNIX shell scripts to automate ETL processes.
- Experience installing and configuring Informatica Proactive Monitoring (RTAM 3.1 and RulePoint 5.2) for process monitoring and alerting.
- Experience with Teradata 15/14/13 utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter; highly experienced in Teradata SQL programming, performance tuning, and handling very large data volumes.
- Proficient in implementing complex business rules through different kinds of Informatica transformations, Workflows/Worklets and Mappings/Mapplets.
- Strong knowledge of RDBMS concepts, data modeling (facts and dimensions, Star/Snowflake schemas), data migration, data cleansing, and ETL processes.
- Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings, and sessions.
- Working in Agile teams provided hands-on exposure to different test strategies, including functional, regression, and integration testing.
- Worked with pre-session and post-session UNIX scripts to automate ETL jobs using scheduling tools such as AutoSys and Control-M, and was involved in migrating ETL processes from development to production environments.
- Experience working in both onsite and offshore models, which built strong communication and rapid problem-solving skills.
- Advanced knowledge of Oracle PL/SQL programming: stored procedures and functions, indexes, views, materialized views, triggers, cursors, and SQL query tuning.
- Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
- Hands-on experience identifying and resolving performance bottlenecks at the extract, transform, and load (ETL) stages, with a strong understanding of OLTP and OLAP concepts.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
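The pre-/post-session automation mentioned above typically reduces to small shell checks run by the scheduler before a workflow is released. The following is a minimal hedged sketch of such a pre-session check, not code from any actual project; all paths and file names are hypothetical.

```shell
#!/bin/sh
# Hypothetical pre-session check of the kind scheduled ahead of an ETL
# job via AutoSys or Control-M: confirm that the expected source file
# has arrived and is non-empty before the workflow is released.
# Paths and file names are illustrative only.

check_source_file() {
    src_file="$1"
    if [ ! -f "$src_file" ]; then
        echo "ERROR: source file $src_file not found" >&2
        return 1
    fi
    if [ ! -s "$src_file" ]; then
        echo "ERROR: source file $src_file is empty" >&2
        return 2
    fi
    echo "OK: $src_file ready for load"
}

# A scheduler pre-condition job would call, for example:
# check_source_file /staging/inbound/customer_feed.dat
```

A non-zero exit from the check holds the downstream Informatica workflow until the file lands.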
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 10.x/9.x/8.x, Informatica Master Data Management (MDM), Data Quality Tool (IDQ), Informatica Cloud.
Database: Oracle 12c/11g/10g/9i, SQL Server 2014/2008/2005, Greenplum, and Teradata 14.0
Data Modelling Tools: Erwin 4.1, TOAD, MS Visio, SQL*Loader; Star and Snowflake schemas.
Scheduling Tools: TWS, AutoSys, Maestro, crontab, UC4, Control-M, Informatica Scheduler
Languages: SQL, TSQL, PL/SQL, C, C++
Scripting Languages: UNIX Shell Scripting, Korn Shell, Bash shell scripting
Operating Systems: Windows, MSDOS, Linux, Unix, Sun Solaris
PROFESSIONAL EXPERIENCE:
Confidential, Bloomington, MN
Sr. Informatica IDQ/ETL Developer
Responsibilities:
- Worked in an Agile development environment and interacted with users and business analysts to gather and understand business requirements.
- Worked on building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Involved in the installation and configuration of Informatica Power Center 10.1 and evaluated Partition concepts in Power Center 10.1
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Designed data and data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality 9.1 environment.
- Installed and Configured Informatica Data Quality and Data Integration HUB on Linux Environment.
- Installed and Configured ILM 6.2 for Data Integration HUB Metadata Archiving.
- Created stored procedures, views, user defined functions and common table expressions.
- Generated underlying data for reports through SSIS, exporting cleansed data from Excel spreadsheets, text files, MS Access, and CSV files into the data warehouse.
- Installed ProActive Monitoring for PowerCenter Operations 3.0 HotFix 1 (PMPC) and RulePoint 6.1 for Informatica PowerCenter for creating alerts about Domain, Repository, Nodes, workflows and session failures.
- Coordinated and participated in disaster recovery of the Data Quality and Informatica Data Integration Hub environments.
- Assisted with migrating AutoSys jobs into production and monitored jobs to ensure successful execution.
- Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
- Involved in IDS services such as building business logic, analyzing structure and data quality, and creating a single view of the data.
- Assisted customers in troubleshooting issues with their applications in AutoSys and trained the teams to use WCC interface.
- Worked on Informatica cloud for creating source and target objects, developed source to target mappings.
- Involved in importing the existing Power center workflows as Informatica Cloud Service tasks by utilizing Informatica Cloud Integration.
- Involved in Data integration, monitoring, auditing using Informatica Cloud Designer.
- Worked on Data Synchronization and Data Replication in Informatica cloud.
- Wrote and debugged PL/SQL scripts, stored procedures, and functions.
- Created mapplets and reusable transformations and used them in different mappings; used Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Provided production support by performing normal, bulk, initial, incremental, daily, and monthly loads, and developed reports on issues related to the data warehouse.
- Used various Informatica Data Quality transformations in the Developer tool and configured match properties: match paths, fuzzy match keys, and fuzzy and exact match columns.
- Created profiles, rules, scorecards for data profiling and quality using IDQ.
- Used Informatica Data Quality for name and address cleanup and developed error handling and data quality checks to pull out the right data.
- Used IDQ to cleanse and verify the accuracy of project data and to check for duplicate or redundant records.
- Used the Debugger to test mappings and fix bugs, identified bottlenecks at all levels to tune performance, and resolved production support tickets using Remedy.
- Developed monitoring scripts in UNIX and moved Data Files to another server by using SCP on UNIX platform.
- Extensively used Teradata Utilities like Fast-Load, Multi-Load, BTEQ & Fast-Export.
- Created Teradata external loader connections such as MLoad Upsert, MLoad Update, and FastLoad while loading data into target tables in the Teradata database.
- Involved in creating tables in Teradata and setting up the various environments, such as DEV, SIT, UAT, and PROD.
Environment: Informatica PowerCenter 10.1, Oracle 12c, Informatica Cloud, IDS 9.6.1, IDQ 9.6.1, Teradata 14.0, SQL Server 2014, Data Integration, Teradata Data Mover, RulePoint, AutoSys, Netezza, UNIX, TOAD, PL/SQL, SSIS, PowerConnect, DB2, Business Objects XI 3.5.
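Loader work like the FastLoad/MultiLoad items above usually amounts to generating a control script from shell and handing it to the Teradata utility. The sketch below illustrates that pattern under stated assumptions: the table, log table, column layout, logon string, and file paths are all made-up placeholders, and the actual `mload` invocation is shown only in a comment.

```shell
#!/bin/sh
# Sketch of a shell wrapper that generates a Teradata MultiLoad control
# script. The target table, log table, field layout, and logon values
# below are illustrative placeholders, not real credentials or schemas.

gen_mload_script() {
    data_file="$1"      # flat file to load
    target_table="$2"   # target Teradata table
    out_script="$3"     # where to write the generated MLoad script
    cat > "$out_script" <<EOF
.LOGTABLE ${target_table}_log;
.LOGON tdpid/etl_user,password;
.BEGIN IMPORT MLOAD TABLES ${target_table};
.LAYOUT file_layout;
.FIELD cust_id * VARCHAR(10);
.FIELD cust_name * VARCHAR(50);
.DML LABEL ins_dml;
INSERT INTO ${target_table} (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.IMPORT INFILE ${data_file} LAYOUT file_layout APPLY ins_dml;
.END MLOAD;
.LOGOFF;
EOF
}

# Generate a sample script; a real batch job would then run:
#   mload < /tmp/load_customer.ml
gen_mload_script /tmp/customer.dat stg_customer /tmp/load_customer.ml
```

Generating the control file from a template like this is what makes the same wrapper reusable across tables and environments.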
Confidential, Phoenix, AZ
Informatica IDQ/ETL Developer
Responsibilities:
- Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
- Translated Business Requirements into Informatica mappings to build Data Warehouse by using Informatica Designer, which populated the data into the target Star Schema.
- Installed PowerCenter Repository Services, PowerCenter Integration Services, Model Repository Services, Data Integration Services, Analyst Services, Content Management Services, Reporting Services & Web Services Hub Services on Informatica 9.1.0.
- Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica Power Center/Data Quality (IDQ) and proposed ETL strategies based on requirements.
- Performed thorough data profiling to understand the quality of source data and to find data issues using IDQ.
- Responsible for stopping and starting Informatica, B2B DX, Data Integration Hub servers and services to support monthly windows patching events.
- Involved in massive data profiling using IDQ prior to data staging.
- Created Design Specification Documents including source to target mappings.
- Responsible for performance tuning the ETL process to optimize load and query performance.
- Extensively involved in coding of the Business Rules through PL/SQL using the Functions, Cursors and Stored Procedures.
- Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
- Extensively used transformations such as Lookup, Update Strategy, Expression, Aggregator, Filter, Stored Procedure, and Joiner.
- Wrote pre- and post-session SQL commands (DDL and DML) to drop and recreate indexes on the data warehouse.
- Developed Teradata load processes using shell scripting and Teradata utilities such as MLoad and FastLoad.
- Extensively used pmcmd commands on command prompt and executed UNIX Shell scripts to automate workflows and to populate parameter files.
- Partially involved in writing the UNIX shell scripts that trigger the workflows to run in a particular order as part of the daily load into the warehouse.
- Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
- Extracted data from various source systems like Oracle, SQL Server, XML and flat files and loaded into relational data warehouse and flat files
- Wrote wrapper scripts for scheduling jobs in AutoSys; the complete job schedule is maintained in custom Oracle tables, which enabled restartability and better performance.
- Involved in writing BTEQ scripts for session validation and testing, source-to-target data integrity checks, and report generation.
- Migrated code from Dev to Test to Prod environments and wrote techno-functional documentation along with test cases to ensure a smooth project handover and maintain the SDLC.
- Identified the bottlenecks and improved overall performance of the sessions
- Created Dimensions and Fact tables for the data mart and also implemented SCD (Slowly Changing Dimensions) Type I and II for data load.
- Scheduled Informatica sessions in AutoSys to automate loads.
- Provided production support by monitoring the processes running daily
Environment: Informatica Power Center 10.x/9.x, IDQ, Data Integration, Erwin, Oracle 11g/10g, PL/SQL, SQL*Loader, TOAD, MS SQL Server 2012/2008, Autosys.
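The AutoSys wrapper scripts with restartability described above generally follow one pattern: record each completed step in a state store so a rerun after a failure skips what already succeeded. Below is a minimal hedged sketch of that pattern; the step names and state-file path are hypothetical, a plain file stands in for the custom Oracle schedule tables, and the real `pmcmd startworkflow` call is shown only as a comment.

```shell
#!/bin/sh
# Hypothetical restartable wrapper of the kind placed behind an AutoSys
# job: each step is appended to a state file when it completes, so a
# rerun skips steps that already succeeded. In production each step
# would invoke "pmcmd startworkflow ..."; step names here are made up.

STATE_FILE="${TMPDIR:-/tmp}/etl_wrapper.$$.state"

# True if the named step is already recorded as complete.
step_done() {
    grep -qx "$1" "$STATE_FILE" 2>/dev/null
}

# Run a step's command unless it already completed on a prior attempt.
run_step() {
    step_name="$1"; shift
    if step_done "$step_name"; then
        echo "SKIP $step_name (already completed)"
        return 0
    fi
    if "$@"; then
        echo "$step_name" >> "$STATE_FILE"
        echo "DONE $step_name"
    else
        echo "FAIL $step_name" >&2
        return 1
    fi
}

# Example run; in production "true" would be a pmcmd invocation, e.g.:
#   pmcmd startworkflow -sv Int_Svc -d Domain -f Folder wf_load_stage
run_step extract_stage true
run_step load_warehouse true
```

On a rerun after a mid-job failure, the completed steps print `SKIP` and only the failed step's workflow is restarted, which is what gives the wrapper its restartability.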