
Sr. Informatica IDQ / ETL Developer Resume

New York, NY

SUMMARY

  • Around 8 years of experience with Informatica PowerCenter across all phases of analysis, design, development, implementation, and support of data warehousing applications, using Informatica PowerCenter 10.x/9.x/8.x/7.x and Informatica Data Quality (IDQ).
  • Database experience with Oracle, DB2, MS SQL Server, Teradata (including Teradata SQL Assistant), and MySQL.
  • SDLC: Good experience across the full software development life cycle, including business requirements gathering and analysis, system study, application design, development, testing, implementation, system maintenance, and documentation.
  • Experience working with Oracle 12c/11g/10g, PL/SQL, and performance tuning.
  • Good working knowledge of the Agile and Waterfall software development methodologies.
  • Experienced in SQL and PL/SQL programming, including stored procedures, packages, functions, triggers, views, and materialized views.
  • Has worked in the financial and investment domains and is accustomed to handling large volumes of confidential data.
  • Experience writing daily batch jobs using UNIX shell scripts and developing complex shell scripts to automate ETL (a minimal sketch appears after this list).
  • Experience with Teradata 15/14/13 utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter; highly experienced in Teradata SQL programming, performance tuning, and working with very large data volumes.
  • Proficient in implementing complex business rules through different kinds of Informatica transformations, Workflows/Worklets and Mappings/Mapplets.
  • Knowledge of installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, MapR, HBase, Oozie, Hive, Sqoop, Pig, Flume, Apache Spark, and Kafka.
  • Strong knowledge of RDBMS concepts, data modeling (facts and dimensions, star/snowflake schemas), data migration, data cleansing, and ETL processes.
  • Experience with AWS S3, EC2, SNS, SQS, Lambda, RDS (MySQL), and Redshift cluster configuration.
  • Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings, and sessions.
  • Working in Agile environments provided exposure to a range of test strategies, including functional, regression, and integration testing.
  • Experience working in both onsite and offshore models, which built strong communication and rapid problem-solving skills.
  • Advanced knowledge of Oracle PL/SQL programming, stored procedures and functions, indexes, views, materialized views, triggers, cursors, and SQL query tuning.
  • Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
  • Hands-on ETL experience identifying and resolving performance bottlenecks at various levels, with a strong understanding of OLTP and OLAP concepts.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Hands-on experience resolving production issues in a 24/7 environment.
  • Excellent problem-solving, analytical, technical, interpersonal, and communication skills, with strong leadership abilities; motivated and adaptive, with the ability to grasp things quickly. Diligent team player with the flexibility to take on new roles.
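
As a quick illustration of the daily batch pattern described above, here is a minimal shell sketch; every path, file name, and address in it is an illustrative placeholder rather than a detail from any specific engagement:

    #!/bin/sh
    # run_daily_load.sh - illustrative daily ETL batch wrapper (all names are placeholders)
    LOG_DIR=/etl/logs
    RUN_DATE=`date +%Y%m%d`
    LOG_FILE="$LOG_DIR/daily_load_$RUN_DATE.log"

    echo "Daily load started: `date`" >> "$LOG_FILE"

    # Run the load step (placeholder script) and fail loudly on a non-zero exit code
    /etl/scripts/load_staging.sh >> "$LOG_FILE" 2>&1
    if [ $? -ne 0 ]; then
        echo "Daily load FAILED: `date`" >> "$LOG_FILE"
        mailx -s "Daily ETL load failed $RUN_DATE" etl-support@example.com < "$LOG_FILE"
        exit 1
    fi

    echo "Daily load completed: `date`" >> "$LOG_FILE"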

TECHNICAL SKILLS

Operating System: UNIX, Linux, Windows

Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (v9, v10), JIRA (8.5.4 and 8.0), ServiceNow.

Specialist Applications & Software: Informatica PowerCenter 10.3/9.5/9.1/8.6/8.x, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), SSIS, Salesforce, Axon, EDC, DataStage, etc.

Data Modeling (working knowledge): Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Fact and Dimension tables), Physical and Logical Data Modeling, and ER Diagrams.

Software Development Methodology: Agile, Waterfall.

Scheduling tools: Informatica Scheduler, Tidal Enterprise Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.

Others (working knowledge of some): OBIEE RPD creation, OBIEE, ECM, Informatica Data Transformation XMAP, DAC, Snowflake, Rational ClearCase, WS-FTP Pro, DTD.

Programming and Scripting: C, C++, Java, .Net, Perl Scripting, PostgreSQL, Shell Scripting, XSLT, PL/SQL, Denodo.

RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle 11g/10g/9i/8i, Teradata, IBM DB2 UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000.

Database tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, Salesforce, AQT v9 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, ERwin.

PROFESSIONAL EXPERIENCE

Confidential, New York, NY

Sr. Informatica IDQ / ETL Developer

Responsibilities:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Translated business requirements into Informatica mappings to build the data warehouse using Informatica Designer, populating the data into the target star schema.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and code the flow of data from the source systems to the data warehouse.
  • Performed extraction, transformation and loading of data from RDBMS tables and Flat File sources into Oracle RDBMS in accordance with requirements and specifications.
  • Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica Powercenter/Data Quality (IDQ) and proposed ETL strategies based on requirements.
  • Responsible for the data integration into SalesForce.com using Informatica PowerCenter and Informatica Cloud.
  • Analyzed and designed processes for provisioning and loading data into the Credit Risk DWH tables using Informatica PowerCenter 10.3/9.5.1.
  • Performed thorough data profiling with IDQ to understand the quality of the source data and to find data issues.
  • Worked on analyzing Hadoop clusters using big data analytic tools including MapReduce, Pig, and Hive.
  • Involved in implementing the Land process of loading the customer data set into Informatica MDM from various source systems.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Responsible for cluster maintenance: adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files.
  • Worked extensively on developing ODI packages and load plans and generating ODI scenarios.
  • Involved in massive data profiling using IDQ prior to data staging.
  • Worked on extraction, loading, and transformation from Oracle and flat files using ODI.
  • Created Design Specification Documents including source to target mappings.
  • Responsible for performance tuning the ETL process to optimize load and query performance. Extensively involved in coding business rules in PL/SQL using functions, cursors, and stored procedures.
  • Developed stored procedures, functions, and database triggers in PL/SQL according to specific business logic.
  • Migrated data from on-premises databases to an AWS cloud data lake.
  • Created mappings to load data into an AWS S3 bucket using the Informatica S3 connector, and populated Snowflake from the S3 bucket using complex SQL queries.
  • Sourced data from RDS and an AWS S3 bucket and populated it into a Teradata target.
  • Extensively used various transformations: Lookup, Update Strategy, Expression, Aggregator, Filter, Stored Procedure, Joiner, etc.
  • Used the Teradata utilities BTEQ, FastLoad, TPump, and MultiLoad for bulk data loading.
  • Extensively used FastExport to export data out of Teradata tables.
  • Created BTEQ scripts to invoke load utilities and run queries against the Teradata database. Chose primary indexes (PIs) with both the planned data access patterns and even data distribution across all available AMPs in mind.
  • Wrote pre- and post-session SQL commands (DDL and DML) to drop and recreate the indexes on the data warehouse.
  • Developed Teradata load processes using shell scripting and RDBMS utilities such as MultiLoad and FastLoad.
  • Extensively used pmcmd commands at the command prompt and executed UNIX shell scripts to automate workflows and populate parameter files (a minimal sketch follows this list).
  • Partially involved in writing the UNIX shell scripts that trigger the workflows in a particular order as part of the daily load into the warehouse.
  • Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
  • Extracted data from various source systems such as Oracle, SQL Server, XML, and flat files and loaded it into the relational data warehouse and flat files.
  • Involved in writing BTEQ scripts for validating and testing sessions, checking data integrity between source and target databases, and generating reports.
  • Migrated code from the Dev to Test to Prod environments. Wrote techno-functional documentation and test cases to ensure a smooth project handover and maintain the SDLC.
  • Identified bottlenecks and improved overall session performance. Created dimension and fact tables for the data mart and implemented SCD (Slowly Changing Dimension) Type I and Type II loads.
  • Scheduled Informatica sessions in Autosys to automate loads.
  • Provided production support by monitoring the processes running daily.
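
As a minimal sketch of the pmcmd-driven automation mentioned above: the domain, service, folder, and workflow names are placeholders, and in practice the password would come from an encrypted variable rather than the script itself:

    #!/bin/sh
    # Illustrative wrapper: build a parameter file, then start a workflow via pmcmd
    # (INT_SVC, DOMAIN_DEV, FOLDER_DW, wf_daily_load, and all paths are placeholders)
    PARAM_FILE=/etl/params/wf_daily_load.param
    cat > "$PARAM_FILE" <<EOF
    [FOLDER_DW.WF:wf_daily_load]
    \$\$LOAD_DATE=`date +%Y-%m-%d`
    EOF

    # -wait blocks until the workflow finishes, so the exit code reflects success or failure
    pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u etl_user -p "$PM_PASS" \
        -f FOLDER_DW -paramfile "$PARAM_FILE" -wait wf_daily_load
    if [ $? -ne 0 ]; then
        echo "wf_daily_load failed" >&2
        exit 1
    fi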

Environment: Informatica Powercenter 10.x/9.x, IDQ, Erwin, MDM, AWS, Oracle Data Integrator (ODI), Oracle 11g/10g, PL/SQL, SQL*Loader, TOAD, MS SQL Server 2012/2008, Autosys.

Confidential, Parsippany, NJ

Informatica IDQ / ETL Developer

Responsibilities:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Translated business requirements into Informatica mappings to build the data warehouse using Informatica Designer, which populated the data into the target star schema.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and code the flow of data from the source systems to the data warehouse.
  • Performed extraction, transformation and loading of data from RDBMS tables and Flat File sources into Oracle RDBMS in accordance with requirements and specifications.
  • Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica Powercenter/Data Quality (IDQ) and proposed ETL strategies based on requirements.
  • Performed thorough data profiling with IDQ to understand the quality of the source data and to find data issues.
  • Involved in massive data profiling using IDQ prior to data staging.
  • Created Design Specification Documents including source to target mappings.
  • Responsible for performance tuning the ETL process to optimize load and query performance. Extensively involved in coding business rules in PL/SQL using functions, cursors, and stored procedures.
  • Developed stored procedures, functions, and database triggers in PL/SQL according to specific business logic.
  • Extensively used various transformations: Lookup, Update Strategy, Expression, Aggregator, Filter, Stored Procedure, Joiner, etc.
  • Extracted data from flat files, SQL Server, and Oracle and loaded it into Teradata.
  • Wrote pre- and post-session SQL commands (DDL and DML) to drop and recreate the indexes on the data warehouse.
  • Developed Teradata load processes using shell scripting and RDBMS utilities such as MultiLoad and FastLoad (a minimal MultiLoad example follows this list).
  • Extensively used pmcmd commands at the command prompt and executed UNIX shell scripts to automate workflows and populate parameter files.
  • Partially involved in writing the UNIX shell scripts that trigger the workflows in a particular order as part of the daily load into the warehouse.
  • Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
  • Extracted data from various source systems such as Oracle, SQL Server, XML, and flat files and loaded it into the relational data warehouse and flat files.
  • Involved in writing BTEQ scripts for validating and testing sessions, checking data integrity between source and target databases, and generating reports.
  • Migrated code from the Dev to Test to Prod environments. Wrote techno-functional documentation and test cases to ensure a smooth project handover and maintain the SDLC.
  • Identified bottlenecks and improved overall session performance.
  • Created dimension and fact tables for the data mart and implemented SCD (Slowly Changing Dimension) Type I and Type II loads.
  • Scheduled Informatica sessions in Autosys to automate loads.
  • Provided production support by monitoring the processes running daily.
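
As a minimal illustration of a shell-wrapped MultiLoad run of the kind referenced above; the TDPID, credentials, log table, target table, layout, and file path are all placeholders:

    #!/bin/sh
    # Illustrative MultiLoad wrapper; TD_PASS comes from the environment, all other names are placeholders
    mload <<EOF
    .LOGTABLE work_db.customer_ml_log;
    .LOGON tdprod/etl_user,$TD_PASS;
    .BEGIN MLOAD TABLES stg.customer;
    .LAYOUT cust_layout;
      .FIELD cust_id   * VARCHAR(10);
      .FIELD cust_name * VARCHAR(50);
    .DML LABEL ins_cust;
      INSERT INTO stg.customer (cust_id, cust_name)
      VALUES (:cust_id, :cust_name);
    .IMPORT INFILE /etl/data/customer.dat
      FORMAT VARTEXT '|'
      LAYOUT cust_layout
      APPLY ins_cust;
    .END MLOAD;
    .LOGOFF;
    EOF
    [ $? -eq 0 ] || { echo "MultiLoad failed" >&2; exit 1; }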

Environment: Informatica Powercenter 10.x/9.x, IDQ, Erwin, Oracle 11g/10g, PL/SQL, SQL*Loader, TOAD, MS SQL Server 2012/2008, Autosys.

Confidential, Herndon, VA

Sr. Informatica / ETL Developer

Responsibilities:

  • Worked in an Agile development environment and interacted with users and business analysts to collect and understand the business requirements.
  • Worked on building the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Involved in the installation and configuration of Informatica Powercenter 10.1 and evaluated partitioning concepts in Powercenter 10.1.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created stored procedures, views, user-defined functions, and common table expressions. Generated the underlying data for reports through SSIS and exported cleansed data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Managed the metadata associated with the ETL processes used to populate the data warehouse.
  • Involved in IDS services such as building business logic, analyzing structure and data quality, and creating a single view of the data.
  • Involved in implementing the Land process of loading the customer/product data set into Informatica MDM.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Configured match rule set property by enabling search by rules in MDM per Business Rules.
  • Worked on Informatica cloud for creating source and target objects, developed source to target mappings.
  • Loaded data into MDM landing table for MDM base loads and Match and Merge.
  • Imported and exported databases using SQL Server Integration Services (SSIS) and Data Transformation Services (DTS packages).
  • Knowledgeable in customizing ODI Knowledge Modules and tuning Knowledge Module code based on the requirements.
  • Imported data from Oracle and MySQL into Hadoop using Sqoop (a minimal example follows this list).
  • Consolidated data from different systems to load Constellation Planning Data Warehouse using ODI interfaces and procedures.
  • Involved in writing the shell scripts that export log files to the Hadoop cluster through an automated process.
  • Developed packages for the ODI objects and scheduled scenarios using them.
  • Responsible for creating complex mappings according to business requirements, which are scheduled through the ODI Scheduler.
  • Created ODI Packages, Jobs of various complexities and automated process data flow.
  • Involved in migrating data from Oracle/PostgreSQL to AWS S3 and GCS cloud storage, then loading it into Redshift and BigQuery tables using Informatica tasks and UNIX/Python scripts wherever necessary.
  • Performed dimensional data modeling: star join schema/snowflake modeling, fact and dimension tables, and physical and logical data modeling using the ERwin data modeling tool.
  • Stored data from SQL Server databases in Hadoop clusters set up in AWS EMR. Involved in all phases of the migration from DTS to SSIS packages.
  • Involved in importing the existing Powercenter workflows as Informatica Cloud Service tasks using Informatica Cloud Integration.
  • Designed and developed ODI for the ETL project, gathering requirement specification documents and identifying data sources.
  • Installed, maintained, and documented the ODI setup on multiple environments.
  • Migrated and converted CSV and flat files from multiple sources to ORACLE 11g.
  • Involved in Data integration, monitoring, auditing using Informatica Cloud Designer.
  • Worked on Data Synchronization and Data Replication in Informatica cloud.
  • Wrote PL/SQL scripts, created stored procedures and functions, and debugged them.
  • Sourced data from RDS and an AWS S3 bucket and populated it into a Teradata target.
  • Created mapplets and reusable transformations and used them in different mappings. Used Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Involved in production support, performing normal loads, bulk loads, initial loads, incremental loads, daily loads, and monthly loads, and developed reports on issues related to the data warehouse.
  • Used various Informatica Data Quality transformations in the Developer tool and configured match properties: match paths, fuzzy match keys, and fuzzy and exact match columns. Created profiles, rules, and scorecards for data profiling and quality using IDQ.
  • Used Informatica Data Quality for address and name clean-up and developed error handling and data quality checks to pull through the right data.
  • Used IDQ to cleanse and accuracy-check the project data and to check for duplicate or redundant records.
  • Used the Debugger to test the mappings and fix bugs, identified bottlenecks at all levels to tune performance, and resolved production support tickets using Remedy.
  • Moved bulk data from various sources into the Teradata database using BTEQ, MLOAD, and TPUMP scripts.
  • Used Teradata utilities such as FastLoad, FastExport, MultiLoad, TPump, and TPT, and have experience creating BTEQ scripts.
  • Transferred data using Teradata utilities such as SQL Assistant, FastExport, and FastLoad.
  • Developed monitoring scripts in UNIX and moved data files to another server using SCP on the UNIX platform.
  • Extensively used the Teradata utilities FastLoad, MultiLoad, BTEQ, and FastExport.
  • Created Teradata external loader connections, such as MLoad Upsert and Update and FastLoad, while loading data into the target tables in the Teradata database.
  • Involved in creating the tables in Teradata and setting up the various environments, such as DEV, SIT, UAT, and PROD.
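
A minimal example of the Sqoop import mentioned in this list; the JDBC URL, credentials file, table, and HDFS target directory are placeholders:

    # Illustrative Sqoop import from Oracle into HDFS (connection details are placeholders)
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost.example.com:1521/ORCL \
      --username etl_user \
      --password-file /user/etl/.oracle_pwd \
      --table SALES.ORDERS \
      --target-dir /data/raw/orders \
      --split-by ORDER_ID \
      --num-mappers 4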

Environment: Informatica Powercenter 10.1, Oracle 12c, MDM, Oracle Data Integrator (ODI), AWS, Informatica Cloud, IDS 9.6.1, IDQ 9.6.1, Teradata 14.0, SQL Server 2014, Teradata Data Mover, Autosys Scheduler Tool, Netezza, UNIX, Toad, PL/SQL, SSIS, Power Connect, DB2, Business Objects XI3.5.

Confidential, Denver, CO

ETL Informatica Developer

Responsibilities:

  • Involved in requirements definition and analysis in support of data warehousing efforts. Worked on ETL design and development, creating the Informatica source-to-target mappings, sessions, and workflows to implement the business logic.
  • Used various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure, and Router to implement complex logic while coding a mapping.
  • Involved in data profiling and data analysis on heterogeneous database sources such as Oracle and flat files.
  • Extensively worked on Informatica Data Quality (IDQ 9.6.1) for data analysis, data cleansing, data validation, data profiling and matching/removing duplicate data.
  • Designed and developed Informatica DQ jobs and mapplets using transformations such as Address Validator, matching, consolidation, and rules for data loads and data cleansing.
  • Prepared technical specifications for the development of extraction, transformation, and loading of data into the various stage tables.
  • Used the Integration Service in PowerCenter 9.6.1 to start multiple instances of the same workflow.
  • Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
  • Created data breakpoints and error breakpoints for debugging the mappings using the Debugger Wizard.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Validated and tested the mappings using the Informatica Debugger, session logs, and workflow logs.
  • Worked on migrating existing PL/SQL packages, stored procedures, triggers and functions to Informatica Powercenter.
  • Involved in Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
  • Worked on the Autosys Scheduler to automate the workflows (a minimal post-run log check appears after this list).
  • Tested all mappings and sessions in the Development and UAT environments, then migrated them to the Production environment after successful validation.
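
As a minimal sketch of the kind of post-run check an Autosys job can invoke, scanning the latest session log for errors; the log directory and session name are placeholders:

    #!/bin/sh
    # Illustrative post-run check for an Autosys-scheduled load (paths/names are placeholders)
    SESS_LOG_DIR=/informatica/infa_shared/SessLogs
    LATEST_LOG=`ls -t "$SESS_LOG_DIR"/s_m_daily_load*.log 2>/dev/null | head -1`

    if [ -z "$LATEST_LOG" ]; then
        echo "No session log found" >&2
        exit 1
    fi

    # A non-zero exit here fails the scheduled job so support is alerted
    if grep -qi "ERROR" "$LATEST_LOG"; then
        echo "Errors found in $LATEST_LOG" >&2
        exit 1
    fi
    echo "Session log clean: $LATEST_LOG"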

Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality (IDQ 8.6.1), SQL Server, Oracle 11g, PL/SQL, Flat files, MySQL, WinSCP, Notepad++, Toad, Quest Central, UNIX scripting, Windows NT.

Confidential

Informatica Administrator

Responsibilities:

  • Installed and configured Informatica Powercenter 9.1.
  • Configured Load balancing and Grid in Informatica.
  • Creation and maintenance of Informatica users and privileges.
  • Created Native Groups, Native Users, roles, and privileges and assigned them to each user group.
  • Developed a standard ETL framework to enable the reuse of similar logic across the board. Involved in system documentation of the data flow and methodology.
  • Installed and configured B2B DX and MFT and troubleshot issues.
  • Developed mappings to extract data from SQL Server, Oracle, and flat files and load it into the data mart using Powercenter.
  • Wrote pre-session and post-session scripts in mappings (a minimal pre-session sketch appears after this list).
  • Created deployment groups and assigned permissions to the deployment groups.
  • Created Informatica folders and assigned permissions to the folders.
  • Created OS profiles for the users that run the applications.
  • Migration of Informatica Mappings/Sessions/Workflows from Dev to Test and Test to Stage and Stage to Prod environments.
  • Coordinated with UNIX and Database Administrators for creating OS profiles and file systems.
  • Handled outages during UNIX and database maintenance.
  • Collaboratively worked with the Application Support, Network, Database, and UNIX teams.
  • Bounced or restarted services during network changes or other maintenance, and communicated these activities to the business.
  • Extensively used various performance tuning techniques to improve job performance.
  • Involved in unit testing and prepared test cases.
  • Performed requirements gathering, functional and technical specification, architecture, analysis, logical, physical, and conceptual design, and testing.
  • Involved in Extraction, Transformation and Load (ETL) Process.
  • Created source and target definitions in Informatica PowerCenter - Designer.
  • Created mappings according to the specification.
  • After executing the workflows, monitored them in the Workflow Monitor.
  • Wrote design documentation for the ETL maps.
  • Unit tested the maps by running SQL queries on the source and target.
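
As a minimal sketch of a pre-session script of the kind mentioned above, verifying that the source file arrived and archiving a dated copy before the load; the directories and file name are placeholders:

    #!/bin/sh
    # Illustrative pre-session script (directories and file names are placeholders)
    SRC_DIR=/etl/inbound
    ARCH_DIR=/etl/archive
    FILE=customer_feed.dat

    # Fail fast (non-zero exit) if the expected source file is missing or empty
    if [ ! -s "$SRC_DIR/$FILE" ]; then
        echo "Source file $FILE missing or empty" >&2
        exit 1
    fi

    # Keep a dated copy before the session consumes the file
    cp "$SRC_DIR/$FILE" "$ARCH_DIR/$FILE.`date +%Y%m%d%H%M%S`"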

Environment: Informatica PowerCenter 7.1, Flat files, XML Files, Oracle 9i, PL/SQL, Toad 7.6, Windows 2000, and UNIX.
