- Over 7 years of extensive experience using different versions of Informatica (9.x, 8.x and 7.x).
- Expertise in all aspects of Software Development Life Cycle (SDLC).
- Extensive work on ETL processes covering data sourcing, mapping, transformation, conversion and loading.
- Strong experience with Informatica tools - Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformations Developer, Informatica Repository.
- Designed and developed complex mappings to move data from multiple sources into common target areas such as data marts and the data warehouse, using Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator and Update Strategy transformations in Informatica.
- Strong knowledge of writing simple and complex queries in databases such as Oracle, Teradata, MySQL, DB2 and SQL Server.
- Experience working with Netezza database.
- Experience working with Hadoop and the ETL tool Informatica PowerCenter.
- Experience in Data warehousing and Business Intelligence in analysis, design, development, testing and implementation of the reporting and ETL components.
- Interacted with end users and functional analysts to identify and develop Business Specification Documents (BSD) / source-to-target mappings (STM) and transform them into technical requirements.
- Hands on experience with Teradata utilities like BTEQ, Fast Export, Fast Load, and Multi Load to export and load data to/from different source systems including flat files.
- Strong hands-on experience extracting data from source systems such as DB2, Oracle, MySQL, flat files and XML.
- Expertise in installation, configuration and performance tuning of Informatica; in integrating various data sources such as Oracle, DB2 and flat files into the staging area; and in designing ETL processes that span multiple projects.
- Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts.
- Experience with all facets of the application solution delivery lifecycle management, including requirements definition, design, development, and testing phases.
- Experienced in developing complex mappings using Informatica and Change Data Capture (CDC).
- Experience in SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
- Experience in performance tuning and debugging of existing ETL processes and SQL scripts.
- Experience with the scheduling tools AutoSys, Control-M, Maestro and cron.
- Expertise in handling administration tasks such as bursting and scheduling reports in tools like Cognos and Tableau.
- Extensive experience writing UNIX shell scripts for various purposes.
- Experience working with Informatica Data Validation Option (DVO) tools.
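The pmcmd-driven workflow execution mentioned above boils down to composing a `startworkflow` command from a shell wrapper. A minimal sketch; the service, domain, folder and workflow names are hypothetical placeholders, not values from these projects:

```python
# Minimal sketch of driving an Informatica workflow via pmcmd from a script.
# Service, domain, folder and workflow names below are illustrative placeholders.

def build_pmcmd_start(integration_service, domain, folder, workflow,
                      user_env="INFA_USER", pwd_env="INFA_PWD"):
    """Compose a pmcmd startworkflow command, reading credentials from env vars."""
    return [
        "pmcmd", "startworkflow",
        "-sv", integration_service,
        "-d", domain,
        "-uv", user_env,   # pmcmd reads the user name from this environment variable
        "-pv", pwd_env,    # ...and the password from this one
        "-f", folder,
        "-wait",           # block until the workflow finishes
        workflow,
    ]

cmd = build_pmcmd_start("IS_PROD", "Domain_ETL", "DW_LOADS", "wf_daily_load")
print(" ".join(cmd))
```

Reading credentials from environment variables (`-uv`/`-pv`) keeps passwords out of the script itself, which is the usual practice when pmcmd calls are embedded in scheduled shell scripts.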
Sr. ETL Informatica Developer/Teradata Developer
Confidential - Murray, UT
- Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
- Responsible for Impact Analysis, upstream/downstream impacts.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Used Informatica as the ETL tool, along with stored procedures, to pull data from source systems/files, then cleanse, transform and load the data into Teradata using Teradata utilities.
- Applied the concept of Change Data Capture (CDC) and imported the source from Legacy systems.
- Involved in deployment and administration of SSIS packages with Business Intelligence Development Studio.
- Worked on Informatica- Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformation Developer.
- Used most of the core transformations, such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookup, Joiner, Update Strategy and Stored Procedure.
- Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
- Extracted data from heterogeneous sources such as Oracle, Sybase, SFDC, flat files and COBOL (VSAM) using Informatica PowerCenter, and loaded the data into the target DB2 database.
- Extracted data from Hadoop, modified it according to business requirements and loaded it back into Hadoop.
- Built a framework for inbound/outbound files entering the Facets system.
- Used the application to set up group and subscriber data in the Facets application.
- Modified SOQL (Salesforce Object Query Language) for the Salesforce target at the session level.
- Developed mappings to load fact and dimension tables, including SCD Type 1 and Type 2 dimensions and incremental loading, and unit tested the mappings.
- Upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version.
- Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
- Involved in change data capture (CDC) ETL process.
- Extensively worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance.
- Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport and TPT.
- Worked extensively with the index, data and lookup caches (static, dynamic and persistent) while developing mappings.
- Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Integrated data into a centralized location using migration, redesign and evaluation approaches.
- Created and maintained BMC Control-M jobs that submit Cognos cube-build scripts, automatically publishing PowerPlay operational reporting cubes with version control.
- Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
- Tuned the performance of mappings by following Informatica best practices and applied several methods to reduce workflow run times.
- Worked extensively on Informatica partitioning when dealing with huge volumes of data, and partitioned tables in Teradata for optimal performance.
- Scheduled Informatica jobs and implemented dependencies where necessary using AutoSys.
- Managed post-production issues and delivered all assignments/projects within the specified timelines.
- Experienced in Agile methodology and worked with an offshore team.
Environment: Informatica PowerCenter 9.5.1, Oracle 11g, DB2, Teradata, Flat Files, Erwin 4.1.2, SQL Assistant, Cognos, Facets, Toad, WinSCP, PuTTY, AutoSys, UNIX, Agile.
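The SCD Type 2 loads described in this role follow an expire-and-insert pattern: when a tracked attribute changes, the current dimension row is closed out and a new current version is inserted. A minimal sketch using SQLite as a stand-in for the warehouse; table and column names are illustrative, not from the actual project:

```python
import sqlite3

# Minimal SCD Type 2 sketch: expire the current dimension row when a tracked
# attribute changes, then insert a new current row. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    cust_id TEXT, city TEXT, eff_date TEXT, end_date TEXT, current_flag INTEGER)""")

def scd2_upsert(conn, cust_id, city, load_date):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE cust_id=? AND current_flag=1",
        (cust_id,))
    row = cur.fetchone()
    if row is None or row[0] != city:
        # Close the existing current version, if any
        conn.execute(
            "UPDATE dim_customer SET end_date=?, current_flag=0 "
            "WHERE cust_id=? AND current_flag=1", (load_date, cust_id))
        # Insert the new current version with an open-ended end date
        conn.execute(
            "INSERT INTO dim_customer VALUES (?,?,?, '9999-12-31', 1)",
            (cust_id, city, load_date))

scd2_upsert(conn, "C1", "Murray", "2015-01-01")
scd2_upsert(conn, "C1", "Herndon", "2016-01-01")   # attribute change -> new version
rows = conn.execute(
    "SELECT city, current_flag FROM dim_customer ORDER BY eff_date").fetchall()
print(rows)  # [('Murray', 0), ('Herndon', 1)]
```

In the projects above the same close-then-insert logic was expressed as an Update Strategy transformation routing rows to update (expire) and insert (new version) paths.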
Sr. ETL Informatica Developer/Teradata Developer
Confidential - Herndon, VA
- Involved in Walk-through meetings with Business Analysts to understand the Mapping Document.
- Involved in understanding logical and physical data models that capture current-state/future-state data elements and data flows, using Erwin.
- Used Different Transformations like Lookup, Joiner, Rank, Expression, Java, Stored Procedure, Update Strategy, and Source Qualifier Transformations in the Informatica Designer.
- Used Informatica PowerCenter for retrieving data from mainframe systems.
- Installed and configured Informatica and DVO tools.
- Imported CDC tables in PowerCenter Designer.
- High proficiency with PowerCenter; utilized the Informatica product effectively to move Oracle, DB2, MySQL, XML and flat-file data to Teradata.
- Used Informatica as an ETL tool to pull data from various upstream databases and load it into Netezza.
- Created ETL processes for both initial and delta loads.
- Created ETL processes for Slowly Changing Dimensions (SCD) Type 1 and Type 2.
- Efficiently used the Teradata utilities BTEQ, MultiLoad, FastLoad, TPump and TPT.
- Involved in tuning BTEQ scripts.
- Extensively used ETL to load data with PowerCenter/PowerExchange from source systems such as mainframe files into staging tables, and loaded the data into the target Oracle database.
- Developed complex mappings using Informatica and Change Data Capture (CDC).
- Developed SSIS Templates which can be used to develop SSIS Packages in such a way that they can be dynamically deployed into Dev, Test and Production Environments.
- Implemented and tested analytical solutions in Hadoop.
- Designed and developed various SSIS packages (ETL) to extract and transform data and involved in Scheduling SSIS Packages.
- Created various objects in Salesforce.
- Involved in designing and creating Hive tables to load data into Hadoop, and in processing steps such as merging, sorting and joining tables.
- Worked on Informatica Cloud to create source/target SFDC connections and to monitor and synchronize data in SFDC.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate target tables efficiently.
- Created Mapplets and used them in different Mappings.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Performed data profiling and analysis of various objects in Salesforce (SFDC) and MS Access database tables for an in-depth understanding of the source entities, attributes, relationships, domains, source data quality, and hidden and potential data issues.
- Created Data Source Connections in Cognos Connection for new development.
- Developed Transformer models and published the cubes to Cognos Connection for operational reporting and multidimensional analysis.
- Performance tuned Informatica mappings by identifying bottlenecks and using the various options available in Informatica.
- Used DVO to validate data moving from source to target.
- Involved in automating SSIS Packages using SQL Server Agent Jobs. Created an ETL summary report which consists of the summary information for all the loading activities done each day and month.
- Maintained error logging using exception handling in SSIS.
- Created reports using SQL Server Reporting Services (SSRS) for customized and ad-hoc queries, and deployed SSRS reports to the Report Server.
- Performance tuning using Informatica partitioning.
- Involved in database tuning.
- Wrote UNIX shell scripts for various purposes in the project.
- Worked in Agile methodology.
- Managed Change control implementation and coordinating daily, monthly releases.
Environment: Informatica PowerCenter 10.0.1/9.6.1, Oracle 10g/9i, DB2, Flat File, XML, Netezza, Teradata, Tableau, Cognos, Toad, Hadoop, Agile, DVO, SQL Assistant, PuTTY, UNIX, Windows.
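The initial/delta load split described in this role can be illustrated as a key-and-hash comparison between a source snapshot and the target: unknown keys become inserts, known keys with a changed row hash become updates. In the project the change capture itself was handled by PowerExchange CDC; this simplified sketch only illustrates the idea, and all names are hypothetical:

```python
import hashlib

# Simplified delta-detection sketch: classify source rows as inserts or updates
# by comparing a hash of the non-key columns with what the target already holds.

def row_hash(cols):
    """Hash the non-key column values of a row for cheap change comparison."""
    return hashlib.md5("|".join(str(v) for v in cols).encode()).hexdigest()

def detect_deltas(source, target):
    """source/target: dicts of key -> tuple of non-key column values."""
    inserts, updates = [], []
    for key, cols in source.items():
        if key not in target:
            inserts.append(key)                      # brand-new key
        elif row_hash(cols) != row_hash(target[key]):
            updates.append(key)                      # existing key, changed data
    return inserts, updates

target = {1: ("Alice", "VA"), 2: ("Bob", "TX")}
source = {1: ("Alice", "VA"), 2: ("Bob", "UT"), 3: ("Carol", "KY")}
ins, upd = detect_deltas(source, target)
print(ins, upd)  # [3] [2]
```

An initial load is the degenerate case where the target is empty and every source key lands in the insert bucket.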
Confidential - San Antonio, TX
- Worked with Business Users in requirements gathering and understanding requirements Specification documents.
- Created Technical System Design documents and low-level design documents.
- Worked extensively on Informatica client tools such as Designer, Workflow Manager and Workflow Monitor.
- Worked on Informatica transformations such as Source Qualifier, Filter, Update Strategy, Union, Sorter, Expression, Aggregator, Joiner, Connected & Unconnected Lookup transformation, Sequence Generator, Transaction Control etc.
- Developed Informatica mappings/sessions/workflows - Extensively used Dynamic/Persistent Lookup cache techniques for minimizing the load window.
- Developed re-usable transformations, Mapplets following the Informatica best practices.
- Involved in designing ETL data load strategy for SCD type 2, Dimensions and Fact tables, ETL reject handling, ETL restart recovery logic.
- Extracted source data from flat files, XML and Oracle and loaded it into a DB2 relational warehouse.
- Involved in client interaction, analyzing issues with the existing requirements, proposing solutions and implementing the same.
- Involved in performance tuning of SQL and Informatica mappings/sessions using session partitioning techniques.
- Prepared migration documents for migrating Informatica objects (mappings/sessions/worklets/workflows) from Dev to QA to Production environments.
- Created Unit test case templates (standardized formats) and experienced in doing Unit Testing.
- Involved in doing the effort analysis for new work requests/enhancements.
Environment: Informatica PowerCenter 9.1.1/9.5.1, Oracle 11g, TOAD, SQL Server 2005, DB2, Flat File, XML, UNIX, Windows, Erwin, Agile.
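The dynamic lookup caching used in this role to minimize the load window amounts to probing the target at most once per key and remembering the answer, with newly arrived keys added to the cache as rows flow through. A rough sketch of that behavior; the probe function is a hypothetical stand-in for a real database lookup, not Informatica's actual implementation:

```python
# Rough sketch of a dynamic lookup cache: query the target once per key,
# remember the answer, and insert new keys into the cache as rows arrive,
# mirroring how a dynamic Lookup transformation avoids repeated target hits.

class DynamicLookupCache:
    def __init__(self, probe_target):
        self._probe = probe_target   # callable: key -> row tuple or None
        self._cache = {}
        self.probes = 0              # count of real target lookups performed

    def lookup_or_insert(self, key, new_row):
        if key not in self._cache:
            self.probes += 1
            found = self._probe(key)
            # Cache the target row if it exists, else cache the incoming row
            # so later occurrences of the same key hit the cache, not the DB.
            self._cache[key] = found if found is not None else new_row
        return self._cache[key]

target_rows = {"A": ("A", 1)}
cache = DynamicLookupCache(lambda k: target_rows.get(k))
cache.lookup_or_insert("A", ("A", 99))   # first sight of A: one target probe
cache.lookup_or_insert("A", ("A", 99))   # served from cache, no probe
cache.lookup_or_insert("B", ("B", 2))    # miss in target: cached as new row
print(cache.probes)  # 2
```

A persistent cache extends the same idea across runs by saving the cache to disk, which is what shortens the load window for repeated daily loads.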
Sr. ETL Informatica Developer
Confidential - Covington, KY
- Involved in data transfer from OLTP systems that formed the extraction sources.
- Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
- Analyzed the sources, transformed the data, mapped the data and loaded the data into targets using Power Center Designer.
- Designed and developed Oracle PL/SQL Procedures.
- Developed UNIX shell scripts for data import/export, data conversions and data cleansing.
- Participated in Modelling review sessions.
- Experienced in implementing CDC.
- Experienced in implementing SCD Type 1/Type 2.
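The data-cleansing step in the conversion scripts above typically normalizes whitespace and maps common null markers before load. A minimal sketch; the specific cleansing rules are illustrative, not the actual project rules:

```python
# Minimal data-cleansing sketch: trim whitespace and map common null markers
# to None before loading. The marker set and rules here are illustrative.

NULL_MARKERS = {"", "NULL", "N/A", "-"}

def cleanse(value):
    """Normalize a single field: strip whitespace, convert null markers to None."""
    if value is None:
        return None
    v = value.strip()
    if v.upper() in NULL_MARKERS:
        return None
    return v

def cleanse_row(row):
    return [cleanse(v) for v in row]

print(cleanse_row(["  Alice ", "N/A", "ky", None]))  # ['Alice', None, 'ky', None]
```

Applying the same normalization in staging keeps downstream mappings free of per-source special cases for empty and placeholder values.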