Informatica Developer Resume
Fort Worth, TX
PROFESSIONAL SUMMARY:
- Over 8 years of experience in all phases of analysis, design, development, implementation and support of data warehousing applications using Informatica Power Center 9.x/8.x/7.x and Informatica Data Quality (IDQ).
- Databases: 6 years of experience using Oracle, DB2, MS SQL Server, Teradata (including Teradata SQL Assistant) and MySQL.
- SDLC: Solid experience across the full Software Development Life Cycle (SDLC), including business requirements gathering and analysis, system study, application design, development, testing, implementation, system maintenance and documentation.
- Experience working with Oracle 11g/10g, PL/SQL and performance tuning.
- Good working knowledge of both Agile and Waterfall software development methodologies.
- Worked in the financial and investments domains, with a proven ability to handle large volumes of confidential data.
- Experience with Teradata 15/14/13 utilities such as FastLoad, MultiLoad, TPump and Teradata Parallel Transporter; highly experienced in Teradata SQL programming, performance tuning and handling large data volumes.
- Experienced in SQL and PL/SQL programming: stored procedures, packages, functions, triggers, views and materialized views.
- Experience writing daily batch jobs and complex UNIX shell scripts to automate ETL processes.
- Proficient in implementing complex business rules through Informatica transformations, workflows/worklets and mappings/mapplets.
- Strong knowledge of RDBMS concepts, data modeling (facts and dimensions, star/snowflake schemas), data migration, data cleansing and ETL processes.
- Hands-on experience tuning mappings and identifying and resolving performance bottlenecks at the source, target, mapping and session levels.
- Experience in data warehousing and data modeling, including dimensional modeling, star schema, snowflake and Data Vault modeling, and fact and dimension tables.
- Working in Agile environments provided exposure to a range of test strategies, including functional, regression and integration testing.
- Experience working in onsite/offshore models, which built strong communication and rapid problem-solving skills.
- Advanced knowledge of Oracle PL/SQL programming: stored procedures and functions, indexes, views, materialized views, triggers, cursors and SQL query tuning.
- Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
- Hands-on experience identifying and resolving performance bottlenecks at various levels of the Extract, Transform and Load (ETL) process, with a strong understanding of OLTP and OLAP concepts.
- Created sessions and configured workflows to extract data from various sources, transform it and load it into the data warehouse.
- Hands-on experience resolving production issues in a 24/7 environment.
- Excellent problem-solving, analytical, technical, interpersonal and communication skills with strong leadership abilities; motivated, adaptive and quick to grasp new concepts. Diligent team player with the ability to take on new roles.
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 10.1/9.6/9.5/8.6/8.1/7.1, Informatica MDM, Metadata Manager, IDQ, SSIS, DataStage.
Reporting Tools: Business Objects XI R2/6.1/5.0, QlikView, OBIEE, Oracle Analytics.
Databases: Oracle 12c/11g/10g, MS SQL Server 2008/2005/2000, MS Access, IBM DB2, Teradata 14.0, Netezza.
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling.
DB Tools: TOAD, SQL Developer, SQL Assistant, Visio, ERWIN, Tivoli Job Scheduler, Control-M, Tidal.
Languages: C, C++, Java, SQL, PL/SQL, Unix Shell Scripting
Operating Systems: UNIX, Windows 7/Vista/Server 2003/XP/2000/9x/NT/DOS
PROFESSIONAL EXPERIENCE:
Confidential - Fort Worth, TX.
Informatica Developer
Responsibilities:
- Involved in gathering and analyzing business requirements, writing requirement specification documents and identifying data sources and targets.
- Translated Business Requirements into Informatica mappings to build Data Warehouse by using Informatica Designer, which populated the data into the target Star Schema .
- Used Informatica client tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer) to define source and target definitions and build the data flow from source systems to the data warehouse.
- Performed extraction, transformation and loading of data from RDBMS tables and Flat File sources into Oracle RDBMS in accordance with requirements and specifications.
- Worked in an Agile methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica Power Center/Data Quality (IDQ) and proposed ETL strategies based on requirements.
- Performed extensive data profiling using IDQ prior to staging to understand source data quality and identify data issues.
- Created Design Specification Documents including source to target mappings.
- Responsible for performance tuning the ETL process to optimize load and query performance.
- Extensively involved in coding business rules in PL/SQL using functions, cursors and stored procedures.
- Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
- Extensively used transformations such as Lookup, Update Strategy, Expression, Aggregator, Filter, Stored Procedure and Joiner.
- Wrote pre- and post-session SQL commands (DDL and DML) to drop and recreate indexes on the data warehouse.
- Developed Teradata load processes using shell scripting and Teradata utilities such as MultiLoad and FastLoad.
- Extensively used pmcmd commands and UNIX shell scripts to automate workflows and populate parameter files (see the sketch after this list).
- Assisted in writing the UNIX shell scripts that trigger the workflows in a defined order as part of the daily load into the warehouse.
- Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
- Extracted data from various source systems such as Oracle, SQL Server, XML and flat files and loaded it into the relational data warehouse and flat files.
- Involved in writing BTEQ scripts to validate and test sessions, verify data integrity between source and target databases and generate reports.
- Worked on requirements for upgrading the MDM version.
- Loaded data into the MDM landing tables using Power Center.
- Migrated code from Dev to Test to Prod environments; wrote techno-functional documentation and test cases to ensure a smooth project handover and adherence to the SDLC.
- Identified bottlenecks and improved overall session performance.
- Created dimension and fact tables for the data mart and implemented Slowly Changing Dimension (SCD) Type I and Type II loads.
- Scheduled Informatica sessions in Autosys to automate loads.
- Provided production support by monitoring daily processes.
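The pmcmd-driven automation referenced above followed roughly the pattern below. This is a minimal sketch only; the integration service, domain, folder, workflow and parameter names are placeholders, not the actual project values.

```sh
#!/bin/ksh
# Sketch: build a parameter file, then start a Power Center workflow with pmcmd.
# INT_SVC, Domain_DEV, DWH_DEV and wf_load_sales are placeholder names.
RUN_DT=$(date +%Y-%m-%d)
PARAM_FILE=/opt/infa/params/wf_load_sales.parm

# Populate the parameter file consumed by the workflow/session
cat > "$PARAM_FILE" <<EOF
[DWH_DEV.WF:wf_load_sales]
\$\$LOAD_DATE=$RUN_DT
\$\$SRC_DIR=/data/incoming
EOF

# Trigger the workflow and wait for completion; a non-zero exit signals failure
pmcmd startworkflow -sv INT_SVC -d Domain_DEV -u "$INFA_USER" -p "$INFA_PWD" \
     -f DWH_DEV -paramfile "$PARAM_FILE" -wait wf_load_sales
if [ $? -ne 0 ]; then
    echo "wf_load_sales failed on $RUN_DT" >&2
    exit 1
fi
```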
Environment: Informatica Power Center 10.x/9.x, IDQ, Informatica MDM, Erwin, Oracle 11g/10g, PL/SQL, SQL*Loader, TOAD, MS SQL Server 2005/2008, Autosys.
Confidential - Richardson, TX
Informatica ETL Developer
Responsibilities:
- Designed the source-to-target mappings that implement the business rules and data cleansing during the extract, transform and load process.
- Extracted data from different source systems such as Oracle, SQL Server, MS Access, DB2, Mainframes, XML and Flat Files.
- Worked on several transformations in Informatica, such as Filter, Joiner, Sequence Generator, Aggregator, Source Qualifier, Expression, Lookup (connected and unconnected), Router, Web Services, XML and Normalizer.
- Developed Informatica mappings with Pushdown Optimization (PDO) to load data from Staging to the EDW layer.
- Used Informatica IDQ to profile the source data and check its accuracy using the dashboard.
- Wrote SQL and PL/SQL stored procedures using TOAD; experienced with Teradata utilities including FastLoad, MultiLoad, BTEQ scripting, FastExport, SQL Assistant and TPump (see the sketch after this list).
- Built Logical Data Objects (LDO) and developed various mappings, mapplets and rules in Informatica Data Quality (IDQ) based on requirements to profile, validate and cleanse the data.
- Identified and eliminated duplicate datasets and performed column, primary key and foreign key profiling using IDQ.
- Responsible for Code Migration, Code Review, Test Plans, Test Scenarios, Test Cases as part of Unit/Integrations testing, UAT testing.
- Used Informatica Power Center to load data from different sources such as flat files, Oracle and Teradata into the Oracle data warehouse.
- Extensive data modeling experience using dimensional data modeling, star schema modeling, snowflake modeling, and fact and dimension tables.
- Involved in writing shell scripts for file transfers, file renaming and several other database scripts to be executed from UNIX.
- Extensively used SQL and PL/SQL Scripts.
- Used Deployment group to move Informatica objects from DEV to TEST and from TEST to QA/PROD.
- Provided support across multiple project environments (SIT, UAT, PROD).
- Involved in analysis and design of MDM application, topology and integration with different application modules.
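As an illustration of the Teradata utility work noted above, a minimal FastLoad sketch driven from a shell wrapper is shown below; the TDPID, credentials, staging table and file names are placeholders.

```sh
#!/bin/ksh
# Sketch: load a pipe-delimited flat file into an empty Teradata staging table
# with FastLoad. tdprod, ETL_USER and stg.customer_stg are placeholder names.
fastload <<'EOF'
SESSIONS 4;
ERRLIMIT 25;
LOGON tdprod/ETL_USER,etl_password;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(60)),
       cust_city (VARCHAR(40))
FILE = /data/incoming/customer.dat;
BEGIN LOADING stg.customer_stg
      ERRORFILES stg.customer_err1, stg.customer_err2;
INSERT INTO stg.customer_stg (cust_id, cust_name, cust_city)
VALUES (:cust_id, :cust_name, :cust_city);
END LOADING;
LOGOFF;
EOF
[ $? -eq 0 ] || { echo "FastLoad of stg.customer_stg failed" >&2; exit 1; }
```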
Environment: Informatica Power Center 9.5/9.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica IDQ, SQL Query Analyzer 8.0, Informatica MDM, Oracle 10g/11g, SQL Developer, Data Loader, SQL Server, BI Publisher, Erwin, Teradata, Unix Shell Scripting, PuTTY, Linux, Tidal (scheduler).
Confidential - Memphis, TN
Informatica ETL/IDQ Developer
Responsibilities:
- Participated in user meetings and gathered business requirements and specifications for the data warehouse design; translated user inputs into ETL design documents.
- Involved in creating Informatica mappings to extract data from flat files and load it into the staging area.
- Designed and created validation and external loading scripts such as MLOAD and FLOAD for the Teradata warehouse alongside the Informatica ETL tool (see the sketch after this list).
- Involved in error handling, performance tuning of mappings, Testing of Informatica Sessions, and the Target Data.
- Built Logical Data Objects (LDO) and developed various mappings, mapplets and rules in Informatica Data Quality (IDQ) based on requirements to profile, validate and cleanse the data.
- Used transformations such as Expression, Filter, Joiner, Aggregator, Lookup, Update Strategy, Sequence Generator and Router to load consistent data.
- Used Informatica Workflow Manager to create Workflows, database connections, sessions, and batches to run the mappings.
- Experienced in loading data into the data warehouse/data marts using Informatica and the Teradata MultiLoad, FastLoad and BTEQ utilities.
- Identified and eliminated duplicate datasets and performed column, primary key and foreign key profiling using IDQ.
- Set up sessions to schedule loads at the required frequency using Power Center Workflow Manager and pmcmd.
- Designed, developed, tested and implemented the jobs that extract, transform and load data into the enterprise data warehouse.
- Used IDQ to do data profiling of the master data. Used Informatica Analyst to get an overview of the accuracy of the data and percentage populated.
- Built exception handling mappings using TOAD and PL/SQL for data quality.
- Knowledge of developing stored procedures, functions, views, triggers and complex SQL queries using Oracle PL/SQL and SQL Server T-SQL.
- Expert in identifying and resolving the performance bottlenecks in sources, Targets, Mappings and Sessions.
- Worked on dimensional modeling to design and develop star schemas and identify fact and dimension tables, using ERWIN.
- Involved in Performance Tuning of ETL process at Source level, Target level, Mapping level and Session level.
- Created several Procedures, Functions, Triggers and Packages to implement the functionality in PL/SQL.
- Performed Unit Testing and Involved in tuning the Session and Workflows for better Performance.
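A minimal sketch of the MultiLoad-style external loading scripts mentioned above; the log table, target table, layout and file names are placeholders, and the real scripts carried the project-specific DML.

```sh
#!/bin/ksh
# Sketch: apply a delta file to a Teradata table with a MultiLoad upsert.
# tdprod, ETL_USER, stg.customer and the file path are placeholder names.
mload <<'EOF'
.LOGTABLE stg.customer_ml_log;
.LOGON tdprod/ETL_USER,etl_password;
.BEGIN IMPORT MLOAD TABLES stg.customer
    WORKTABLES stg.customer_wt
    ERRORTABLES stg.customer_et stg.customer_uv;
.LAYOUT cust_layout;
.FIELD cust_id   * VARCHAR(10);
.FIELD cust_name * VARCHAR(60);
.DML LABEL upsert_cust
    DO INSERT FOR MISSING UPDATE ROWS;
UPDATE stg.customer
   SET cust_name = :cust_name
 WHERE cust_id   = :cust_id;
INSERT INTO stg.customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.IMPORT INFILE /data/incoming/customer_delta.dat
    FORMAT VARTEXT '|'
    LAYOUT cust_layout
    APPLY upsert_cust;
.END MLOAD;
.LOGOFF;
EOF
[ $? -eq 0 ] || { echo "MultiLoad upsert of stg.customer failed" >&2; exit 1; }
```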
Environment: Informatica Power Center 9.x/8.x, Oracle 11g/10g, ERWIN 4.0, PL/SQL Developer, XML, UNIX, Windows, Teradata, Linux, Tivoli (scheduler).
Confidential - Palm Coast, FL
ETL Informatica Developer
Responsibilities:
- Used Informatica Power Center for Extraction, Transformation and Loading (ETL) of data in the data warehouse.
- Understood the project requirements given by the client and coordinated with the offshore team to translate technical specifications and business rules into Informatica.
- Worked on Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Repository Server Administration Console.
- Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer and Sorter, along with their transformation properties.
- Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
- Created various UNIX shell scripts to schedule data cleansing scripts and the loading process (see the sketch after this list).
- Performed performance tuning of the Informatica mappings using components such as parameter files, variables and dynamic cache.
- Extensively worked with the Teradata database using BTEQ scripts.
- Used Informatica Power Center for migrating data from various OLTP databases to the data mart.
- Worked on identifying bottlenecks in sources, targets and mappings to improve performance.
- Designed and developed Oracle PL/SQL scripts for Data Import/Export
- Created complex Informatica mappings to load the data mart and monitored them.
- The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Sequence generator.
- Involved in writing SQL stored procedures.
- Developed SQL, PL/SQL, stored procedures, triggers, cursors for implementing business rules and transformations.
- Worked with SQL Stored Procedures, Table Partitions and experienced in loading data into Data Warehouse/Data Marts using Informatica.
- Extracted data from different sources like Oracle, flat files, XML and SQL Server loaded into Data Warehouse.
- Worked extensively with different caches such as index cache, data cache and lookup cache (static, dynamic, persistent and shared).
- Involved in writing UNIX scripts and used them to automate the scheduling process
- Performed performance analysis and tuning across various source/target platforms such as SQL Server, along with Informatica ETL process tuning and system tuning.
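A minimal sketch of the shell-wrapped data cleansing step referenced above; the connect string, package name and log path are placeholders for what would have been project-specific objects.

```sh
#!/bin/ksh
# Sketch: run a nightly PL/SQL cleansing routine before the warehouse load.
# ORA_CONN (user/password@tns), stg_clean_pkg and the log path are placeholders.
LOG=/var/log/etl/cleanse_$(date +%Y%m%d).log

sqlplus -s "$ORA_CONN" <<EOF > "$LOG" 2>&1
WHENEVER SQLERROR EXIT SQL.SQLCODE
SET SERVEROUTPUT ON
-- Standardize and de-duplicate the staging data before the load starts
EXEC stg_clean_pkg.cleanse_customers(p_run_date => SYSDATE);
EXIT
EOF

if [ $? -ne 0 ]; then
    echo "Data cleansing failed, aborting tonight's load" >&2
    exit 1
fi
```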
Environment: Informatica Power Center 8.x, Oracle, PL/SQL, Toad, Control-M, SQL Server, Teradata, Unix, Linux.
Confidential - Detroit, MI
Informatica Developer
Responsibilities:
- Worked closely with the business analyst to understand the various source data.
- Extensively used XML files as source and designed and developed complex Informatica mappings using expressions, aggregators, filters, lookup and stored procedures to ensure movement of the data between various applications.
- Worked with different sources like Oracle, flat files, XML files, DB2, MS SQL Server .
- Extracted data from sources such as fixed-width and delimited flat files, transformed the data according to business requirements and loaded it into the target data mart.
- Used TOAD and SQL*Plus to write queries and interact with the Oracle database.
- Involved in writing UNIX scripts and used them to automate the scheduling process (see the sketch after this list).
- Worked on Informatica Power Center Designer - Source analyzer, Warehouse Designer, Mapping Designer and Transformation developer
- Identifying the bottlenecks of Informatica mappings.
- Extensively used Rank, Lookup, Aggregator, Sorter, filter, Update Strategy transformations and created Mapplets.
- Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.
- Developed mapping parameters and variables to support SQL override.
- Experienced with Teradata utility scripts such as BTEQ, FastLoad, MultiLoad and FastExport to load data from various source systems into Teradata.
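A minimal sketch of the file-arrival automation referenced above; the file path, polling interval, folder and workflow names are placeholders.

```sh
#!/bin/ksh
# Sketch: wait for the nightly source file to arrive, then start the load workflow.
# The file path, timeout, SALES_MART folder and wf_load_orders are placeholders.
SRC_FILE=/data/incoming/orders_$(date +%Y%m%d).dat
WAITED=0

# Poll for up to 60 minutes for the source extract to land
while [ ! -s "$SRC_FILE" ] && [ $WAITED -lt 3600 ]; do
    sleep 300
    WAITED=$((WAITED + 300))
done

if [ ! -s "$SRC_FILE" ]; then
    echo "Source file $SRC_FILE never arrived" >&2
    exit 1
fi

# File is present: kick off the Power Center workflow that loads the data mart
pmcmd startworkflow -sv INT_SVC -d Domain_PROD -u "$INFA_USER" -p "$INFA_PWD" \
     -f SALES_MART -wait wf_load_orders
```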
Environment: Informatica Power Center, Windows NT, Oracle, SQL, PL/SQL, TOAD, LINUX, UNIX, Teradata.