- 8 years of experience in Information Technology as an Informatica Developer, with a strong background in ETL and data warehousing using Informatica PowerCenter 9.x/8.x/7.x.
- Good experience in Informatica Installation, Migration and Upgrade Process.
- Experience in using Informatica PowerCenter client tools such as Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Experience in integration of various data sources like SQL Server, Oracle, Flat Files, and XML files.
- Experience in developing XML/XSD/XSLT both as source XML files for Informatica and as input XML for web service calls.
- Proficient knowledge and hands-on experience in building data warehouses, data marts, data integration, operational data stores, and ETL processes.
- Good knowledge of Dimensional Data Modeling, ER Modeling, Star Schema/Snowflake Schema, FACT and Dimensions Tables, Physical and Logical Data Modeling.
- Expertise in the design and implementation of Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.
- Experienced in loading data, troubleshooting, debugging mappings, and performance tuning of Informatica objects (sources, targets, mappings, and sessions), and fine-tuned transformations to improve session performance.
- Database experience using Oracle 11g/10g/9i, Teradata, MS SQL Server 2008/2005/2000, and MS Access.
- Experience in UNIX Operating System and Shell scripting.
- Working knowledge of data warehouse techniques and practices, experience including ETL process, dimensional data modeling (Star Schema, Snow Flake Schema, FACT & Dimension Tables), OLTP and OLAP.
- Experience in the data mart life cycle; developed ETL procedures to load data from different sources into data marts and the data warehouse using Informatica PowerCenter.
- Used the Debugger in Informatica PowerCenter Designer to identify errors in mappings.
- Experience in writing complex SQL queries. Experience in performance tuning the HiveQL and Pig scripts.
- Experience in working with Oracle, Netezza databases. Experience working on Hadoop using Hive database (HUE). Experience in integration of data sources like Oracle 11G and Flat Files.
- Created views in Hive to load data into the Hive and Netezza databases.
- Experience in using Informatica features such as pushdown optimization and partitioning, and implemented Slowly Changing Dimension Type 1 and Type 2 methodology to preserve the full history of account and transaction information.
- Excellent skills in fine tuning the ETL mappings in Informatica.
- Extensive experience using database tools such as SQL*Plus, SQL Developer, and TOAD, as well as the Autosys scheduler.
- Built effective working relationships with client teams to understand support requirements and manage client expectations.
- Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team.
Data Warehousing/ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x (Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplet, Transformation, Sessions, Workflow Manager, Workflow, Task, Commands, Worklet, Transaction Control, Constraint-Based Loading, SCD I/II), DataFlux, Datamart, OLAP, ROLAP, MOLAP, OLTP.
Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, Fact, Dimensions), Entities, Attributes, Cardinality, ER Diagrams.
Databases: Oracle 11g/10g/9i/8i, Teradata, MS SQL Server 2008/2005/2000, MS Access, and DB2
Languages: SQL, PL/SQL, C, C++, Data Structures, UNIX Shell Script, Visual Basic
Web Technologies: XML, HTML, JavaScript
Tools: TOAD, SQL Developer, Autosys, Erwin
Operating Systems: Windows Server NT/2008/2003/XP/Vista/7, UNIX, MS-DOS, and Linux
Confidential, Minnetonka, MN
Sr. ETL Informatica Developer
- Developed internal and external interfaces to send data at regular intervals to data warehouse systems.
- Extensively used Power Center to design multiple mappings with embedded business logic.
- Involved in discussion of user and business requirements with business team.
- Performed data migration in different sites on regular basis.
- Involved in upgrade of Informatica from 9.1 to 9.5.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target tables efficiently.
- Attended the meetings with business integrators to discuss in-depth analysis of design level issues.
- Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
- Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and by using Parameter files.
- Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements, and extensively used Teradata utilities such as MultiLoad, FastLoad, TPT, BTEQ, and FastExport.
- Analyzed session log files on session failures to resolve errors in mapping or session configuration.
- Wrote various UNIX shell scripts to schedule data cleansing scripts and the loading process, and to automate the execution of maps.
- Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.
- Created mapplets and used them in different mappings.
- Worked on Flat Files and XML, DB2, Oracle as sources.
- Wrote PL/SQL procedures and functions and was involved in the change data capture (CDC) ETL process.
- Implemented Slowly Changing Dimension Type II for different Dimensions.
- Involved in the Informatica, Teradata, and Oracle upgrade process and tested the environment during the upgrade.
- Worked extensively with Informatica version control.
- Experience in using SVN as version control for migration.
- Wrote unit test scripts to test the developed interfaces.
- Managed enhancements and coordinated Informatica object changes with every release.
- Provided support for the production department in handling the data warehouse.
- Worked under Agile methodology and used the Rally tool to track tasks.
- Wrote thorough design documents, unit test documentation, and installation and configuration guides.
- Performed bulk data imports and created stored procedures, functions, views and queries.
Environment: Informatica PowerCenter 9.6.1/9.5.1, Teradata 14/12, Oracle 10g, DB2, XML, Flat Files, SQL Assistant, TOAD, PL/SQL, UNIX shell scripting, Cognos, SVN, Windows 7.
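The Slowly Changing Dimension Type 2 work described above can be sketched in a few lines; this is an illustrative Python version only (field names such as "key" and "attrs" are hypothetical, and the production implementation used Informatica mappings against Teradata):

```python
from datetime import date

# Minimal in-memory sketch of SCD Type 2 versioning: when a tracked
# attribute changes, the current row is expired and a new version is
# inserted, preserving full history. Field names are illustrative.

def apply_scd2(dim_rows, natural_key, new_attrs, load_date):
    """Expire the current version of a changed row and append a new one."""
    current = next((r for r in dim_rows
                    if r["key"] == natural_key and r["is_current"]), None)
    if current and current["attrs"] == new_attrs:
        return dim_rows  # no change: nothing to do
    if current:
        current["is_current"] = False    # expire the old version
        current["end_date"] = load_date
    dim_rows.append({"key": natural_key, "attrs": new_attrs,
                     "start_date": load_date, "end_date": None,
                     "is_current": True})
    return dim_rows

dim = []
apply_scd2(dim, "ACC-1", {"status": "open"}, date(2014, 1, 1))
apply_scd2(dim, "ACC-1", {"status": "closed"}, date(2014, 6, 1))
# dim now holds two versions: the expired "open" row and the current "closed" row
```

The same pattern maps directly onto an Update Strategy transformation: the expire step becomes an update of the current row, and the append becomes an insert of the new version.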
Confidential, San Francisco, CA
- Interaction with the business to get the requirements and analyze the scope
- Requirements elicitation and translation to technical specifications
- Coordinating with source teams to understand their data structure before sourcing the data
- Preparing high level and low level design documentation, which defines the overall system architecture along with business rules and process diagrams
- Proposing and reviewing task estimates and communicating to all stakeholders.
- Developing mappings and workflows to source data from multiple sources such as mainframe files, flat files, XML, and relational tables
- Wrote Python code to get the Inventory data from the SLIS server using BMS messages.
- Wrote Python code to load extracted data into the Sybase ASIQ database.
- Wrote Python code to get the Spectra data from the CRM server using BMS messages and generated CSV files for each datatype.
- Worked on Non Cash Collateral Server and Client design, architecture and implementation using Aladdin technologies.
- Involved in the successful HAL migration of the Sapphire application, with database migration from Solaris to Linux.
- Designing incremental load process to load data into staging tables
- Performing root cause analysis on audit queries and JIRA tickets and resolving all production issues
- Creating autosys scripts to schedule jobs and add dependency between different jobs
- Creating and maintaining the shell scripts and parameter files in UNIX for the proper execution of Informatica workflows in different environments
- Creating reference guides for commonly occurring errors stating the reason and mitigation steps
- Unit testing to check whether the data loads into target are accurate
- Working with the reporting team and providing appropriate data into the reporting tables used for reports
- Worked with SAP PowerDesigner for data modeling on the Non Cash Collateral, Inventory, and Prism filling lots projects.
- Created reports using Webi designer and Crystal designer for customized and ad-hoc queries.
- Supported the testing team in running their test cases and helped them understand the business requirements behind the implemented stories.
- Worked on creating reports using Tableau to replicate Sapphire BI reports in ADW Tableau
Environment: Informatica Power Centre, Sybase IQ, Unix, Autosys, Python, JIRA, GIT, Tableau, SAP BI
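The per-datatype CSV generation mentioned above can be sketched as follows; a minimal illustration, with a hypothetical record layout and in-memory buffers standing in for the real extract files:

```python
import csv
import io
from collections import defaultdict

# Sketch of splitting extracted records into one CSV per datatype, as in
# the Spectra feed handling described above. The "type"/"id"/"qty" layout
# is hypothetical; io.StringIO stands in for real output files.

def write_csv_per_type(records, fieldnames):
    """Group records by their 'type' field and render one CSV per group."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["type"]].append(rec)
    outputs = {}
    for dtype, rows in groups.items():
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
        outputs[dtype] = buf.getvalue()
    return outputs

records = [
    {"type": "position", "id": "P1", "qty": "100"},
    {"type": "trade", "id": "T9", "qty": "25"},
    {"type": "position", "id": "P2", "qty": "40"},
]
csvs = write_csv_per_type(records, ["type", "id", "qty"])
# csvs holds one CSV string per datatype: "position" and "trade"
```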
Confidential, Houston, TX
- Gathered user Requirements and designed Source to Target data load specifications based on business rules.
- Used Informatica PowerCenter 9.0.1 for extraction, transformation, and loading (ETL) of data into the data mart.
- Participated in the review meetings with functional team to signoff the Technical Design document.
- Involved in Design, Analysis, Implementation, Testing and support of ETL processes.
- Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica Power Center.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
- Worked extensively with the connected lookup Transformations using dynamic cache.
- Worked with complex mappings having an average of 15 transformations.
- Coded PL/SQL stored procedures and successfully used them in the mappings.
- Coded UNIX scripts to capture data from different relational systems into flat files for use as ETL source files, and to schedule the automatic execution of workflows.
- Scheduled jobs using the Informatica Scheduler and Jobtrac.
- Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
- Monitored Workflows and Sessions using Workflow Monitor.
- Performed Unit testing, Integration testing and System testing of Informatica mappings.
- Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning at various levels like mapping level, session level, and database level.
- Provided production support by monitoring the processes running daily.
- Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
- Coordinating with the Offshore team and directly interacting with the client for clarifications & resolutions
- Introduced and created many project related documents for future use/reference.
- Designed and developed ETL Mappings to extract data from Flat files and Oracle to load the data into the target database.
- Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.
- Built complex reports using SQL scripts.
- Created complex calculations, various prompts, conditional formatting and conditional blocking etc., accordingly.
- Created complex mappings to load the data mart and monitored them, making extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence Generator transformations.
- Ran the workflows on a daily and weekly basis using workflow monitor.
Environment: Informatica PowerCenter 9.5/9.0.1/8.6.1, Oracle 11g/9i, Teradata, SQL, PL/SQL, UNIX, Informatica Scheduler, SQL*Loader, SQL Developer, Framework Manager, Transformer, TOAD, Windows Server 2008.
Confidential, Austin, TX
ETL Informatica Developer
- Participated in all phases of system development life cycle from requirements gathering to deployment of the finished system into production followed by maintenance and knowledge transfer tasks.
- Gathered requirements by analyzing source systems and identification of business rules through regular requirements gathering sessions with business users and other support teams for various OLTP and OLAP systems.
- Analyzed business requirements and worked closely with various application teams and business teams to develop ETL procedures that are consistent across all applications and system.
- Wrote Informatica ETL design documents, established ETL coding standards, and performed Informatica mapping reviews.
- Extensively worked on Power Center Client Tools like Repository Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Extensively worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Analyzed the source data coming from different sources (Oracle, DB2, XML, QCARE, Flat files) and worked on developing ETL mappings.
- Good experience in the installation of Informatica PowerExchange.
- Developed complex Informatica Mappings, reusable Mapplets and Transformations for different types of tests in research studies on daily and monthly basis.
- Implemented mapping-level optimization via the best route possible without compromising business requirements.
- Created Sessions, Reusable Worklets and workflows in Workflow Manager and Scheduled workflows and sessions at specified frequency.
- Worked on fixing invalid Mappings, testing of Stored Procedures and Functions, and Integration Testing of Informatica Sessions.
- Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level.
- Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
- Performed Data profiling for data quality purposes.
- Demonstrated accountability, including professional documentation and weekly status reports.
- Performed Quantitative and Qualitative Data Testing.
- Documented flowcharts for the ETL (Extract, Transform, and Load) data flow using Microsoft Visio, and created metadata documents for the reports and mappings developed, along with unit test scenario documentation.
- Used Power Center Designer to design the business process, grain of the data representation, dimensions and fact tables with measured facts.
- Extensively used transformations such as Lookup, Router, Filter, Joiner, Source Qualifier, Aggregator, and Update Strategy.
- Involved in performance tuning of sessions that work with large sets of data by tweaking block size, data cache size, sequence buffer length and target based commit intervals.
- Developed sessions and batches to move data at specific intervals and on demand using workflow manager.
- Participated in deployment planning and in deployment of the system to production.
- Facilitated business user smoke testing of the production system by setting up test data.
- Involved in production support duties including monitoring of nightly batches.
- Responsible for updating business stakeholders and OLTP/OLAP application support teams about the status of various ETL sessions and the impact of failed sessions on data availability.
Environment: Informatica PowerCenter 9.1/7.5.1, DB2, Oracle 10g, UNIX, Windows XP Pro, TOAD, Autosys, Thompson Advantage Suite, SQL*Plus.
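The data profiling done for data quality purposes on this engagement amounts to per-column statistics such as null and distinct counts; a minimal illustrative sketch in Python (the sample rows and column names are hypothetical, and real profiling ran against source tables):

```python
# Column-level profiling sketch: for each column, count nulls and
# distinct non-null values across a list of dict-shaped rows.

def profile(rows, columns):
    """Return {column: {"nulls": n, "distinct": d}} for the given rows."""
    stats = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {"nulls": len(values) - len(non_null),
                      "distinct": len(set(non_null))}
    return stats

rows = [
    {"id": 1, "state": "TX"},
    {"id": 2, "state": None},
    {"id": 3, "state": "TX"},
]
stats = profile(rows, ["id", "state"])
# stats["state"] -> {"nulls": 1, "distinct": 1}
```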
- Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.
- Developed mappings and mapplets using Informatica Designer to load data into ODS from various transactional source systems.
- Used Informatica Designer to import the sources, targets, create various transformations and mappings for extracting, transforming and loading operational data into the EDW from ODS.
- Used various transformations such as Expression, Filter, Rank, Source Qualifier, Joiner, Aggregator, and Normalizer in the mappings, and applied surrogate keys to target tables.
- Used the Informatica Server Manager to register and monitor the server, create and run the sessions/batches for loading the data using the earlier created mappings.
- Created mapplets and reusable transformations.
- Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
- Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.
- Migrated mappings from Development to Testing and performed Unit Testing and Integration Testing.
Environment: Informatica Power Center 8.x, Repository Manager, Designer, Oracle 8i, SQL, UNIX, Win 2000/NT.
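The surrogate key assignment mentioned above (done with a Sequence Generator transformation in the mappings) can be sketched as follows; an illustrative Python version with hypothetical names, not the production implementation:

```python
import itertools

# Sketch of surrogate key assignment on a target dimension table: each
# natural key receives a stable, system-generated integer key, mimicking
# a Sequence Generator transformation. Names are illustrative.

def load_with_surrogate_keys(source_rows, key_map, counter):
    """Assign an existing or newly generated surrogate key to each row."""
    target = []
    for row in source_rows:
        nk = row["natural_key"]
        if nk not in key_map:
            key_map[nk] = next(counter)  # new natural key: draw next value
        target.append({"sk": key_map[nk], **row})
    return target

counter = itertools.count(1)
key_map = {}
loaded = load_with_surrogate_keys(
    [{"natural_key": "CUST-A"}, {"natural_key": "CUST-B"},
     {"natural_key": "CUST-A"}], key_map, counter)
# CUST-A keeps surrogate key 1 on its second appearance
```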