- Over 7 years of IT experience with Informatica PowerCenter across all phases of Analysis, Design, Development, Implementation, and Support of data warehousing applications using Informatica PowerCenter 9.6.1/9.5.1/8.6/8.5/7.5.
- Extensive experience in data warehouse applications including Informatica, Oracle, Teradata, DB2, and MS SQL Server on Windows, IBM mainframe, and UNIX platforms.
- Expertise in building Teradata BTEQ scripts and utilizing Teradata utilities such as TPT, FastLoad, FastExport, and MLOAD (MultiLoad) to meet various business requirements.
- Experience in creating complex mappings using various transformations and developing Extraction, Transformation, and Loading (ETL) strategies using Informatica 9.6.1/8.x.
- Data Processing experience in designing and implementing data mart applications, mainly transformation processes using Informatica PowerCenter.
- Strong knowledge of data warehousing concepts, dimensional Star and Snowflake schemas, and Fact and Dimension tables.
- Extensively worked on Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
- Experience in documenting High-Level and Low-Level Designs, Source to Target Mappings (STM), Macro and Micro documents, unit test plans, unit test cases, and Code Migration Reports.
- Worked with Business analysts and the DBA for requirements gathering, business analysis, designing and documentation of the data warehouse.
- Proficient in using the Informatica Workflow Manager and Workflow Monitor to create, schedule, and control workflows, tasks, and sessions.
- Experienced in Repository Configuration/using Transformations, creating Mappings, Mapplets, Sessions, Worklets, Workflows, Processing tasks using Informatica Designer / Workflow Manager to move data from multiple source systems into targets.
- Worked in Agile methodology throughout the SDLC (Software Development Life Cycle) and participated in Agile Scrum meetings.
- Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages and Triggers.
- Experience in UNIX Operating System.
- Proficient in understanding business processes / requirements and translating them into technical requirements.
- Excellent communication skills; technically proficient and result-oriented, with strong problem-solving skills and the ability to work independently.
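The Teradata BTEQ work cited above typically takes the form of a shell wrapper that generates and runs a batch script. A minimal sketch follows; the logon string, schema, table, and file names are illustrative placeholders, not from any real system.

```shell
#!/bin/sh
# Sketch of a BTEQ load wrapper. Credentials and object names below are
# hypothetical; a real job would source them from a secured config file.
BTEQ_SCRIPT=load_customer.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;

DELETE FROM stg.customer_stg;
.IF ERRORCODE <> 0 THEN .QUIT 1;

INSERT INTO stg.customer_stg (cust_id, cust_name, load_dt)
SELECT cust_id, cust_name, CURRENT_DATE
FROM   src.customer_src;
.IF ERRORCODE <> 0 THEN .QUIT 1;

.LOGOFF;
.QUIT 0;
EOF

# Invoke the BTEQ client only where it is actually installed
if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTEQ_SCRIPT"
fi
```

The `.IF ERRORCODE` checks after each statement let a scheduler such as Autosys detect a failed step from the script's exit code.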
Data Warehousing ETL Tools: Informatica PowerCenter 9.6.1/9.5.1/8.x/7.x, Informatica PowerAnalyzer/PowerMart, Crystal Reports
Operating Systems: Windows XP/Vista/7/8, UNIX, Linux, IBM Mainframes
Databases: Oracle 11g/10g/9i/8.0, MySQL, SQL Server 2008/2005, Teradata
Programming: SQL, PL/SQL, SQL*Loader, UNIX Shell Scripting, C, Java, HTML, XML
Dimensional Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Erwin.
Reporting Tools: SSRS, SSAS, Cognos, SAP BO
Tools: TOAD, SQL Developer, SQL*Plus, MS Office, Microsoft Visio, SharePoint, PuTTY, WinSCP
Confidential, Nashville, TN
ETL Dev Lead
- Gathered and analyzed business requirements in discussions with Business Analysts and prepared business rules.
- Designed and developed complex mappings using Lookup, Expression, Sequence Generator, Update Strategy, Aggregator, Router, and Stored Procedure transformations to implement complex logic.
- Developed mappings using Informatica to load the data from sources such as Relational tables, Flat files, Oracle tables into the target Data warehouse.
- Created and maintained load jobs for the EDW using the Teradata utilities BTEQ, FastLoad, and MultiLoad.
- Developed mappings, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
- Experienced in methodologies like SDLC and Agile methodologies such as SCRUM.
- Actively contributed review comments in user story review meetings within an Agile Scrum environment.
- Extensively worked with transformations like Lookup, Expression, Router, Joiner, Update Strategy, Filter, and Aggregate.
- Extensively worked on SQL override in Source Qualifier Transformation for better performance.
- Hands on experience using query tools like TOAD, SQL Developer, PLSQL developer, Teradata SQL.
- Excellent knowledge of ETL tools such as Informatica and SAP BODS, including configuring the various connections needed to load and extract data to and from Teradata efficiently.
- Designed and developed stored procedures using PL/SQL and tuned SQL queries for better performance.
- Implemented slowly changing dimension (SCD Type 1 & 2) in various mappings.
- Used Incremental Aggregation technique to load data into aggregation tables for improved performance.
- Created and used reusable Transformation, Mapplets using Informatica PowerCenter.
- Created sessions and workflows to schedule nightly loads and process data from all sources.
- Worked on different tasks in Workflows like sessions, event raise, event wait, decision, e-mail, command, Assignment, Timer and scheduling of the workflow.
- Experience in writing UNIX and automation of the ETL processes using UNIX shell scripting.
- Prepared ETL standards, Naming Conventions and wrote ETL flow documentation.
- Worked on Autosys to run Informatica jobs in parallel.
Environment: Informatica PowerCenter 9.5/9.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), PowerMart, Cognos, SSRS, Oracle 11g/10g, Teradata, SQL, PL/SQL, UNIX, Erwin, PuTTY, WinSCP, QC (Quality Center), Windows XP, Autosys.
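The SCD Type 2 handling mentioned in this role boils down to two statements: expire the current row when its attributes change, then insert the new version. A hedged sketch in plain SQL, written out as a script file a load step could execute; `dim_customer`, `stg_customer`, and their columns are hypothetical names for illustration only.

```shell
#!/bin/sh
# Write an SCD Type 2 maintenance script to a file. Table and column
# names are illustrative; real jobs would parameterize them.
cat > scd2_customer.sql <<'EOF'
-- Step 1: close out current rows whose attributes changed in staging
UPDATE dim_customer
SET    eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.cust_id   = dim_customer.cust_id
               AND    s.cust_name <> dim_customer.cust_name);

-- Step 2: insert a new current version for changed or brand-new keys
INSERT INTO dim_customer
       (cust_id, cust_name, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT   JOIN dim_customer d
       ON  d.cust_id = s.cust_id
       AND d.current_flag = 'Y'
WHERE  d.cust_id IS NULL;
EOF
```

In PowerCenter the same split is implemented with an Update Strategy transformation routing DD_UPDATE and DD_INSERT rows; the SQL above is just the equivalent set-based view of that logic.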
Confidential, Bethesda, MD
- Worked closely with business analyst to Analyze and understand the business requirements.
- Performed data analysis for requirements and provided source-to-target mapping rule documents.
- Designed and developed complex Aggregator, Joiner, and Lookup transformations to generate and consolidate (fact and summary) data using the Informatica PowerCenter tool.
- Used the Slowly Changing Dimensions (SCD type 2) to update the data in the target dimension tables.
- Served as ETL Dev Lead for the project.
- Involved in creating Fact and Dimension tables using a Star schema, and used various IDQ transformations such as Filter, Sorter, and Labeler.
- Created and maintained load jobs for the EDW using the Teradata utilities BTEQ, FastLoad, and MultiLoad, and scheduled Teradata SQL and BTEQ scripts to run daily, weekly, and monthly.
- Created sessions, database connections and batches using Informatica Server Manager/Workflow Manager.
- Extensively used the Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic implemented in the mappings.
- Coordinated all ETL (Extract, Transformation and loading) activities and enhancements using Agile Methodology.
- Extensively used the Informatica Debugger to find problems in mappings; also involved in troubleshooting and rectifying bugs.
- Attended stand up meeting, user story review meetings and provided feedback to the Scrum Master.
- Optimized mappings, sessions/tasks, source, and target databases as part of the performance tuning.
- Involved in unit testing, functional testing, and regression testing.
- Configured the server and email variables using Informatica Server and Workflow Manager.
- Experienced in writing complex SQL queries to verify data from source to target.
- Extensively used lookup cache types such as static, dynamic, and persistent caches when creating sessions and tasks.
- Used the XML Generator transformation, which accepts data from multiple ports and writes XML through a single output port.
- Designed processes to extract, transform, and load (ETL) data to the Data Mart.
- Involved in all phases of the software development life cycle for this project, including requirement gathering, analysis, technical design, development, testing, and user acceptance testing.
Environment: Informatica PowerCenter 8.6, Oracle 10g, Toad 9.5, UNIX, XML, PL/SQL, Erwin 4.0, PuTTY, WinSCP, Windows XP/Vista, Quality Center.
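Source-to-target verification of the kind described in this role is often done outside the database as well: export both sides to sorted delimited files and compare them. A small self-contained sketch; the sample extracts below stand in for real exports.

```shell
#!/bin/sh
# Reconcile a source extract against a target extract. The two sample
# files are placeholders for real database exports.
printf '1|ACME\n2|GLOBEX\n3|INITECH\n' > source_extract.dat
printf '1|ACME\n2|GLOBEX\n'            > target_extract.dat

sort source_extract.dat > src.sorted
sort target_extract.dat > tgt.sorted

# comm -23 keeps lines unique to the first file: rows the load dropped
comm -23 src.sorted tgt.sorted > missing_in_target.dat

echo "rows missing in target: $(wc -l < missing_in_target.dat)"
```

The SQL equivalent is a `SELECT ... MINUS SELECT ...` between source and target tables; the file-based variant is useful when the two sides live in different databases.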
Confidential, Cypress, CA
- Worked with Informatica power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Extensively worked with Informatica transformations such as Source Qualifier, Rank, Router, Filter, Joiner, Lookup, Aggregator, Union, and Sorter.
- Extracted data from different source databases; created a staging area to cleanse and validate the data.
- Analyzed business requirements provided by Business Analysts for code changes and provided feedback per those requirements.
- Created complex Aggregator, Expression, Joiner, Filter, Router, Lookup, and Update Strategy transformations.
- Designed and created mappings, defined workflows, tasks, monitored sessions, exported and imported mappings and workflows, recovery and backups.
- Monitored and tracked the applications according to the schedule run in Autosys.
- Designed target-table population for both one-time and incremental loads.
- Extracted, transformed, and loaded data from flat-file sources to targets using transformations in the mappings.
- Wrote shell scripts and control files to load data into staging tables and then into Oracle base tables using SQL*Loader.
- Applied performance-tuning logic to sources, targets, mappings, and sessions for maximum effectiveness and performance.
- Created batches and sessions to schedule loads at the required frequency using the PowerCenter Server Manager.
- Completed documentation in relation to detailed work plan, mapping documents, high-level and low-level data models.
Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, Oracle SQL Developer, Windows XP/Vista
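The SQL*Loader staging loads mentioned above pair a control file with an `sqlldr` invocation. A minimal sketch; the schema, table, column, and file names are hypothetical, and `sqlldr` is only called if the Oracle client is present.

```shell
#!/bin/sh
# Generate a SQL*Loader control file for a pipe-delimited staging load.
# All object names are illustrative placeholders.
cat > load_stg.ctl <<'EOF'
LOAD DATA
INFILE 'customer.dat'
BADFILE 'customer.bad'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
( cust_id   INTEGER EXTERNAL,
  cust_name CHAR,
  load_dt   SYSDATE )
EOF

# Run the load only where the Oracle client tools exist
if command -v sqlldr >/dev/null 2>&1; then
    sqlldr userid=etl_user/etl_password control=load_stg.ctl log=load_stg.log
fi
```

Rejected rows land in the `BADFILE`, which makes it easy for a wrapper script to fail the job when the bad file is non-empty.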
Confidential, NY, NY
Informatica Power Center Developer
- Analyzed source and target systems for loading data into CDW and to create external data feeds.
- Involved in all phases of SDLC design and development.
- Involved in gap analysis and STM documentation to map flat file fields with relational table data in CDW.
- Developed mappings, mapplets, re-usable transformations to load external data into CDW.
- Extracted data from CDW to create delimited and fixed width flat file data feeds that go out to external sources.
- Analyzed Design documents and developed ETL requirement specs.
- Created re-usable objects and shortcuts for various commonly used flat file and relational sources.
- Developed a shell script to append a date and time stamp to output XML files, remove empty delta files, and FTP the output XML files to different servers.
- Validated logical data model relationships and entities; determined data lineage by including all associated systems in the data profile.
- Excellent data profiling experience; used data polling to validate data patterns and formats.
- Integrated data into CDW by sourcing it from different sources like Oracle, Flat Files and Mainframes (DB2 and VSAM) using Power Exchange.
- Performed technical analysis, writing and reviewing technical designs.
- Created UNIX scripts to deal with flat file data like merging delta files with whole files and to concatenate header, detailed and trailer parts of the files.
- Developed mappings that load data into Teradata tables with SAP definitions as sources.
- Created mappings to read parameterized data from tables to create parameter files.
- Used the XML Source Qualifier, which is used only with an XML source definition, to represent the data elements that the Informatica Server reads when it executes a session with XML sources.
- Used the XML Parser transformation to extract XML data from messaging systems.
- Used ODI to perform data integration with ELT processing: data is extracted from multiple sources, sent through several transformation processes, and loaded into a final destination target.
- Used SAS for data entry, retrieval, management, report writing, and statistical analysis.
- Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load (ETL) data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
- Used Message Broker to translate messages between the interfaces.
Environment: Informatica PowerCenter 7.5, Power Exchange, Windows, Oracle, Toad for Oracle, UNIX, workflow scheduler, Perl.
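The flat-file post-processing described in this role (stamping output XML files, dropping empty delta files, then FTPing them out) can be sketched as below. The output directory, file names, and the omitted FTP host are placeholders.

```shell
#!/bin/sh
# Sketch: stamp output XML files with a date-time suffix and remove
# empty delta files. Sample files stand in for real session output.
OUT_DIR=./outbound
mkdir -p "$OUT_DIR"
printf '<root/>\n' > "$OUT_DIR/feed_a.xml"   # non-empty delta file
: > "$OUT_DIR/feed_b.xml"                    # empty delta, to be removed

STAMP=$(date +%Y%m%d_%H%M%S)
for f in "$OUT_DIR"/*.xml; do
    if [ ! -s "$f" ]; then
        rm -f "$f"                           # drop empty delta files
        continue
    fi
    base=$(basename "$f" .xml)
    mv "$f" "$OUT_DIR/${base}_${STAMP}.xml"  # append the timestamp
done

# The transfer step is omitted here; a real job would follow with
# something like: ftp -n remote.host < ftp_commands.txt
```

Stamping before transfer keeps each day's feed distinct on the receiving server and makes reruns traceable.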
Confidential, Los Angeles, CA
Informatica Power Center Developer
- Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformation Developer.
- Implemented two kinds of data loading processes (daily and weekly), depending on the frequency of the source data.
- Imported data from various sources, then transformed and loaded it into data warehouse targets using Informatica.
- Fixed invalid mappings; tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches, and target data.
- Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression, and Sequence Generator.
- Scheduled Sessions and Batches on the Informatica Server using Informatica Server Manager/Workflow Manager.
- Used UNIX cron jobs to schedule sessions and automate the update process in the Workflow Manager.
- Worked with pmcmd to interact with the Informatica Server from the command line and execute shell scripts.
- Worked with different sources such as Oracle, MS SQL Server and flat files.
- Knowledge of slowly changing dimension tables and fact tables.
- Writing documentation to describe program development, logic, coding, testing, changes and corrections.
- Optimizing the mappings by changing the logic to reduce run time.
Environment: Informatica Power Center 7.1, Oracle 9i, ERwin 4.0, SQL, PL/SQL, Windows NT/2000, HP Quality Center, UNIX, PuTTY, WinSCP.
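The cron-plus-pmcmd scheduling pattern used in this role can be sketched as a wrapper script that starts a workflow and propagates a non-zero exit code. The service, domain, folder, and workflow names are hypothetical placeholders; only the `pmcmd startworkflow` usage itself reflects the standard Informatica command line.

```shell
#!/bin/sh
# Generate a cron-friendly wrapper around pmcmd. All names below are
# illustrative; the password is expected in the PM_PASS environment
# variable rather than hard-coded.
cat > run_wf_daily_load.sh <<'EOF'
#!/bin/sh
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u etl_user -p "$PM_PASS" \
      -f DWH_FOLDER -wait wf_daily_load
RC=$?
if [ $RC -ne 0 ]; then
    echo "wf_daily_load failed with exit code $RC" >&2
    exit $RC
fi
EOF
chmod +x run_wf_daily_load.sh

# Example crontab entry: run the wrapper every day at 02:00
echo '0 2 * * * /opt/etl/bin/run_wf_daily_load.sh >> /var/log/etl/daily.log 2>&1' > cron_entry.txt
```

The `-wait` flag makes pmcmd block until the workflow finishes, so the exit code cron sees reflects the actual load result.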