- Over 8 years of experience in Information Technology with a strong background in Database development and Data warehousing.
- Good experience in the design and implementation of Data Warehousing and Business Intelligence solutions using ETL tools like Informatica Power Center (Designer, Workflow Manager, Workflow Monitor and Repository Manager).
- Created mappings in Mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (Connected and Unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure and more.
- Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, Type 2 and Type 3), Change Data Capture, Dimensional Data Modeling, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and physical and logical data modeling.
- Experience in developing Transformations, Mapplets and Mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to target.
- Worked on the Informatica Data Quality 9.1 (IDQ) toolkit and performed data profiling, cleansing and matching, and imported data quality files as reference tables.
- Developed mappings in Informatica IDQ using transformations such as Labeler, Standardizer, Address Doctor, Expression and Sorter.
- Worked in different phases of the projects involving Requirements Gathering, Design, Development, Deployment, Testing and Maintaining.
- Good knowledge of the database architecture of OLTP and OLAP applications, and of data analysis.
- Experience in E-R modeling and developing database schemas such as Star and Snowflake schemas used in relational, dimensional and multidimensional modeling.
- Worked with a wide variety of sources like Relational Databases, Flat Files, Mainframes and XML files, and scheduling tools like Control-M, Autosys and the Informatica Scheduler.
- Strong experience working with different RDBMS like Oracle 11g/10g/9i/8i/8.0/7.x, Sybase, MS SQL Server 2008/2005/2000 and Teradata, and with SQL, PL/SQL, SQL*Plus, SQL*Loader, TOAD, Stored Procedures and Triggers.
- Strong skills and a clear understanding of requirements and of solutions to issues arising throughout the Software Development Life Cycle (SDLC).
- Hands on experience in PL/SQL (Stored Procedures, Functions, Packages, Triggers, Cursors, Indexes) and UNIX Shell scripting.
- Good knowledge on data quality measurement using IDQ and IDE.
- Expertise in doing Unit Testing, Integration Testing, System Testing and Data Validation for Developed Informatica Mappings.
- Proficient in integrating CRM data sources such as Salesforce and SAP using Power Exchange.
- Created frameworks such as an Import/Export utility, restartability, disabling/enabling indexes and updating statistics that enable simpler implementation and faster development of DW/BI solutions and products.
- Hands on experience working in LINUX, UNIX and Windows environments.
- Experience working in Production Support and migrating code from DEV to QA to Production.
- Experience creating various documentation such as RTM, BRD, HLD, TDD, Unit Test Cases, Deployment Plans and Production Turnover documents.
- Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.
ETL Tools: Informatica Power Center 9.5.x/9.1/8.6.x/8.5.x, Informatica Power Exchange, IDQ (IDE), DataStage, SSIS, Pentaho.
Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimension Data Modeling.
Databases: Oracle 11g/10g/9i/8.x, SQL Server 2008/2005/2000, IBM DB2, Teradata 13.1/V2R6/V2R5, Sybase, MS Access
Scheduling Tools: Control-M, Autosys, Informatica Scheduler.
Reporting Tools: Crystal Reports, Business Objects XI R2/XI 3.3, OBIEE 11g R1 (11.1.5).
Programming: SQL, PL/SQL, Transact SQL, HTML, DHTML, XML, C, C++, Shell, VBA
Operating Systems: Windows 7/XP/NT/95/98/2000, DOS, UNIX and LINUX
Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.
Confidential, Irving, TX
Environment: Informatica 9.5.x, DB2, Power Exchange 8.6.1, flat files, Passport, Harvest, Windows XP, UNIX and Control-M
- Worked on Informatica 9.5.1 Power Center tools: Designer, Repository Manager, Workflow Manager and Workflow Monitor.
- Used IDE tool to load the data from file to stage tables.
- Created column level Profiles and IDE mappings to analyze the data patterns and statistics for each column in the source.
- Designed ETL process using Informatica Tool to load from Sources to Targets through data Transformations.
- Used Informatica partitioning types such as Database and Pass-through partitioning for performance tuning.
- Involved in Reviews & Unit Testing of Mappings and Workflows.
- Involved in creating mapplets and worklets.
- Used UNIX shell scripts to FTP the source files from EDI servers using the CONFIG files.
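A minimal sketch of that pattern is shown below; the hostname, user, directories and file name are illustrative placeholders, not values from the actual project. The wrapper sources a CONFIG file so each EDI feed differs only by its `.cfg` file:

```shell
#!/bin/sh
# Hypothetical config file for one EDI feed (all values are made up).
cat > /tmp/edi_claims.cfg <<'EOF'
EDI_HOST=edi.example.com
EDI_USER=etl_batch
SRC_DIR=/outbound/claims
LANDING_DIR=/data/landing
EOF

# Source the config so the same script serves every feed.
. /tmp/edi_claims.cfg

SRC_FILE="claims_$(date +%Y%m%d).dat"

# The real script would invoke sftp/ftp here; echoed as a dry run.
echo "sftp ${EDI_USER}@${EDI_HOST}:${SRC_DIR}/${SRC_FILE} ${LANDING_DIR}/"
```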
- Worked on various workflow tasks such as Command, Event-Wait and Event-Raise.
- Created various parameter files and workflow variables for flexible run of the workflows.
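A typical parameter file for such a workflow follows the fragment below; the folder, workflow, connection and parameter names are illustrative only:

```
[FolderName.WF:wf_load_claims]
$DBConnection_Src=ORA_SRC_DEV
$DBConnection_Tgt=DB2_TGT_DEV
$$LoadDate=2013-06-30
$$SourceFileName=claims_20130630.dat
```

The `[folder.WF:workflow]` header scopes the values, `$` entries override session connections, and `$$` entries feed mapping parameters, so the same workflow runs unchanged across environments.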
- Modified existing mappings for enhancements of new business requirements.
- Performed unit testing and created documents with the test cases using SQL scripts.
- Used various transformations like the Source Qualifier, Joiner, Expression, Filter, Lookup, Sequence Generator and Router to develop complex mappings using the Power Center Designer.
- Worked on various types of flat files i.e. fixed width and delimited.
- Copied/Exported/Imported mappings/sessions/worklets/workflows from the Development to the Test repository and promoted them to Production.
- Worked as a fully contributing team member under broad guidance, with independent planning and execution responsibilities.
- Involved in UAT to obtain client Go-Live approval.
- Created Deployment Groups and Harvest packages to move the code to production.
- Used Passport to convert the mainframe files sent from the client into flat files.
Confidential, Columbus, OH
Environment: Informatica 9.5.x, Oracle 11g, MS SQL Server 2008, Power Exchange 8.6.1, flat files, Windows XP, UNIX and Control-M
- Analyzed the business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica Power Center.
- Worked extensively on various transformations like Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure and Sequence Generator, and created mappings.
- Developed ETL Informatica mappings to load data into the staging area; extracted from flat files and databases and loaded into the Oracle 11g target database.
- Created workflows and worklets for the designed mappings.
- Implemented Slowly Changing Dimension logic to handle incremental loads for Dimension and Fact tables.
- Wrote Stored Procedures and Functions to perform data transformations and integrated them with Informatica programs and the existing applications.
- Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.
- Worked on SQL coding to override the generated SQL query in Informatica.
- Involved in unit testing of the validity of the data from different data sources.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions.
- Worked on parameterizing all variables and connections at all levels in Windows XP.
- Involved in migrating the ETL application from the development environment to the testing environment.
- Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.
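One common shape for such a validation is a set-difference query that should return zero rows after a clean load; the table and column names below are invented for illustration. Wrapped in a shell script, the query can be handed to `sqlplus` from a batch job:

```shell
#!/bin/sh
# Hypothetical reconciliation check: rows present in staging that never
# reached the current version of the target dimension. Names are illustrative.
VALIDATE_SQL=$(cat <<'EOF'
SELECT cust_id, cust_name
FROM   stg_customer
MINUS
SELECT cust_id, cust_name
FROM   dw_customer_dim
WHERE  current_flag = 'Y'
EOF
)

# A real run would pipe this to sqlplus: echo "$VALIDATE_SQL;" | sqlplus -s "$DB_CONN"
# Any rows returned flag records dropped or altered by the load.
echo "$VALIDATE_SQL"
```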
- Performed data conversion/data migration using Informatica Power Center.
- Involved in performance tuning for a better data migration process.
- Analyzed session log files to resolve errors in mappings, and identified bottlenecks and tuned them for optimal performance.
- Helped the company use the Informatica Virtual Data Machine to map data integration processes once and then deploy them to run on workload-specific platforms within the Teradata UDA to power many types of analytics, all within a single data architecture.
- Created UNIX shell scripts for Informatica pre/post-session operations.
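A post-session command of that kind typically archives the processed source file and stamps an audit log. The sketch below uses invented paths and file names and simulates the processed file, so it stands alone as an illustration:

```shell
#!/bin/sh
# Hypothetical post-session housekeeping. Paths and names are illustrative.
LANDING_DIR=/tmp/landing
ARCHIVE_DIR=/tmp/archive
mkdir -p "$LANDING_DIR" "$ARCHIVE_DIR"

# Simulate the source file the session just finished processing.
SRC_FILE="$LANDING_DIR/claims_20130630.dat"
: > "$SRC_FILE"

# Move, compress and log -- typical post-session steps.
mv "$SRC_FILE" "$ARCHIVE_DIR/"
gzip -f "$ARCHIVE_DIR/claims_20130630.dat"
echo "$(date '+%Y-%m-%d %H:%M:%S') archived claims_20130630.dat" >> "$ARCHIVE_DIR/audit.log"
```

Keeping the housekeeping in a script rather than in the session itself lets the same logic be reused as a pre-session check (for example, failing fast when the expected file is missing).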
- Automated the jobs using Control-M Scheduler.
- Created VBA macro scripts in Excel to interact with DB2 and produce flat files.
- Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
Confidential, Boston, MA
Informatica Prod Support Analyst
Environment: Informatica Power Center 9.1, Power Exchange, PL/SQL, Oracle 11g, SQL Server 2005/2000, Windows Server 2003, UNIX, IDQ (IDE), Autosys, Toad, Pentaho
- Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
- Involved in creating Logical and Physical Data Models and creating Star Schema models.
- Assisted in designing Star Schema to design dimension and fact tables.
- Involved in developing star schema model for target database using ERWIN Data modeling.
- Created Complex mappings using Unconnected and connected Lookups and Aggregate and Router transformations for populating target table in efficient manner.
- Created Mapplet and used them in different Mappings.
- Used the Sorter transformation and the dynamic Lookup cache.
- Created events and tasks in the workflows using Workflow Manager.
- Developed Informatica mappings and tuned them for better performance, using PL/SQL procedures/functions to build business rules for loading data.
- Used the data integration tool Pentaho to design ETL jobs in the process of building data warehouses and data marts.
- Created cubes using Pentaho Schema Workbench.
- Created Schema objects like Indexes, Views and Sequences.
- Worked on the Informatica Data Quality 9.1 (IDQ) toolkit and performed data profiling, cleansing and matching, and imported data quality files as reference tables.
- Used IDE to identify data duplication, assess the quality of data and generate reports.
- Provided daily production support for the scheduled jobs, supplying resolutions for failed jobs and fixing them.
- Created Stored Procedures, Functions and Indexes Using Oracle.
- Worked on batch processing and file transfer protocols.
- Prepared Batch Script to automate the ETL Load.
- Performed performance tuning of long-running workflows.
- Analyzed the enhancements coming from the individual Client for application and implemented the same.
- Creation of technical documents.
- Developed mappings for policy, claims dimension tables.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
- Simulated a real-time transaction feed by fabricating the transactions required for UAT and manually calculating expected results going into the testing process.
Confidential, Hartford, CT
Environment: Informatica Power Center 9.1, Informatica Power Exchange, Oracle 11g, PL/SQL, SQL Server, DB2, Teradata 13.10, Clear Case, Erwin, Business Objects Info view, Windows, UNIX
- Analyzed business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Involved in design phase of logical and physical data model using Erwin 4.0.
- Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files.
- Used most of the transformations such as the Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, update strategy and stored procedure.
- Developed mappings to load Fact and Dimension tables, SCD Type I and SCD Type II dimensions and Incremental loading and unit tested the mappings.
- Prepared the low-level technical design document and participated in the build/review of the BTEQ, FastExport, MultiLoad and FastLoad scripts; reviewed Unit Test Plans and System Test cases.
- Created new and enhanced existing stored procedure SQL used for semantic views and load procedures for materialized views.
- Created BTEQ scripts to extract data from EDW to the Business reporting layer.
- Developed BTEQ scripts for validation and testing of the sessions, data integrity between source and target databases and for report generation.
- Loaded data from various data sources into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad and FastLoad.
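A skeleton of such a BTEQ export script is sketched below; the logon details, database objects and output path are invented for illustration. In the actual environment the generated script would be submitted with `bteq < script.btq`:

```shell
#!/bin/sh
# Write a hypothetical BTEQ script that exports EDW rows to a flat file
# for the business reporting layer. All names are illustrative.
cat > /tmp/exp_policy.btq <<'EOF'
.LOGON tdprod/etl_batch,password;
.EXPORT REPORT FILE=/data/extracts/policy_summary.txt;
SELECT policy_id, policy_status, premium_amt
FROM   edw.policy_fact
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

echo "BTEQ script written: $(wc -l < /tmp/exp_policy.btq) lines"
```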
- Used the Teradata Administrator and Teradata Manager tools to monitor and control the systems.
- Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
- Involved in creating different types of reports including OLAP, Drill Down and Summary in BO.
- Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
- Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
Confidential, Memphis, TN
Environment: Informatica Power Center 8.6.1/8.1.1, Oracle 11g, TOAD 10.1 for Oracle, DB2, Flat Files, PL/SQL, OBIEE 11g, SAP, ERWIN, Windows 2000, UNIX, PERL scripting, Autosys
- Designed, developed and documented the ETL (Extract, Transformation and Load) strategy to populate the Data Warehouse from the various source systems.
- Worked on Informatica 8.6.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
- Involved in design and development of complex ETL mappings.
- Implemented partitioning and bulk loads for loading large volume of data.
- Based on the requirements, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in the mappings.
- Developed Mapplets, Worklets and Reusable Transformations for reusability.
- Identified performance bottlenecks and Involved in performance tuning of sources, targets, Mappings, transformations and sessions to optimize session performance.
- Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
- Extracted data from SAP sources using SAP Power Connect (SAP plugin).
- Performance tuning by session partitions, dynamic cache memory and index cache.
- Developed Informatica SCD Type-I and Type-II mappings; extensively used almost all of the Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, Mapplets and others.
- Implemented update strategies, incremental loads, Change Data Capture and Incremental Aggregation.
- Created Stored Procedures in PL/SQL.
- Created UNIX Shell scripts to automate the process.
- Developed Documentation for all the routines (Mappings, Sessions and Workflows).
- Involved in scheduling the workflows through Autosys Job scheduler using UNIX scripts.
- Played a key role in all the testing phases and responsible for production support as well.
Informatica ETL Prod Support
Environment: Informatica Power Center 8.5.x, ETL, Oracle 10g, SQL Server 2000, MS Access, SQL, PL/SQL, Windows NT
- Interacted with business analysts, data architects, application developers to develop a data model.
- Designed and developed complex aggregate, join and lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using the Informatica Power Center 8.5.x tool.
- Created sessions, database connections and batches using the Informatica Server Manager/Workflow Manager.
- Optimized mappings, sessions/tasks, source and target databases as part of the performance tuning.
- Designed ETL Mappings and Workflows using power center designer and workflow manager.
- Involved in the extraction, transformation and loading of data from source flat files and RDBMS tables to target tables.
- Developed various Mappings, Mapplets and Transformations for data marts and Data warehouse.
- Used Shell Scripting to automate the loading process.
- Used VBA macros in Excel to compare data as a proof of concept for unit testing.
- Used Pipeline Partitioning feature in the sessions to reduce the load time.
- Analyzed and Created Facts and Dimension Tables.
- Used Informatica features to implement Type I and II changes in slowly changing dimension tables.
- Used the command-line program pmcmd to communicate with the server to start and stop sessions and batches, to stop the Informatica server and to recover sessions.
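Day-to-day pmcmd calls follow the shape sketched here; the integration service, domain, folder and workflow names are placeholders, and the commands are echoed as a dry run since they require a live Informatica server:

```shell
#!/bin/sh
# Hypothetical pmcmd invocations. All names are placeholders.
PMCMD_ARGS='-sv IntSvc_Dev -d Domain_Dev -u etl_user -p XXXX'

for CMD in \
  "pmcmd startworkflow $PMCMD_ARGS -f FIN_DM -wait wf_load_gl" \
  "pmcmd stopworkflow $PMCMD_ARGS -f FIN_DM wf_load_gl" \
  "pmcmd getworkflowdetails $PMCMD_ARGS -f FIN_DM wf_load_gl"
do
  echo "$CMD"   # a real script would eval "$CMD" and test $? for failure
done
```

The `-wait` flag on `startworkflow` makes pmcmd block until the workflow finishes, so a wrapper script can branch on the exit status before promoting downstream jobs.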
- Designed processes to extract, transform and load data to the Data Mart.
- Performed impact analysis and remedial actions, and documented process and application failures related to Informatica ETL and Power Analyzer.
- Performed regular backups of the data warehouse and of various production and development repositories, including automating and scheduling those processes; as part of the optimization process, performed design changes in Informatica mappings, transformations and sessions.