- Close to 7 years of IT experience in data warehousing using Informatica PowerCenter.
- Experienced in requirement gathering and analysis, functional analysis, system design, process and workflow design, use case and test case development, the software development life cycle, system testing, Agile development, bug fixing, documentation, implementation, and production support.
- Experienced in maintenance and integration of various data sources such as Oracle, DB2, MS SQL Server, fixed-width/delimited flat files, XML and COBOL sources using Informatica PowerCenter 9.5/9.1/8.5 on Windows and UNIX.
- Sound knowledge of RDBMS concepts, with hands-on experience developing in relational database environments using SQL, PL/SQL, DB2, stored procedures, packages and UNIX shell scripting.
- Worked extensively on Informatica Data Quality (IDQ) for data quality and data profiling, applying rules and developing mappings to move data into the target system.
- Created dictionaries in Informatica Data Quality (IDQ) used to cleanse and standardize data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
- Experienced in Master Data Management (MDM): MDM Hub configuration, data modeling, data mapping, creating source and target tables, and data profiling.
- Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
- Experienced in job scheduling using Control-M and IBM Tivoli Workload Scheduler (TWS) in Windows environments.
- Experienced in online and batch mainframe environments and in production support for both.
- Team player, self-motivated and dynamic, with excellent oral and written communication skills.
- Quick learner, adaptable to new environments.
Languages: COBOL, JCL, SQL, PL/SQL, C, C++
Scripting: UNIX Shell Scripting
Databases: Oracle 11g/10g/9i, SQL Server, DB2, IMSDB
ETL Tools: Informatica PowerCenter 10.1/9.5/9.1/8.x, Informatica Developer 9.0.1, SSIS
Data Modeling Tools: ERwin Data Modeler 9.2
Other Tools: SQL Developer, Universal SQL Editor, TWS, Toad, WinSCP, QC, IBM MQ Series, Control-M and IBM Tivoli Maestro
Operating Systems: UNIX (AIX), Windows 95/98/NT/2000/XP/7/8, MS-DOS, MVS, OS/390
Methodologies: Waterfall, Agile, Star Schema, Snowflake Schema, Fact and Dimension Tables, Physical and Logical Data Modeling.
Confidential, Juno Beach, FL.
Sr. DWH Consultant
- Analyzed CIM (Customer Information Management) application system.
- Involved in meetings with business teams to discuss requirements.
- Involved in functional and technical design document preparation meetings.
- Prepared BSTMs (Business Source-to-Target Mapping definitions) per requirements.
- Developed and unit tested mappings, sessions and workflows with sources such as relational tables and flat files.
- Worked extensively on code debugging using the Informatica Debugger for complex mappings.
- Prepared code promotion documentation and ICEM packages for production deployment.
- Involved in monitoring and supporting production jobs.
- Conducted and involved in defect triage meetings on daily and weekly basis.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Worked with Informatica team members to design, document and configure the Informatica MDM Hub to support loading, cleansing, matching, merging and publication of MDM data.
- Used ETL to load data from sources such as flat files and Oracle into Oracle and Teradata target databases.
- Extensively worked on extracting, transforming and loading data from various sources such as Oracle, DB2, Teradata and flat files.
- Prepared project documents like Recovery specifications, Project Change Request Form (PCRF) and Promotion Worksheet.
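The production job monitoring and support described above can be sketched with PowerCenter's `pmcmd` command-line client. This is a minimal illustration only: the integration service, domain, folder and workflow names below are placeholder assumptions, not values from the actual project, and real credentials would be supplied via `pmcmd`'s `-u`/`-p` options or a connect file.

```shell
#!/bin/sh
# Sketch of a workflow-start wrapper around pmcmd (PowerCenter's CLI).
# INT_SVC, DOMAIN_DEV and the folder/workflow names are placeholders.

build_start_cmd() {
    # $1 = repository folder, $2 = workflow name
    echo "pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -f $1 -wait $2"
}

# Print the command that would start a hypothetical CIM daily load.
build_start_cmd CIM_FOLDER wf_cim_daily_load
```

In production, such a wrapper would typically also check `pmcmd`'s exit code and alert the support team on failure.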
British Petroleum, Texas.
Sr. DWH Consultant
- Evaluated the client's new design changes and suggested corrections for better quality and performance.
- Led and coordinated offshore team and provided clarifications.
- Developed shell scripts for file copies, database table purges, etc.
- Performed data quality checks on source files using the Control Framework (CFW) and mapplets.
- Designed and developed stored procedures for table truncation before loading.
- Scheduled the workflows to run on a daily, weekly and monthly basis using the Control-M scheduling tool.
- Designed and developed technical and business data quality rules in IDQ (Informatica Developer) and created scorecards to present to business users for trending analysis (Informatica Analyst).
- Prepared QA and production implementation documents for multiple releases.
- Involved in design pattern discussions for the new environment and in client meetings.
- Performed unit testing for developed ETL mappings before deploying to QA.
- Interacted with DBAs and Informatica administrators, and conducted business meetings for implementation.
- Documented mapping specifications, STM, unit test cases, procedure and results.
- Worked with stakeholders on business requirements, functional specifications and enhancements; created technical design and functional specification documents based on business needs.
- Actively participated in weekly calls wif data modeling and analyst teams to understand and work on any new requirements.
- Analyzed data from source systems to design the solution for the business requirement.
- Developed complex mappings in Informatica using PowerCenter transformations (Source Qualifier, Joiner, Lookup, Filter, Router, Aggregator, Expression, XML, Update Strategy and Sequence Generator), mapping parameters/variables, parameter files, SQL overrides and the transformation language. Profiled source files and developed data models and mappings for smaller requirements.
- Implemented CDC, SCD Type 1 and Type 2 delta loads, snapshot and transactional fact tables, and headers/footers for flat-file targets and file lists.
- Created data quality transformations using IDQ for data quality checks and data conversions.
- Developed Unix Scripts for SFTP file transfers and target table truncate operations.
- Implemented partitioning at database level for better performance.
- Scheduled the workflows to run on a daily and weekly basis using the Control-M scheduling tool.
- Involved in unit tests and unit test plan document preparation.
- Documented mapping specifications, STM, Unit Test Cases, procedure and results.
- Implemented Pushdown Optimization (PDO) for better performance with large source data volumes.
- Worked on IDQ/IDE tools for data profiling, data enrichment and standardization.
- Analyzed data in source systems for the solution design document of the business requirement.
- Designed functional and technical design documents for the ODS according to business needs.
- Prepared DDL for creating new tables and adding new metadata columns per the design specification.
- Implemented database partitioning by HASH and RANGE.
- Worked extensively with data analysts and data modelers to design and understand the structures of fact and dimension tables; performed data profiling and data quality checks using Informatica Data Quality (IDQ).
- Prepared source-to-target mapping documents for newly developed mappings and updated existing documents.
- Developed complex mappings using Informatica transformations, and built reusable transformations and mapplets to load OLTP data into the OLAP system through multiple mappings.
- Performed system integration testing (SIT) and was involved in user acceptance testing (UAT) to ensure business and technical requirements were met, and documented the results.
- Involved in the documentation of source-to-target mappings, mapping design and implementation steps.
- Developed Informatica mappings enabling the extraction, transformation and loading of data into target tables.
- Created workflows, worklets and tasks to schedule the loads at the required frequency using Workflow Manager.
- Prepared reusable transformations to load data from operational data source to Data Warehouse.
- Wrote complex SQL Queries involving multiple tables wif joins.
- Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
- Used the Debugger, session logs and workflow logs to test the mappings and fix bugs.
- Analyzed dependencies between jobs and scheduled them accordingly using the workload scheduler.
- Improved the performance of mappings and sessions using various optimization techniques.
- Developed Informatica mappings enabling the ETL process for large volumes of data into target tables.
- Designed and developed processes to handle high-volume data loads within each SLA.
- Created and used parameter files to run different load processes with the same logic. Extensively used PL/SQL to create stored procedures, and worked with XML targets, XSDs and DTDs.
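The parameter-file technique in the last bullet can be sketched as a small shell script that writes a per-run PowerCenter parameter file, so one set of mappings serves different load processes. The folder, workflow and `$$` parameter names below are illustrative assumptions, not the actual project values.

```shell
#!/bin/sh
# Sketch: generate a PowerCenter parameter file per run so the same
# mapping logic serves different load processes.
# BP_FOLDER, wf_daily_load and the $$ parameters are placeholders.

PARAM_FILE=${1:-wf_daily_load.param}
LOAD_DATE=${2:-$(date +%Y-%m-%d)}

cat > "$PARAM_FILE" <<EOF
[BP_FOLDER.WF:wf_daily_load]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SOURCE_SYSTEM=ODS
EOF

echo "Wrote $PARAM_FILE for load date $LOAD_DATE"
```

The session would then point at this file via its parameter filename setting, and the scheduler (Control-M in this case) would run the generator step before starting the workflow.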