- 8 years of IT experience in Data Warehousing, Database Design and ETL processes across business domains including finance, telecom, manufacturing and health care.
- Experience in designing and implementing data warehouse applications, mainly ETL processes, using Informatica PowerCenter v10.x/v9.x/v8.6.1/v8.1: Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager and Workflow Monitor.
- Highly proficient in Development, Implementation, Administration and Support of ETL processes for large-scale Data Warehouses using Informatica PowerCenter.
- Extensively used ETL methodologies for supporting Data Extraction, Transformation and Loading process, in a corporate-wide-ETL solution using Informatica Power Center.
- Extensively worked in Informatica Designer, Workflow Manager and Workflow Monitor to develop and run data loads.
- Experience working with cloud computing on the Salesforce.com platform.
- Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer.
- Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in E-commerce software, Utility, Pharmaceutical, Health Care, Insurance, Financial and Manufacturing industries.
- Experience in development and maintenance of SQL, PL/SQL, Stored procedures, functions, analytic functions, constraints, indexes and triggers.
- Experienced in IDQ (9.x, 9.5.1), handling LDOs and PDOs and using transformations such as Standardizer, Labeler, Parser and Address Validator to cleanse and profile incoming data.
- Experience in using different versions of the Oracle database (11g/10g/9i/8i).
- Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experienced with the UNIX command line and Linux.
- Proficient with Informatica Data Quality (IDQ) for data cleansing and standardization in the staging area.
- Experience in creating batch scripts in DOS and Perl Scripting.
- Experience in ETL development process using Informatica for Data Warehousing, Data migration and Production support.
- Experience in both Waterfall and Agile SDLC methodologies.
- Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) - Star/Snowflake schema, SCD, Surrogate keys and Normalization/De-normalization.
- Data modeling experience in creating Conceptual, Logical and Physical Data Models using ERwin Data Modeler.
- Experience with TOAD, SQL Developer database tools to query, test, modify, analyze data, create indexes, and compare data from different schemas.
- Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Worked on Slowly Changing Dimensions (SCDs) Types 1, 2 and 3 to keep track of historical data.
- Knowledge of Informatica PowerExchange (formerly PowerConnect) for capturing changed data.
- Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change data capture (CDC).
- Experience in integrating various data sources like Oracle, DB2, flat files and XML files into an ODS; good knowledge of Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
- Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
- Hands on experience in MDM development.
- Involved in the designing of Landing, Staging and Base tables in Informatica MDM.
- Created MDM mapping and configured match and merge rules to integrate the data received from different sources.
- Optimized solutions using various performance-tuning methods: SQL tuning; ETL tuning (optimal configuration of transformations, targets, sources, mappings and sessions); and database tuning using indexes, partitioning, materialized views, procedures and functions.
- Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
- Extensive knowledge in all areas of Project Life Cycle Development.
- Strong analytical, verbal, written and interpersonal skills.
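Workflow runs like those described above are commonly launched from a shell wrapper around pmcmd so a scheduler (Autosys/Tidal) can act on the exit status. A minimal sketch; the domain, integration-service and folder names here are hypothetical placeholders, not values from any actual project:

```shell
#!/bin/sh
# Minimal sketch of a scheduler-friendly wrapper for launching an
# Informatica workflow with pmcmd. All names below are hypothetical.
INFA_DOMAIN="Domain_ETL"
INFA_IS="IS_PROD"
INFA_FOLDER="DW_LOADS"

build_pmcmd_cmd() {
    # -wait makes pmcmd block until the workflow completes, so the
    # scheduler can react to the exit status of this script.
    wf="$1"
    echo "pmcmd startworkflow -sv $INFA_IS -d $INFA_DOMAIN -f $INFA_FOLDER -wait $wf"
}

cmd=$(build_pmcmd_cmd wf_daily_load)
echo "$cmd"
# eval "$cmd" || { echo "workflow failed" >&2; exit 1; }   # real run
```

Building the command in a function keeps the connection settings in one place when many workflows share one wrapper.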
Databases: Oracle 11g/10g/9i/8i, SQL Server 2005/2008, MS-Access, DB2 8.2, Teradata
Operating Systems: Unix, Windows
Languages: PL/SQL, T-SQL, UNIX Shell
Tools: Informatica PowerCenter v10.x/v9.x/v8.6.1/v8.1, Informatica IDQ, Informatica Cloud, TOAD, MS Office Tool Set, SQL Server Analysis Services, HP Quality Center 9.2/10, Rapid SQL, MQ Series.
OLAP: OBIEE 10.1.3.X, 11g, Siebel Analytics 7.8/7.7, COGNOS 6.x (Framework Manager, Report Studio)
GUI: Visual Basic 6.0, Oracle FORMS 4.5, Oracle REPORTS 2.5
Scheduling Tools: Autosys, Informatica Scheduler
Methodologies: Star Schema, Snowflake Schema, Ralph Kimball Methodology
Reporting Tools: Business Objects, COGNOS
Confidential, Long Beach, CA
Sr. Informatica Developer/IDQ Developer
- Worked with the Informatica Data Quality 10.0 (IDQ) toolkit, using the analysis, data cleansing, data matching, data conversion, exception handling, reporting and monitoring capabilities of IDQ 10.0.
- Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
- Identified and eliminated duplicates in datasets through the IDQ Edit Distance, Jaro Distance and Mixed Field Matcher components; this enables a single view of customers and helps control mailing-list costs by preventing duplicate pieces of mail.
- Responsible for Unit and Integration testing of Informatica Sessions, Batches and the Target Data.
- Scheduled the workflows to pull data from the source databases at weekly intervals, to maintain the most current and consolidated data.
- Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.0.
- Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
- Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
- Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, XML, SQL server and Flat files and loaded into Oracle.
- Performed Informatica MDM design and implementation; used Informatica Data Quality (IDQ 9.6.1) for data quality measurement in master data management.
- Worked on the design and development of Informatica mappings and workflows to load data into the staging area, data warehouse and data marts in Oracle.
- Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
- Experienced with Informatica PowerExchange for Loading/Retrieving data from mainframe systems.
- Built cubes using Pentaho Schema Workbench for analysis and dashboards, and designed interactive reports with Pentaho Report Designer.
- Analyzed the existing financial planning system.
- Customized the EPM product based on the requirements and existing files.
- Developed an ETL process using Pentaho PDI to extract the data.
- Designed the source-to-target mappings and was involved in designing the Selection Criteria document.
- Wrote BTEQ scripts to transform data; used the Teradata utilities FastLoad, MultiLoad and TPump to load data.
- Responsible for manually starting and monitoring production jobs based on business users' requests.
- Responsible for investigating production issues and resolving them in a timely manner.
- Developed an Informatica process to replace stored-procedure functionality and provide a time-effective, high-data-quality application to the client.
- Analyzed business requirements and created ETL logic to extract data from flat files coming from manufacturing sites in different geographic regions and load the data into the data warehouse.
- Prepared ETL Specifications and design documents to help develop mappings.
- Created mappings for historical and incremental loads; checked in and checked out versions of objects.
- Used Jenkins for deploying in different environments.
- Configured and maintained source code repositories in Clearcase, GIT.
- Worked on staging the data into work tables, cleanse, and load it further downstream into dimensions using Type 1 and Type 2 logic and fact tables which constitute the data warehouse.
- Worked with pmcmd to interact with the Informatica server from command mode and execute shell scripts.
- Project based on Agile SDLC methodology with 2 weeks of software product release to the business users.
- Take part in daily standup and scrum meetings to discuss the project lifecycle, progress and plan accordingly, which is the crux of Agile SDLC.
- Provide post release/production support.
Environment: Informatica Power Center 10.1, IDQ 10.0, Informatica MDM 10.0, Oracle Database 11g, SQL server, Toad for Oracle, Unix Shell scripts, Teradata.
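Teradata loads like the BTEQ/FastLoad work above are usually driven by generated control scripts. A minimal sketch of generating one from a shell function; the logon string, staging table, columns and data-file path are hypothetical stand-ins, not actual project settings:

```shell
#!/bin/sh
# Minimal sketch: emit a Teradata FastLoad control script for a pipe-delimited
# extract. Table, column, logon and file values are illustrative placeholders.
gen_fastload_script() {
    tbl="$1"; datafile="$2"
    cat <<EOF
.LOGON tdprod/etl_user,\$TD_PWD;
BEGIN LOADING stg.${tbl}
  ERRORFILES stg.${tbl}_err1, stg.${tbl}_err2;
SET RECORD VARTEXT "|";
DEFINE cust_id (VARCHAR(18)), cust_name (VARCHAR(60))
  FILE=${datafile};
INSERT INTO stg.${tbl} VALUES (:cust_id, :cust_name);
END LOADING;
.LOGOFF;
EOF
}

gen_fastload_script customer /data/in/customer.dat > customer.fld
# fastload < customer.fld   # would run the actual load
```

Generating the script per table keeps the delimiter, error-table naming and logon handling consistent across loads.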
Confidential, Atlanta, GA
Sr. Informatica IDQ Developer
- Analyzed and thoroughly studied various data sources and different development environments within the organization.
- Extensively worked on extracting data from various flat files (fixed width, delimited), applying the business logic and loading it into Oracle databases.
- Extensively worked with Source qualifier, Filter, Joiner, Expression, Lookups, Aggregator, Router, Sequence Generator, and Update Strategy.
- Used various informatica transformations in the development of complex mappings.
- Extracted data from different heterogeneous source systems applied business logic using transformations and loaded to the target systems using Informatica power center.
- Used Informatica Data Quality (IDQ) to profile the data and apply rules to Membership & Provider subject areas to get Master Data Management (MDM).
- Worked on multiple projects using the Informatica Developer tool (IDQ).
- Involved in migrating maps from IDQ to PowerCenter.
- Designed reference data and data quality rules using IDQ and was involved in cleansing the data using IDQ.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
- Worked closely with business for requirement gathering and to understand the project needs.
- Interacted with environmental team and data architects in design and implementation data models.
- Designed and developed complex mappings to load the Historical, Weekly and Daily files to Oracle database.
- Extensively worked on data conversion and data cleansing.
- Created different MOVEit jobs for inbound/outbound transfer of files between WellCare and different vendors.
- Coded UNIX shell scripts, created command task, and email task for providing the pre-session post-session requirements for various informatica jobs.
- Provided database coding to support business applications using T-SQL.
- Worked on automation of informatica job flow using autosys boxes/jobs.
- Extensively worked on basic SQL queries such as creating and altering tables, indexes and views; also worked with PL/SQL stored procedures. Queried various tables to get resultant datasets per the business requirements.
- Prepared ETL mapping documents explaining complete mapping logic.
- Prepared unit test document and performed unit testing, regression testing.
- Provided QA/UAT support during code promotion and worked with QA to resolve any defects found.
- Worked with the Release Management Team, the DBA Team, and the UNIX team for smooth code productions.
Environment: Informatica PowerCenter 9.6.1/9.5, Informatica Data Quality (IDQ), Oracle 11g, PLSQL Developer, Flat Files, MS Excel 2010, MS Visual Studio 2010, UNIX, WinSCP, MS Access, Autosys.
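The pre-/post-session shell scripts and email tasks above often amount to parsing the session log and alerting on rejects. A minimal sketch, assuming a simplified stand-in for the real session-log format; the log-file name and mail address are hypothetical:

```shell
#!/bin/sh
# Minimal post-session check sketch. The two-line log written below is a
# simplified stand-in for a real Informatica session log; names are
# hypothetical placeholders.
check_session() {
    logfile="$1"
    rejected=$(awk -F': ' '/Rows Rejected/ {print $2}' "$logfile")
    if [ "${rejected:-0}" -gt 0 ]; then
        echo "ALERT: $rejected rows rejected"
        # mailx -s "ETL reject alert" etl-team@example.com < "$logfile"
        return 1
    fi
    echo "OK"
}

printf 'Rows Loaded: 1200\nRows Rejected: 3\n' > s_m_cust.log
check_session s_m_cust.log || :   # demo: prints the alert, swallows status
```

A command task would call `check_session` after the session and let the nonzero return status fail the workflow.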
Confidential, Baltimore, MD
Sr Informatica Developer/MDM
- Interacted with the Data Analysis team to gather requirements and created Business Specification Documents. Analyzed the approach Documents/Templates and suggested a few fixes. Analyzed source to target mapping document.
- Worked on data cleansing; data extraction, transformation and loading design and development; source-to-target mappings; data warehouse transformations; performance tuning; testing; and documentation.
- Used Informatica PowerCenter to extract, transform and load customer data from different operational data sources (Oracle, flat files such as CSV, XML files, COBOL files and SQL Server) into the Teradata warehouse.
- Installed and configured the MDM Hub, cleanse servers and Address Doctor in the Dev, QA, Test and Prod environments.
- Actively involved in implementing the land process of loading the customer/product dataset into Informatica MDM from various source systems.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Performed the land process to load data into the landing tables of the MDM Hub, using external batch processing for the initial data load into the hub store.
- Configured JMS message queues and message triggers with the SIF API.
- Configured web services using the SIF API interface.
- Worked on complex mappings using transformations like Aggregator, Expression, Joiner, Filter, Sequence Generator, Connected and Unconnected Lookup, Dynamic Lookup, Router, Union and Update Strategy in Informatica PowerCenter Designer.
- Implemented Type 2 Slowly Changing Dimensions (SCD) to retain the full history of accounts and transaction information.
- Implemented Change Data Capture (CDC) using the MD5 function.
- Worked with Informatica command-line utilities like pmcmd and pmrep.
- Responsible for performance tuning of the ETL process at the source, target, mapping and session levels.
- Responsible for troubleshooting bottlenecks by creating indexes at the database level.
- Created and configured workflows, worklets and sessions using Informatica Workflow Manager.
- Used session log files and workflow log files to debug errors in mappings and sessions.
- Used mapping parameters and variables for reusability of code snippets.
- Worked with version control tools like SVN and scheduler tools like Autosys.
- Worked with Teradata utilities like MultiLoad and FastExport, and with BTEQ scripts.
- Coded shell scripts in UNIX and used file-transfer tools like WinSCP.
- Supported the testing team, UAT team and production support teams.
Environment: Informatica Power Center 9.1.1, Informatica MDM Multi-Domain, IDD, SIF, Oracle11g, Teradata V2R12, SQL Server 2008/R2, UNIX, Teradata utilities.
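The MD5-based change data capture mentioned above can be illustrated outside Informatica as well: hash the non-key columns of each record and compare hashes between runs, the same idea as comparing an MD5() port against the stored value in a mapping. A minimal shell sketch; the key|attribute file layout and file names are illustrative only:

```shell
#!/bin/sh
# Minimal sketch of MD5-based change detection on a pipe-delimited extract.
# Layout and file names are illustrative placeholders.
hash_rows() {
    # Emit key|md5-of-remaining-columns for each record.
    while IFS='|' read -r key rest; do
        printf '%s|%s\n' "$key" "$(printf '%s' "$rest" | md5sum | cut -d' ' -f1)"
    done < "$1"
}

printf '101|Alice|NY\n102|Bob|CA\n' > day1.dat
printf '101|Alice|NJ\n102|Bob|CA\n' > day2.dat
hash_rows day1.dat > day1.md5
hash_rows day2.dat > day2.md5
# Keys whose attribute hash changed between the two runs (here: 101):
join -t'|' day1.md5 day2.md5 | awk -F'|' '$2 != $3 {print $1}'
```

Only the changed keys need to flow into the downstream update logic, which keeps incremental loads small.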
- Performed requirements gathering, developed the Analysis & Design (A&D) document and developed the project timeline.
- Designed and developed ETL mappings for source system data extraction, data transformation, staging, movement and aggregation.
- Developed standard mappings using various transformations (Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup and Filter) for weekly and quarterly processes that load heterogeneous data into the data warehouse; source files included delimited flat files and SQL Server tables.
- Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Union, SQL, Java and Sequence Generator.
- Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used other tasks like Event Wait, Event Raise, Email and Command.
- Used Informatica for loading historical data from various tables for different departments.
- Developed Informatica mappings for Slowly Changing Dimensions Types 1 and 2.
- Created mapplets for implementing incremental logic, identifying the fiscal quarter of a transaction and calculating various business-process requirements.
- Involved in unit and integration testing of the mappings and workflows developed for enhancements to the application.
- Migrated Informatica jobs from Development to Test to Production; performed unit testing, integration testing, and job and environment parameter testing along the way.
- Scheduled and ran extraction and load processes and monitored tasks and workflows.
- Tuned the MMW databases (stage and target) for better, faster and more efficient loading and user-query performance.
- Created Informatica mappings with PL/SQL stored procedures/functions to incorporate critical business functionality into data loads.
- Extensively worked on performance tuning of mappings, sessions, Target, Source and various ETL processes.
Environment: Informatica 9.0.1, Oracle, DB2, Erwin, SQL Server, Shell scripting, TOAD.
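The Type 1/Type 2 change detection behind SCD mappings like those above can be sketched as a snapshot comparison: rows with changed attributes are flagged so the load can end-date the old version and insert a new current row, while unseen keys become plain inserts. A minimal shell example with illustrative layouts and file names:

```shell
#!/bin/sh
# Minimal sketch of SCD change detection between yesterday's dimension
# snapshot and today's extract. Layouts and file names are illustrative.
printf '1|Alice|Austin\n2|Bob|Boston\n' > dim_prev.dat
printf '1|Alice|Dallas\n2|Bob|Boston\n3|Carol|Chicago\n' > dim_new.dat

awk -F'|' '
    NR == FNR { prev[$1] = $0; next }                    # load previous snapshot
    !($1 in prev)                  { print "INSERT|" $0 } # brand-new key
    ($1 in prev) && prev[$1] != $0 { print "SCD2|" $0 }   # changed attributes
' dim_prev.dat dim_new.dat
```

Unchanged rows (key 2 here) produce no output, so only rows needing action reach the load step.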
- Involved in all phases of the SDLC, from requirements, design, development, testing, training and rollout to field users, through production support.
- Involved in analysis of the current system and in designing solutions for a centralized data warehouse, as well as planning the migration from the current system to the new one.
- Used Informatica to extract data from different source systems (DB2 and flat files) into the Oracle target system.
- Designed and developed mappings, transformation logic and processes in Informatica to implement business rules and standardize source data from multiple systems into the data warehouse.
- Involved in development and data integration of the main components using Informatica PowerCenter.
- Used different transformations like Expression, Router, Filter, Sequence Generator, Stored Procedure, and Connected and Unconnected Lookups.
- Actively involved in the design and implementation of physical and logical data warehouses and data marts, OLAP solutions and multidimensional analysis.
- Developed logical models, building hierarchies, attributes and facts from warehouse tables.
- Actively involved in the scripting team, writing shell scripts to automate data migration from the ODS to the data warehouse.
- Used the command-line program pmcmd to communicate with the Informatica server and to monitor workflows and tasks.
- Involved in performance tuning, monitoring and optimization of ETL loads.
- Involved in different team review meetings.
- Created the mapping and functional specifications.
- Managed software releases after each stage of testing and defect removal, as well as resolving issues from UAT and system testing.
Environment: Informatica Power Center 8.6, Oracle 9i, Toad, Quest Central for DB2, UNIX, Windows NT, XML, Teradata, IBM DB2, MS Excel 97, Flat files, SQL, PL/SQL and UC4 Scheduling tool.