- 7+ years of IT experience in design, analysis, development, documentation, coding, and implementation, including databases, data warehouses, ETL design, Oracle, PL/SQL, SQL Server databases, SSIS, Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality, etc.
- Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions
- Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter on Windows.
- Expertise in the ETL tool Informatica, with extensive experience in PowerCenter client tools including Designer, Repository Manager, Workflow Manager, and Workflow Monitor
- Extensively worked with complex mappings using various transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, and Update Strategy, as well as reusable transformations and user-defined functions.
- Experienced in using Informatica Data Quality (IDQ) tools for data analysis and data profiling in the IDQ Developer client, applying rules and developing mappings to move data from source to target systems
- Experience in creating Column Profiling, Rule Profiling, Mid-stream Profiling, Scorecards, Decision tasks, Reference Tables, DQ Tables, and Notification tasks as part of workflows, and in deploying them.
- Experience in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets and vice versa
- Extensively worked on relational database systems like Oracle 11g/10g/9i/8i, MS SQL Server, and Teradata, and source files like flat files, XML files, and COBOL files
- Proficient in the AWS IAM security framework - creating roles, policies, groups, users, and network ACLs, and using security groups with inbound/outbound rules
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit, including analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
- Worked on integration of various data sources with multiple relational databases like Oracle and SQL Server, and on integrating data from flat files, both fixed-width and delimited.
- Proficient in data warehouse ETL activities using SQL, PL/SQL, Pro*C, SQL*Loader, C, data structures in C, and Unix, Python, and Perl scripting.
- Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions
- Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle.
- Sound knowledge of Linux/UNIX and shell scripting; experience with command-line utilities like pmcmd to execute workflows in non-Windows environments
- Proficiency in working with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, Teradata Administrator, SQL Assistant, PMON, Visual Explain).
- Implemented change data capture (CDC) using Informatica PowerExchange to load data from the Clarity DB to the Teradata warehouse.
- Experience in integration of various data sources with multiple relational databases like DB2, Oracle, SQL Server, MS Access, and Teradata, flat files, XML files, and other sources like Salesforce, etc.
- Experience in Data profiling and Scorecard preparation by using Informatica Analyst.
- Experience in Migrating Data from Legacy systems to Oracle database using SQL*Loader
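Since pmcmd is called out above as the command-line route for running workflows in non-Windows environments, a minimal sketch of how such an invocation might be assembled from the shell follows; the service, domain, folder, and workflow names are hypothetical placeholders, not values from this resume.

```shell
#!/bin/sh
# Build a pmcmd command line to start an Informatica workflow and wait
# for it to finish. -uv/-pv read the user and password from environment
# variables rather than exposing them on the command line.
build_pmcmd_cmd() {
    svc="$1"; domain="$2"; folder="$3"; wf="$4"
    echo "pmcmd startworkflow -sv $svc -d $domain -uv PM_USER -pv PM_PASS -f $folder -wait $wf"
}

# Example (placeholder names): emit the command a wrapper script would run.
build_pmcmd_cmd IS_DEV Domain_Dev DWH_LOADS wf_load_customers
```

In practice a wrapper like this is what a scheduler (AutoSys, Control-M) calls, so the exit status of `pmcmd -wait` can drive downstream job dependencies.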
RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle 11g/10g/9i/8i, Teradata, IBM DB2 UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000.
Programming and Scripting: C, C++, Java, Python, .NET, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.
Specialist Applications & Software: Informatica PowerCenter 10.1/10/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica PowerExchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, etc.
Data Modeling: Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Physical, Logical Data Modeling, and ER Diagrams.
Database tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced Query Tool; Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, Erwin
Scheduling tools: Informatica Scheduler, CA Scheduler (AutoSys), ESP, Maestro, Control-M.
Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (ver.9, ver.10)
Software Development Methodology: Agile, Waterfall.
Domain Expertise: Publishing, Insurance/Finance, HealthCare
Others: OBIEE RPD creation, OBIEE, ECM, Informatica Data Transformation XMAP, DAC, Rational Clear Case, WS-FTP Pro, DTD.
Sr. Informatica Developer/MDM Consultant
Confidential - North Chicago, IL
- Worked closely with the Data Integration team to perform validations both on the Informatica MDM hub and Entity 360.
- Experience working with Address Doctor and its related cleansing functions
- Experience working with Pharmacy and Provider data coming from different data sources
- Worked closely with the Business and PDA teams, as the data stewards, while performing Match and Merge tasks
- Communicated with business customers to discuss the issues and requirements.
- Designed, documented and configured the Informatica MDM Hub to support loading, cleansing of data. Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Developed ETL programs using Informatica to implement the business requirements.
- Developed several complex IDQ mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in the Mapping Designer of Informatica PowerCenter.
- Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ)
- Used Informatica PowerCenter to load data from different data sources like XML, flat files, Oracle, Teradata, and Salesforce.
- Migrated servers from 1and1 to AWS Elastic Compute Cloud (EC2) and databases to RDS.
- Refactored Java ETL code to provide several new features such as redundancy, error handling, automation, image manipulation (SCALR), and the addition of the AWS Java SDK to handle the transfer of files to S3.
- Imported the IDQ address-standardization mappings into Informatica Designer as mapplets.
- Utilized Informatica IDQ to complete initial data profiling and matching/removing of duplicate data
- Used relational SQL wherever possible to minimize the data transfer over the network.
- Identified and validated the Critical Data Elements in IDQ.
- Built several reusable components on IDQ using Standardizers and Reference tables which can be applied directly to standardize and enrich Address information.
- Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
- Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
- Extensively worked on Labeler, Parser, Key Generator, Match, Merge, and Consolidation transformations to identify the duplicate records.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2
- Extensively used various active and passive transformations like Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, and Aggregator transformations.
- Created, Deployed & Scheduled jobs in Tidal scheduler for integration, User acceptance testing and Production region.
- Designed workflows with many sessions, using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
- Used the Teradata FastLoad utility to load data into tables
- Used SQL tools like TOAD to run SQL queries and validate the data.
- Converted all the jobs scheduled in Maestro to the AutoSys scheduler as per the requirements
- Worked on maintaining the master data using Informatica MDM
- Wrote UNIX shell scripts for Informatica pre-session and post-session tasks, and AutoSys scripts for scheduling the jobs (workflows)
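A pre-session script of the kind mentioned above might, for example, verify that the expected source flat file has arrived and is non-empty before the session runs; the function below is an illustrative sketch, and any path passed to it is a placeholder.

```shell
#!/bin/sh
# Illustrative Informatica pre-session check: succeed only when the
# expected source flat file exists and is non-empty, so the session
# (and the AutoSys job wrapping it) fails fast on a missing feed.
presession_check() {
    src="$1"
    if [ -s "$src" ]; then
        echo "OK: $src present"
        return 0
    else
        echo "FAIL: $src missing or empty" >&2
        return 1
    fi
}
```

A post-session counterpart would typically archive the processed file and emit a status code the scheduler can act on.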
Environment: Informatica Power Center 10.1, UNIX, SQL, IDQ, IDE, CDC, MDM, Java, Linux, Perl, AWS, WinSCP, Shell, PL/SQL, Netezza, Teradata, Collibra, Microsoft SQL Server 2008, and Microsoft Visual Studio.
Confidential - Columbus, OH
Sr. ETL Developer/Analyst
- Developed and supported the Extraction, Transformation, and Load (ETL) process for data migration using Informatica PowerCenter.
- Responsible for requirement gathering, analysis, and end-user meetings.
- Responsible for converting Functional Requirements into Technical Specifications.
- Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2, Sybase, MS Access and Flat Files.
- Worked on the Informatica Analyst tool for profiling the data and performing validations.
- Developed Re-Usable Transformations and Re-Usable Mapplets.
- Used various transformations like Lookup, Filter, Normalizer, Joiner, Aggregator, Expression, Router, Update strategy, Sequence generator and XML Generator Transformations in the mappings.
- Developed mappings for Slowly Changing Dimensions of Type1, Type2, Facts and Summary tables using all kinds of transformations.
- Extensively worked on UNIX Shell scripting.
- Played a major role in the Informatica upgrade from version 9.1 to 9.6, migrating the ETL process and Control-M scheduled jobs from the current instance to the new instance while holding current jobs on ice.
- Published and consumed Web Services using REST and deployed it on WebLogic Server.
- Developed Web services for the services to get the data from external systems to process the request from client sides.
- Connected SAP systems like HANA & CAR for extraction and load using Informatica 9.6.1, with hands-on experience using Business Application Programming Interface (BAPI) and RFC functions while processing the data.
- Worked on storage management by applying the Information Lifecycle Management (ILM) approach, and on Credit Risk.
- Worked on Production support to fix the issues for ETL loads on a weekly basis
- Extensively used transformations like connected/unconnected Lookup, XML, Web Services, HTTP, Filter, Update Strategy, Router, Joiner, Expression, Stored Procedure, Sequence Generator, and Normalizer, as well as Key Generator, Labeler, Address Validator, and Match transformations in the Developer tool (IDQ)
Environment: Informatica PowerCenter 9.5.1, SQL Server 2012/2008, T-SQL, Oracle 11g Exadata, SQL Server Management Studio, DB2, stored procedures, XML, XML Spy, Shell Scripts, UNIX, TFS, Quality Center, HP Service Manager, ER Studio, and the Control-M and AutoSys scheduling tools.
Confidential - Birmingham, AL
Sr. Informatica Developer/IDQ/MDM
- Designing the dimensional model and data load process using SCD Type 2 for the quarterly membership reporting purposes.
- Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
- Generating the data feeds from analytical warehouse using required ETL logic to handle data transformations and business constraints while loading from source to target layout.
- Worked on Master Data Management (MDM), Hub Development, extract, transform, cleansing, loading the data onto the staging and base object tables
- Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
- Wrote Shell Scripts for Data loading and DDL Scripts.
- Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ
- Designing and coding the automated balancing process for the feeds that go out from the data warehouse.
- Implemented the automated balancing and control process, enabling audit and balance-and-control checks on the ETL code.
- Hands-on experience with HIPAA transactions like 270, 271, 272, 273, 274, 275, 276, 277, 834, 835, 837, etc.
- Improving the database access performance by tuning the DB access methods like creating partitions, using SQL hints, and using proper indexes.
- Integrated all the jobs using complex mappings, including mapplets and workflows, built with Informatica PowerCenter Designer and Workflow Manager.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
- Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
- Mapped client processes/databases/data sources/reporting software to HPE's XIX X12 processing systems(BizTalk/Visual Studio/Oracle SQL/MS SQL/C#/.Net/WSDL/SOAP/Rest/API/XML/XSLT).
- Used the debugger in identifying bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
- Analyzed the impact and the changes required to incorporate the standards into the existing data warehousing design.
- Followed the PDLC process to move the code across environments through proper approvals and source-control environments.
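The balancing step described above can be sketched as a simple count comparison between the outbound feed and the warehouse load; in practice both counts would come from SQL queries, so the numbers here are purely illustrative.

```shell
#!/bin/sh
# Illustrative audit/balance check: compare the row count written to an
# outbound feed against the row count loaded into the warehouse and
# flag any mismatch with a non-zero exit status for the scheduler.
balance_check() {
    feed_count="$1"; dw_count="$2"
    if [ "$feed_count" -eq "$dw_count" ]; then
        echo "BALANCED ($feed_count rows)"
        return 0
    else
        echo "OUT OF BALANCE: feed=$feed_count warehouse=$dw_count" >&2
        return 1
    fi
}
```

Returning a non-zero status on mismatch lets the scheduler hold downstream feeds until the discrepancy is investigated.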
Environment: Informatica Power Center 9.0.1, Erwin 7.2/4.5, Business Objects XI, Unix Shell Scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, Flat Files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0.
ETL Developer / Analyst
- Involved in the requirement definition and analysis support for Data warehouse efforts.
- Documented and translated user requirements into system solutions; developed implementation plan and schedule.
- Designed fact and dimension tables for Star Schema to develop the Data warehouse.
- Extracted the data from Teradata, SQL Server, Oracle, Files, and Access into Data warehouse.
- Created dimensions and facts in physical data model using ERWIN tool.
- Used Informatica Designer to create complex mappings using different transformations to move data to a Data Warehouse
- Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Source Qualifier, Look up, Aggregator, Stored Procedure, Update Strategy, Joiner, Filter.
- Scheduling the sessions to extract, transform, and load data into the warehouse database based on business requirements.
- Loaded the flat-file data into the data warehouse using Informatica
- Created Global Repository, Groups, and Users, and assigned privileges using Repository Manager.
- Setting up Batches and sessions to schedule the loads at required frequency using Power Center Server Manager.
- Handled common data warehousing problems like tracking dimension change using SCD type2 mapping.
- Used e-mail task for on success and on-failure notification.
- Used decision task for running different tasks in the same workflow.
- Assisted team members with their various Informatica needs.
- Developed and maintained technical documentation regarding the extract, transformation, and load process.
- Responsible for the development of system test plans, test case creation, monitoring progress of specific testing activities against plan, and successfully completing testing activities within the requisite project timeframes.
Environment: Informatica Power Center 8.1, Erwin, Oracle 9i, UNIX, Sybase, MS SQL Server, Windows 2000.