- 8+ years of diversified information technology experience in the field of Software Development, Project Management, Data Warehousing, Data Integration, Informatica MDM, ETL and Application/Production Support.
- Experience in Master Data Management (MDM) Multidomain Edition Hub Console configuration and Informatica Data Director (IDD) application creation.
- Expertise in the ETL tool Informatica, with extensive experience in PowerCenter client tools including Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Extensively worked with complex mappings using various transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Union, Connected/Unconnected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, and Update Strategy, along with reusable transformations and user-defined functions.
- Experienced in using Informatica Data Quality (IDQ) tools for data analysis and data profiling in the IDQ Developer client, applying rules and developing mappings to move data from source to target systems.
- Extensive experience with the Address Validator transformation for North American address validation.
- Extensively worked with relational database systems such as Oracle 11g/10g/9i/8i, MS SQL Server, and Teradata, and with source files such as flat files, XML files, and COBOL files.
- Configured the Smart Search and Entity 360 in IDD for Product MDM.
- Excellent background in implementing business applications and in applying RDBMS and OOP concepts.
- Played a key role in creating / managing the Data Quality Process and in the development of DQ Rules, Profiles, Profile Models and Scorecard for various business requirements.
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit, using its analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
- Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions.
- Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle
- Sound knowledge of Linux/UNIX and shell scripting, with experience in command-line utilities such as pmcmd to execute workflows in non-Windows environments.
- Implemented change data capture (CDC) using Informatica PowerExchange to load data from the Clarity DB to the Teradata warehouse.
- Experience in complex quality rule and index design, development, and implementation patterns, covering cleanse, parse, standardization, validation, scorecard, exception, notification, and reporting, with both ETL and real-time considerations.
- Experience in integrating various data sources with multiple relational databases such as DB2, Oracle, SQL Server, MS Access, and Teradata, along with flat files, XML files, and other sources such as Salesforce.
- Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX/Linux scripts, and scheduling tools such as Control-M v7/v8 and CA WA Workstation (ESP).
- Experience in configuring ActiveVOS for manager/data steward team approvals.
- Extensive knowledge of health information and health care services regulatory environment including HIPAA, Medicaid/Medicare, and EDI.
- Designed Applications according to the customer requirements and specifications.
- Experience in using automation scheduling tools such as Autosys, Tidal, Control-M, and Tivoli Maestro scripts.
- Excellent Interpersonal and Communication skills coupled with strong technical and problem-solving capabilities.
- Excellent analytical, problem solving, technical, project management, training, and presentation skills.
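The pmcmd usage mentioned above can be sketched as a minimal wrapper script; every name below (Integration Service, domain, folder, workflow, credentials) is a hypothetical placeholder, not an actual project value:

```shell
#!/bin/sh
# Hypothetical wrapper for launching an Informatica workflow with pmcmd.
# All names below are placeholders for illustration only.
INFA_SERVICE="IS_DEV"                # Integration Service name (assumed)
INFA_DOMAIN="Domain_DEV"             # Informatica domain name (assumed)
INFA_FOLDER="DW_LOADS"               # repository folder (assumed)
WORKFLOW="wf_m_load_customers"       # workflow to run (assumed)
INFA_USER="${INFA_USER:-etl_user}"   # credentials normally come from a secured file
INFA_PASS="${INFA_PASS:-changeme}"

# -wait blocks until the workflow finishes, so the calling scheduler
# (Autosys, Tidal, Control-M, ...) can act on pmcmd's exit code.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -p $INFA_PASS -f $INFA_FOLDER -wait $WORKFLOW"
echo "$CMD"
# Uncomment on a host with the Informatica client installed:
# $CMD || exit 1
```

A scheduler job would call this script and treat a non-zero exit code as a failed load.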
Operating System: UNIX, Linux, Windows
Programming and Scripting: C, C++, Java, Python, .NET, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.
Specialist Applications & Software: Informatica PowerCenter 10.1/10/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica PowerExchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, IDD, Salesforce, DataStage, etc.
Data Modeling (working knowledge): Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Physical, Logical Data Modeling, and ER Diagrams.
Databases tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced query tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle Sybase), Visio, ERWIN
Scheduling tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.
Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (ver.9, ver.10)
Software Development Methodology: Agile, Waterfall.
Domain Expertise: Publishing, Insurance/Finance, HealthCare
Others (working knowledge on some): OBIEE RPD creation, OBIEE, ECM, Informatica Data Transformation XMAP, DAC, Rational Clear Case, WS-FTP Pro, DTD.
RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle 11g/10g/9i/8i, Teradata, IBM DB2 UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000.
Sr. Informatica Developer / MDM Consultant
- Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
- Worked closely with the Data Integration team to perform validations both on the Informatica MDM hub and Entity 360.
- Worked with Address Doctor and its related cleansing functions.
- Designed, documented, and configured the Informatica MDM Hub to support loading and cleansing of data.
- Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Created queries and packages to view the data in IDD, Data Manager, or Merge Manager.
- Configured Entity 360 view to display the data of the root node of the composite objects.
- Configured IDD and used Entity 360 to display the records that match the entity.
- Developed several complex IDQ mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.
- Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
- Provided business users with a “Send to Oracle” feature in the IDD and Entity 360 applications, utilizing smart search with a User Exit and an Enterprise Service Bus integration component with JMS queue configuration and web service interaction using SIF.
- Configured ActiveVOS BPM with MDM IDD for a one-step approval workflow for product (style hierarchy) approval by the manager/data steward team.
- Developed external applications (file upload, user management) using SIF APIs and integrated them into IDD.
- Integrated ActiveVOS with IDD and the MDM Hub.
- Experienced in installing, configuring, upgrading, and administering the complete Informatica MDM Hub, Process Server, ActiveVOS, and IDD stack on versions 9.x, 9.7, and 10.2.
- Experienced in creating, configuring, and registering (administering) system and operational ORSs for the MDM Hub and Cleanse/Process Servers.
- Experienced in installing, configuring, upgrading, and administering WebLogic and JBoss application servers.
- Performed extensive JVM performance tuning for WebLogic, JBoss, and the MDM application.
- Experienced in complete end-to-end MDM upgrades from 9.x to 9.7 and 10.2.
- Experienced in installing and configuring the Address Doctor engine and reference data on MDM/IDQ/IPC.
- Worked on ActiveVOS human task for IDD approval process.
- Migrated servers from 1and1 to AWS Elastic Compute Cloud (EC2) and databases to RDS.
- Imported the IDQ address-standardization mappings into Informatica Designer as mapplets.
- Utilized Informatica IDQ to complete initial data profiling and to match and remove duplicate data.
- Created, deployed, and scheduled jobs in the Tidal scheduler for the integration, user acceptance testing, and production regions.
- Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
- Used the Teradata FastLoad utility to load data into tables.
- Converted all jobs scheduled in Maestro to the Autosys scheduler per the requirements.
- Wrote UNIX shell scripts for Informatica pre-session and post-session tasks, and Autosys scripts for scheduling the jobs (workflows).
- Performed tuning of queries, targets, sources, mappings, and sessions.
- Used Linux scripts and the necessary test plans to ensure successful execution of the data loading process.
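The FastLoad usage above can be illustrated with a small shell sketch that generates a control script; the database, table, column, and file names and the LOGON credentials are all dummy placeholders:

```shell
#!/bin/sh
# Hypothetical Teradata FastLoad control script; every identifier and
# path below is a placeholder, and the credentials are dummies.
FL_SCRIPT=/tmp/stg_customer_load.fl
cat > "$FL_SCRIPT" <<'EOF'
LOGON tdprod/etl_user,etl_pass;
DATABASE stage_db;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       cust_dob  (VARCHAR(10))
  FILE = /data/in/customer.dat;
BEGIN LOADING stg_customer ERRORFILES stg_customer_et, stg_customer_uv;
INSERT INTO stg_customer VALUES (:cust_id, :cust_name, :cust_dob);
END LOADING;
LOGOFF;
EOF
echo "wrote $FL_SCRIPT"
# Run on a host with Teradata Tools and Utilities installed:
# fastload < "$FL_SCRIPT"
```

FastLoad targets empty staging tables and tracks rejected rows in the two ERRORFILES tables, which is why it suits bulk loads into a staging area.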
Environment: Informatica PowerCenter 10.1, UNIX, SQL, IDQ, IDE, IDD, CDC, MDM, Java, Linux, Perl, AWS, WinSCP, Shell, PL/SQL, Netezza, Teradata, Collibra, Microsoft SQL Server 2008, and Microsoft Visual Studio.
Confidential, Columbia, MD
Sr. Informatica Developer / Analyst
- Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment.
- Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.
- Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
- Worked with Informatica PowerCenter 9.x tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.
- Created and configured workflows, worklets & Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
- Developed Informatica Data Director (IDD) applications and queries used for Data Steward Analysis.
- Configured lookup tables in the IDD applications.
- Created Queries and performed the Manual match and Merge operations in IDD.
- Created a SIF API to access the Master Data Management (MDM) Hub from an external application.
- Hands-on experience with HIPAA transactions such as 270, 271, 272, 273, 274, 275, 276, 277, 834, 835, and 837.
- Worked on Database migration from Teradata legacy system to Netezza and Hadoop.
- Worked in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS).
- Developed and implemented EDI applications to process health care transactions per HIPAA standards.
- Used Teradata utilities (BTEQ, MultiLoad, and FastLoad) to maintain the database.
- Worked with different scheduling tools such as Tidal, Tivoli, Control-M, and Autosys.
- Created Tivoli Maestro jobs to schedule Informatica Workflows.
- Built a re-usable staging area in Teradata for loading data from multiple source systems using template tables for profiling and cleansing in IDQ.
- Actively involved in data validations and unit testing to make sure data is clean and standardized before loading into MDM landing tables.
- Actively involved in the exception handling process, using the IDQ Exception transformation after loading data into MDM and notifying the Data Stewards of all exceptions.
- Worked with Autosys as the job scheduler, running the created applications and their workflows at selected recurring times.
- Generated PL/SQL and Shell scripts for scheduling periodic load processes.
- Designed and developed UNIX scripts for creating, dropping tables which are used for scheduling the jobs.
- Invoked Informatica using "pmcmd" utility from the UNIX script.
- Wrote pre-session shell scripts to check session mode (enable/disable) before running or scheduling batches.
- Supported a 24x7 on-call rotation, with a strong grip on the Tivoli scheduling tool.
- Involved in Production support activities like batch monitoring process in UNIX.
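The pre-session enable/disable check described above can be sketched as follows; the flag-file convention, path, and names are assumptions for illustration, not the actual implementation:

```shell
#!/bin/sh
# Hypothetical pre-session check: a support team drops a flag file to
# disable the batch; the flag path and name are placeholders.
session_enabled() {
    # the session is enabled when the disable-flag file is absent
    [ ! -f "$1" ]
}

FLAG="${FLAG_DIR:-/var/tmp}/informatica_batch_disabled.flag"
if session_enabled "$FLAG"; then
    echo "session enabled"
    # the pmcmd startworkflow call would be invoked here
else
    echo "session disabled: skipping run"
    exit 1
fi
```

Exiting non-zero lets the scheduler (e.g. Tivoli) mark the job as skipped instead of silently running a disabled batch.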
Environment: Informatica PowerCenter 9.6.1, UNIX, Linux, Perl, Shell, MDM, IDQ, IDD, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0.
Sr. ETL Developer / Analyst
- Worked with Business Analysts (BAs) to analyze data quality issues, find the root cause of each problem, and identify the proper solution to fix the issue.
- Documented the process that resolves each issue, covering analysis, design, construction, and testing for data quality issues.
- Made data model changes and other changes to the transformation logic in existing mappings according to the business requirements for incremental fixes.
- Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
- Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for rule creation.
- Created Informatica workflows and IDQ mappings for both batch and real time.
- Extracted data from various heterogeneous sources such as DB2, Salesforce, mainframes, Teradata, and flat files using Informatica PowerCenter, and loaded the data into the target database.
- Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.
- Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality.
- Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Filter, and Union in developing the mappings to migrate the data from source to target.
- Used Connected and Unconnected Lookup transformations and lookup caches to look up data from relational tables and flat files. Used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
- Extensively implemented SCD Type 2 mappings for change data capture (CDC) in the EDW.
- Involved in unit testing, integration testing, and data validation; wrote SQL queries for back-end testing.
- Created and ran pre-existing and debug sessions in the Debugger to monitor and test sessions prior to their normal run in the Workflow Manager.
- Extensively worked in migrating the mappings, worklets and workflows within the repository from one folder to another folder as well as among the different repositories.
- Created mapping parameters and variables and wrote parameter files.
- Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
- Worked on scheduling jobs and monitoring them through Control-M and the CA scheduler tool (Autosys).
- Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
- Worked with the SCM code management tool to move code to production.
- Extensively worked with session logs and workflow logs for error handling and troubleshooting.
Environment: Informatica PowerCenter 9.0.1, Erwin, Teradata, Tidal, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, IDQ, MDM, Toad, and UNIX shell scripts.
Sr. Informatica Developer/IDQ/MDM
- Designed the dimensional model and data load process using SCD Type 2 for quarterly membership reporting.
- Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
- Generated data feeds from the analytical warehouse using the required ETL logic to handle data transformations and business constraints while loading from source to target layouts.
- Worked on Master Data Management (MDM) Hub development: extracting, transforming, cleansing, and loading data onto the staging and base object tables.
- Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
- Wrote Shell Scripts for Data loading and DDL Scripts.
- Worked with the Informatica Data Quality (IDQ) toolkit, using its analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
- Designed and coded the automated balancing process for the feeds that go out from the data warehouse.
- Implemented the automated balancing and control process, enabling audit, balance, and control for the ETL code.
- Involved in HIPAA/EDI medical claims analysis, design, implementation, and documentation, including HIPAA-compliant X12N 837 transaction testing.
- Improved database access performance by tuning access methods, such as creating partitions, using SQL hints, and using proper indexes.
- Integrated all jobs using complex mappings, including mapplets and workflows, in the Informatica PowerCenter Designer and Workflow Manager.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
- Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
- Mapped client processes, databases, data sources, and reporting software to HPE’s XIX X12 processing systems (BizTalk, Visual Studio, Oracle SQL, MS SQL, C#/.NET, WSDL, SOAP, REST APIs, XML, XSLT).
- Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
- Analyzed the impact of, and changes required for, incorporating the standards into the existing data warehouse design.
- Followed the PDLC process to move code across environments through proper approvals and source-controlled environments.
- Managed source control using SCM.
Environment: Informatica PowerCenter 9.0.1, Erwin 7.2/4.5, Business Objects XI, UNIX shell scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, flat files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0.