
Sr. Informatica Developer/MDM Consultant Resume


North Chicago, IL

PROFESSIONAL SUMMARY:

  • 8+ years of IT experience in Data Warehousing, Database Design and ETL Processes across business domains including finance, telecom, manufacturing and health care.
  • Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large - scale Data warehouses using Informatica Power Center.
  • Worked extensively on ETL processes using Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality (IDQ) 9.6.1, Informatica MDM and Informatica B2B.
  • Extensively used ETL methodologies for supporting Data Extraction, Transformation and Loading process, in a corporate-wide-ETL solution using Informatica Power Center.
  • Extensively worked with Informatica Designer, Workflow Manager and Workflow Monitor for data loads.
  • Experience working with cloud computing on the Salesforce.com platform.
  • Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer.
  • Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in the E-commerce software, Utility, Pharmaceutical, Health Care, Insurance, Financial and Manufacturing industries.
  • Experience in development and maintenance of SQL, PL/SQL, Stored procedures, functions, analytic functions, constraints, indexes and triggers.
  • Experienced in IDQ (9.x, 9.5.1), handling LDOs and PDOs and some of the transformations to cleanse and profile the incoming data using the Standardizer, Labeler, Parser and Address Validator transformations.
  • 8 years of experience in using different versions of the Oracle database, like 11g/10g/9i/8i.
  • Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experience with the UNIX command line and Linux.
  • Proficient with Informatica Data Quality (IDQ) for data cleansing and massaging in the staging area.
  • Experience in creating batch scripts in DOS and Perl Scripting.
  • Experience in ETL development process using Informatica for Data Warehousing, Data migration and Production support.
  • Experience in both Waterfall and Agile SDLC methodologies.
  • Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) - Star/Snowflake schema, SCD, Surrogate keys and Normalization/De-normalization.
  • Data modeling experience in creating Conceptual, Logical and Physical Data Models using ERwin Data Modeler.
  • Experience with TOAD, SQL Developer database tools to query, test, modify, analyze data, create indexes, and compare data from different schemas.
  • Performed data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Worked on Slowly Changing Dimensions (SCD's) Types -1, 2 and 3 to keep track of historical data.
  • Knowledge of Data Analyzer tools like Informatica Power Exchange (Power Connect) to capture the changed data.
  • Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change Data Capture (CDC).
  • Experience in integration of various data sources like Oracle, DB2, flat files and XML files into the ODS, and good knowledge of Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
  • Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
  • Hands on experience in MDM development.
  • Involved in the design of Landing, Staging and Base tables in Informatica MDM.
  • Created MDM mappings and configured Match and Merge rules to integrate the data received from different sources.
  • Optimized the solution using various performance-tuning methods: SQL tuning, ETL tuning (i.e. optimal configuration of transformations, targets, sources, mappings and sessions), and database tuning using indexes, partitioning, materialized views, procedures and functions.
  • Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
  • Extensive knowledge in all areas of Project Life Cycle Development.
  • Strong analytical, verbal, written and interpersonal skills.

TECHNICAL SKILLS:

Operating System: UNIX, Linux, Windows

Programming and Scripting: C, C++, Java, Python, .NET, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.

Specialist Applications & Software: Informatica Power Center 10.1/10/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, etc.

Data Modeling (working knowledge): Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Physical, Logical Data Modeling, and ER Diagrams.

Database tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, ERWIN

Scheduling tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.

Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (ver.9, ver.10)

Software Development Methodology: Agile, Waterfall.

Domain Expertise: Publishing, Insurance/Finance, HealthCare

Others (working knowledge of some): OBIEE RPD creation, OBIEE, ECM, Informatica Data Transformation XMAP, DAC, Rational ClearCase, WS-FTP Pro, DTD.

RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle11g/10g/9i/8i, Teradata, IBM DB2, UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000.

PROFESSIONAL EXPERIENCE:

Confidential, North Chicago, IL

Sr. Informatica Developer/MDM Consultant

Responsibilities:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Worked closely with the Data Integration team to perform validations both on the Informatica MDM hub and Entity 360.
  • Experience working with the Address Doctor and its related cleansing functions.
  • Experience working with Pharmacy and Provider data coming from different data sources.
  • Worked closely with the Business and PDA teams as the Data Stewards while performing the Match and Merge tasks.
  • Communicated with business customers to discuss the issues and requirements.
  • Designed, documented and configured the Informatica MDM Hub to support loading and cleansing of data.
  • Involved in implementing the Land process of loading the customer/product data set into Informatica MDM from various source systems.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Developed several complex IDQ mappings using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets and Parameter files in Mapping Designer.
  • Profiled the data using Informatica Data Explorer (IDE) and performed a Proof of Concept for Informatica Data Quality (IDQ).
  • Used Informatica Power Center to load data from different data sources like XML, flat files, Oracle, Teradata and Salesforce.
  • Migrated servers from 1and1 to AWS Elastic Compute Cloud (EC2) and databases to RDS.
  • Refactored Java ETL code to provide several new features such as redundancy, error handling, automation, image manipulation (SCALR), and the addition of the AWS Java SDK to handle the transfer of files to S3.
  • Imported the IDQ address standardization mappings into Informatica Designer as Mapplets.
  • Utilized Informatica IDQ to complete initial data profiling and matching/removal of duplicate data.
  • Used relational SQL wherever possible to minimize the data transfer over the network.
  • Identified and validated the Critical Data Elements in IDQ.
  • Built several reusable components in IDQ using Standardizers and Reference tables that can be applied directly to standardize and enrich address information.
  • Effectively worked in an Informatica version-based environment and used deployment groups to migrate the objects.
  • Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
  • Extensively worked on Labeler, Parser, Key Generator, Match, Merge, and Consolidation transformations to identify the duplicate records.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
  • Extensively used various active and passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, and Look up Transformation, Update Strategy Transformation, Sequence Generator Transformation, and Aggregator Transformation.
  • Provided support and quality validation through test cases for all stages of unit and integration testing.
  • Created, deployed and scheduled jobs in the Tidal scheduler for the integration, user acceptance testing and production regions.
  • Designed workflows with many sessions using decision, assignment, event-wait and event-raise tasks, and used the Informatica scheduler to schedule jobs.
  • Used the Teradata FastLoad utility to load data into tables.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Converted all the jobs scheduled in Maestro to the Autosys scheduler as per the requirements.
  • Worked on maintaining the master data using Informatica MDM.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session tasks and Autosys scripts for scheduling the jobs (workflows).
  • Performed tuning of queries, targets, sources, mappings, and sessions.
  • Used Linux scripts and necessary test plans to ensure the successful execution of the data loading process.
  • Worked with the Quality Assurance team to build the test cases to perform unit, integration, functional and performance testing.
  • Used EME extensively for version control; involved in code reviews and performance-tuning strategies at the Ab Initio and database level.
  • Prepared the error handling document to maintain the error handling process, created test cases for the mappings developed, and then created the integration testing document.
  • Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Provided 24x7 production support for business users and documented problems and solutions for running the workflows.
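
The CDC-driven SCD Type 2 loads mentioned above can be sketched as a small in-memory example; the table layout (`cust_id`, `city`, effective dates, current flag) is illustrative, not taken from the actual project.

```python
from datetime import date

def apply_scd2(dimension, changes, today=date(2020, 1, 1)):
    """Apply SCD Type 2 logic: expire the current row for a changed
    business key and insert a new current version of the record."""
    for change in changes:
        key = change["cust_id"]
        current = [r for r in dimension
                   if r["cust_id"] == key and r["is_current"]]
        if current and current[0]["city"] == change["city"]:
            continue                 # CDC compare: no change, skip
        for row in current:          # expire the old version
            row["is_current"] = False
            row["end_date"] = today
        dimension.append({           # insert the new current version
            "cust_id": key, "city": change["city"],
            "start_date": today, "end_date": None, "is_current": True,
        })
    return dimension
```

An unchanged record passes through untouched, so history rows accumulate only when the tracked attribute actually changes.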

Environment: Informatica Power Center 10.1, UNIX, SQL, IDQ, IDE, CDC, MDM, Java, Linux, Perl, AWS, WinSCP, Shell, PL/SQL, Netezza, Teradata, Collibra, Microsoft SQL Server 2008, and Microsoft Visual Studio.

Confidential, Santa Clara CA

Sr. Informatica Developer / Analyst

Responsibilities:

  • Involved in all phases of the SDLC, from requirement gathering, design, development and testing to production and support for the production environment.
  • Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using the Informatica MDM Hub console.
  • Used Informatica Power Center for extraction, transformation and loading (ETL) of data from heterogeneous source systems into the target database.
  • Involved in extracting the data from the flat files and relational databases into the staging area.
  • Created Stored Procedures for data transformation purposes.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for data management.
  • Worked on Informatica Power Center 9.x tools - Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplet Designer and Transformations.
  • Used Informatica Power Center and Data Quality transformations - Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, Normalizer, Standardizer, Labeler, Parser, Address Validator (Address Doctor), Match, Merge and Consolidation - to extract, transform, cleanse, and load the data from different sources into DB2, Oracle, Teradata, Netezza and SQL Server targets.
  • Created and configured workflows, worklets & sessions to transport the data to the target warehouse Oracle tables using Informatica Workflow Manager.
  • Worked on Database migration from Teradata legacy system to Netezza and Hadoop.
  • Worked in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS).
  • Used Teradata utilities (BTEQ, MultiLoad, and FastLoad) to maintain the database.
  • Worked with different scheduling tools like Tidal, Tivoli, Control-M and Autosys.
  • Created Tivoli Maestro jobs to schedule Informatica Workflows.
  • Built a re-usable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.
  • Created profiles and scorecards to review data quality.
  • Actively involved in data validations and unit testing to make sure data is clean and standardized before loading into the MDM landing tables.
  • Actively involved in the exception handling process using the IDQ Exception transformation after loading the data into MDM, and notified the Data Stewards of all exceptions.
  • Worked with Autosys as the job scheduler, using it to run the created applications and respective workflows at selected recurring times.
  • Generated PL/SQL and shell scripts for scheduling periodic load processes.
  • Designed and developed UNIX scripts for creating and dropping the tables used for scheduling the jobs.
  • Invoked Informatica using the "pmcmd" utility from the UNIX script.
  • Wrote pre-session shell scripts to check session mode (enable/disable) before running/scheduling batches.
  • Involved in supporting a 24*7 rotation system, with a strong grip on the scheduling tool Tivoli.
  • Involved in production support activities like the batch monitoring process in UNIX.
  • Prepared unit test case documents.
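
Workflows like these are typically launched from UNIX wrapper scripts through the pmcmd utility; a minimal sketch of assembling such a command line (service, domain, folder and workflow names are placeholders):

```python
def build_pmcmd_start(service, domain, folder, workflow,
                      user_env="INFA_USER", pwd_env="INFA_PWD",
                      wait=True):
    """Assemble a pmcmd startworkflow command line of the kind
    invoked from pre/post-session UNIX scripts. Credentials come
    from environment variables (-uv/-pv) rather than plain text."""
    parts = [
        "pmcmd", "startworkflow",
        "-sv", service,     # Integration Service name
        "-d", domain,       # Informatica domain name
        "-uv", user_env,    # env var holding the repository user
        "-pv", pwd_env,     # env var holding the password
        "-f", folder,       # repository folder
    ]
    if wait:
        parts.append("-wait")  # block until the workflow completes
    parts.append(workflow)
    return " ".join(parts)
```

A wrapper script would run this string via the shell and inspect pmcmd's exit code to decide success or failure of the load.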

Environment: Informatica Power Center 9.6.1, UNIX, Linux, Perl, Shell, MDM, IDQ, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0.

Confidential, Columbus OH

Sr. ETL Developer / Analyst

Responsibilities:

  • Worked with Business Analysts (BA) to analyze the data quality issues and find the root cause of each problem, with the proper solution to fix the issue.
  • Documented the process that resolves the issue, involving analysis, design, construction and testing for data quality issues.
  • Involved in making the data model changes and other changes to the transformation logic in the existing mappings according to the business requirements for the incremental fixes.
  • Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for creating the rules.
  • Created Informatica workflows and IDQ mappings for - Batch and Real Time.
  • Extracted data from various heterogeneous sources like DB2, Salesforce, Mainframes, Teradata, and flat files using Informatica Power Center and loaded data into the target database.
  • Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.
  • Defined the Match and Merge rules in the MDM Hub by creating the Match strategy, Match columns and rules for maintaining data quality.
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Filter, and Union in developing the mappings to migrate the data from source to target.
  • Used Connected and Unconnected Lookup transformations and Lookup caches for looking up data from relational and flat files. Used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
  • Extensively Implemented SCD TYPE 2 Mappings for CDC (Change data capture) in EDW.
  • Involved in doing Unit Testing, Integration Testing, and Data Validation.
  • Extensively involved in the migration of Informatica mappings from Dev to SIT and from UAT to the Production environment.
  • Developed UNIX scripts to SFTP, archive, cleanse and process many flat files.
  • Created and ran pre-existing and debug sessions in the Debugger to monitor and test the sessions prior to their normal run in the Workflow Manager.
  • Extensively worked on migrating the mappings, worklets and workflows within the repository from one folder to another, as well as among the different repositories.
  • Created Mapping parameters and Variables and written parameter files.
  • Wrote SQL queries and did database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Worked on scheduling jobs and monitoring them through Control-M and the CA scheduler tool (Autosys).
  • Used SQL tools like TOAD to run SQL queries and validate the data in the warehouse.
  • Worked with the SCM code management tool to move the code to Production.
  • Extensively worked with session logs and workflow logs in doing error handling and troubleshooting.
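
The mapping parameters and parameter files mentioned above follow a simple INI-style layout; a sketch of generating one programmatically (the folder, workflow, session and parameter names here are hypothetical):

```python
def write_param_file(folder, workflow, session, params):
    """Render an Informatica parameter file section: a bracketed
    [Folder.WF:workflow.ST:session] heading followed by one
    $$name=value line per mapping parameter."""
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    for name, value in params.items():
        lines.append(f"$${name}={value}")
    return "\n".join(lines) + "\n"
```

A wrapper script would write this text to disk and pass its path to the workflow via pmcmd's parameter-file option, letting one mapping serve many runs.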

Environment: Informatica Power Center 9.0.1, Erwin, Teradata, Tidal, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, IDQ, MDM, Toad and UNIX shell scripts.

Confidential, Birmingham, AL

Sr. Informatica Developer/IDQ/MDM

Responsibilities:

  • Designed the dimensional model and data load process using SCD Type 2 for the quarterly membership reporting purposes.
  • Derived the dimensions and facts for the given data and loaded them at regular intervals as per the business requirement.
  • Generated the data feeds from the analytical warehouse using the required ETL logic to handle data transformations and business constraints while loading from source to target layout.
  • Worked on Master Data Management (MDM) Hub development: extracting, transforming, cleansing and loading the data onto the staging and base object tables.
  • Extracted data from multiple sources such as Oracle, XML, and flat files and loaded the transformed data into targets in Oracle and flat files.
  • Wrote shell scripts for data loading and DDL scripts.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Designed and coded the automated balancing process for the feeds that go out from the data warehouse.
  • Implemented the automated balancing and control process, enabling audit, balance and control for the ETL code.
  • Improved database access performance by tuning the DB access methods: creating partitions, using SQL hints, and using proper indexes.
  • Integrated all the jobs using complex mappings, including Mapplets and workflows, using Informatica Power Center Designer and Workflow Manager.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
  • Mapped client processes/databases/data sources/reporting software to HPE's XIX X12 processing systems (BizTalk/Visual Studio/Oracle SQL/MS SQL/C#/.NET/WSDL/SOAP/REST/API/XML/XSLT).
  • Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
  • Analyzed the impact and required changes to incorporate the standards in the existing data warehousing design.
  • Followed the PDLC process to move the code across the environments through proper approvals and source-control environments.
  • Source control using SCM.
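
The data cleansing and standardization work described above can be illustrated with a small address-standardization function; the suffix dictionary is a hard-coded stand-in for the reference tables an IDQ Standardizer would actually use.

```python
import re

# Illustrative street-suffix map; a real IDQ Standardizer would draw
# these expansions from managed reference tables, not code.
SUFFIXES = {"st": "Street", "ave": "Avenue",
            "rd": "Road", "blvd": "Boulevard"}

def standardize_address(raw):
    """Mimic a simple cleanse/standardize step: trim, collapse
    whitespace, title-case tokens, and expand common suffixes."""
    tokens = re.sub(r"\s+", " ", raw.strip()).split(" ")
    out = []
    for tok in tokens:
        key = tok.lower().rstrip(".")
        out.append(SUFFIXES.get(key, tok.title()))
    return " ".join(out)
```

Standardizing before match/merge makes duplicate detection far more reliable, since "123 main st." and "123 Main Street" collapse to one representation.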

Environment: Informatica Power Center 9.0.1, Erwin 7.2/4.5, Business Objects XI, UNIX shell scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, flat files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0.

Confidential, Rockville, MD

ETL Developer / Analyst

Responsibilities:

  • Involved in the requirement definition and analysis support for data warehouse efforts.
  • Documented and translated user requirements into system solutions; developed the implementation plan and schedule.
  • Designed fact and dimension tables for a Star Schema to develop the data warehouse.
  • Extracted the data from Teradata, SQL Server, Oracle, files, and Access into the data warehouse.
  • Created dimensions and facts in physical data model using ERWIN tool.
  • Used Informatica Designer to create complex mappings using different transformations to move data to a Data Warehouse
  • Developed mappings in Informatica to load the data from various sources into the data warehouse, using different transformations like Source Qualifier, Lookup, Aggregator, Stored Procedure, Update Strategy, Joiner, and Filter.
  • Scheduled the sessions to extract, transform and load data into the warehouse database based on business requirements.
  • Loaded the flat file data into the data warehouse using Informatica.
  • Created the Global Repository, Groups and Users, and assigned privileges using Repository Manager.
  • Set up batches and sessions to schedule the loads at the required frequency using Power Center Server Manager.
  • Handled common data warehousing problems like tracking dimension change using SCD type2 mapping.
  • Used the e-mail task for on-success and on-failure notifications.
  • Used the decision task for running different tasks in the same workflow.
  • Assisted team members with their various Informatica needs.
  • Developed and maintained technical documentation regarding the extract, transformation, and load process.
  • Responsible for the development of system test plans, test case creation, monitoring progress of specific testing activities against plan, and successfully completing testing activities within the requisite project timeframes.

Environment: Informatica Power Center 8.1, Erwin, Oracle 9i, UNIX, Sybase, MS SQL Server, Windows 2000.

Confidential

Informatica Developer

Responsibilities:

  • Involved in analysis, design, development, test data preparation, unit and integration testing, and preparation of test cases and test results.
  • Coordinating with client, Business and ETL team on development
  • Developed Batch jobs using extraction programs using COBOL, JCL, VSAM, Datasets, FTP to Load Informatica tables
  • Involved in the full project life cycle - from analysis to production implementation and support - with emphasis on identifying the source and source data validation, developing logic and transformations as per the requirement, creating mappings, and loading the data into the BI database.
  • Based on the business requirements, created Functional design documents and Technical design specification documents for the ETL process.
  • Designing and developing ETL solutions in Informatica Power Center 8.1
  • Designing ETL process and creation of ETL design and system design documents.
  • Developing code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
  • Used most of the common transformations, like Source Qualifier, Aggregator, Filter, Expression, Unconnected and Connected Lookups, and Update Strategy.
  • Created automated shell scripts to transfer files among servers using FTP and SFTP protocols and to download files.
  • Developed Informatica mappings, enabling the ETL process for large volumes of data into target tables.
  • Designed and developed processes to handle high volumes of data loading within each SLA.
  • Created workflows, worklets and tasks to schedule the loads at the required frequency using the Informatica scheduling tool.
  • Ensured acceptable performance of the data warehouse processes by monitoring, researching and identifying the root causes of bottlenecks.
  • Acted as a liaison between the application testing team and the development team in order to make timely fixes.
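
The inbound flat-file extract/transform work described above can be sketched with a small parser; the pipe-delimited layout and column names are assumptions for illustration, not the actual feed formats.

```python
import csv
import io

def load_flat_file(text, required=("cust_id", "amount")):
    """Parse a pipe-delimited inbound file with a header row,
    coerce amount to float, and route rows missing required
    fields to a reject list (simple ETL-style validation)."""
    reader = csv.DictReader(io.StringIO(text), delimiter="|")
    good, rejects = [], []
    for row in reader:
        if any(not row.get(col) for col in required):
            rejects.append(row)      # would go to an error/reject file
            continue
        row["amount"] = float(row["amount"])
        good.append(row)
    return good, rejects
```

Separating good rows from rejects up front mirrors the usual ETL pattern of a reject file plus an error-handling report, rather than failing the whole load on one bad record.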

Environment: Informatica Power Center 8.1, ETL, Business Objects, Oracle 10g/9i/8i, HP-UX, PL/SQL.
