ETL MDM Lead Developer Resume
NJ
SUMMARY
- Over 12 years of IT experience in all aspects of software development and systems management, including project analysis, development, deployment, testing, implementation and documentation, in industries such as Banking, Financial Services, Retail and HR.
- Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
- Good experience with data sources, data profiling and data validation based on business and functional requirements.
- Working knowledge of Dimensional Data Modeling, Star Schema/Snowflake Modeling, Fact and Dimension tables, and Physical and Logical Data Modeling using Erwin and MS Visio.
- Highly experienced in the data mart life cycle and in ETL procedures to load data from different sources into data marts and data warehouses using PowerCenter Repository Manager, Designer, Server Manager, Workflow Manager and Workflow Monitor. Extensively worked with Informatica mappings, sessions and workflows.
- Designed, installed and configured core Informatica MDM Hub components: Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD and Data Modeling.
- Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, reporting and monitoring.
- Experience in Real Time Data integration using Informatica Power Exchange Change Data Capture (CDC) technique.
- Worked on DataStage tools including DataStage Designer, DataStage Director and DataStage Administrator.
- Used the BODS Repository Manager, Designer and Management Console; extensively worked on BODS dataflows, workflows and jobs.
- Extensively worked on mappings for financial and banking data loads.
- Significant experience with XML, XML Schema and the XML Generator.
- Extensive experience with DataStage Parallel Extender (EE) using stages such as Join, Lookup, File Set, Change Capture, Filter, Funnel, Copy, Column Generator, Peek, Dataset, Sequential File, Oracle Enterprise, Merge, Transformer, Aggregator, Remove Duplicates and XML Input/Output.
- Worked on Metadata Management and Business Glossary in IBM DataStage 8.1.
- Significant experience using the SAP IDoc Interpreter and IDoc Prepare transformations for data extraction.
- Used SAP BO XI R2 and BO 6.x for universe creation and report generation (WebI, cross tab, slice and dice, drill up/drill down, etc.).
- Have experience in writing UNIX Shell scripts.
- Extensively worked with Oracle, DB2, SQL Server and Sybase.
- Extensively worked with Oracle PL/SQL stored procedures, functions and triggers, and was involved in query optimization.
- Extensively worked with the Teradata MLoad, FLoad and TPump utilities.
- Performed SQL tuning across all of the above databases.
- Provided Level 2 and Level 3 production support; implemented migration and change tickets in production.
- Strong analytical skills and excellent oral and written communication skills.
- Good at evaluating business needs and architecting a solution, including project time, cost and resource estimation, system design and specification.
TECHNICAL SKILLS
Data Warehousing: Informatica 5.x/6.x/7.x/8.x/9.x (PowerCenter/PowerExchange), Informatica IDQ/MDM, SAP Data Services 3.2, DataStage 7.5, SAS
Databases: Oracle 7.x/8.x/9.x/10g/11g, Sybase, DB2, SQL Server, Teradata 12/13.x, Netezza 6.x/7.x
Database Tools: Toad, Quest Central, SQL*Plus, SqlDbx
Reporting Tools: Business Objects 6.x, XI R2
Utilities: MLOAD, FLOAD, BTEQ, SQL Loader
Data Modeling Tools: Erwin ERX 3.5, 4.0, Visio
Languages and Version Controls: SQL, PL/SQL, HTML, C, C++, Unix Shell Scripting and COBOL; XML, XSL and XSD; Microsoft Visual Source Safe 6.0
Operating Systems: Windows 9x/NT/2000/XP, UNIX IBM-AIX 5.1/4.1/3.2, Sun Solaris 2.6, MS-DOS 6.22.
PROFESSIONAL EXPERIENCE
Confidential, NJ
ETL MDM Lead Developer
Responsibilities:
- Processed claims in Master Data Management initiatives using the Informatica Master Data Management and Data Quality products
- Interacted with system analysts to understand the requirements
- Created landing tables, base tables, staging tables according to the data model and number of source systems
- Defined the foreign key relationships among the base objects and defined look ups on the staging tables and packages, query groups and queries/custom queries
- Developed the mappings and used the necessary cleanse functions by analyzing the data using the data quality tool
- Concatenated the columns to get unique values loaded into the Pkey source of the staging tables
- Fixed issues encountered while loading data into stage and base tables, working with Informatica Support as needed.
- Defined trust scores for the source systems based on an understanding of the business process
- Created match rule sets for the base objects by defining the match path components, match columns and rules.
- Analyzed and profiled the data, came up with the initial match rules and went through several iterations of match tuning.
- Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area children, IDD display packages in the hub, and search queries
- Created batch jobs to load data to landing, staging and base tables.
- Created low-level design and ETL design documents according to the requirements.
- Created custom SQL code to enhance the process.
- Created SQL scripts to remove redundant data from base tables.
- Created UNIX scripts to execute the SQL scripts.
- Reverse-engineered process flow diagrams from the ETL code in order to perform data quality measures.
- Performance-tuned heavy queries and optimized Informatica MDM jobs.
- Worked with developers to ensure design and implementation conform to internal standards, make use of best practices and perform well.
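The UNIX wrappers that executed the cleanup SQL scripts above would look roughly like the sketch below. The script name, log path and SQL_CLIENT default are illustrative assumptions, not the actual project values.

```shell
#!/bin/sh
# Hypothetical wrapper: run a SQL script through a configurable client,
# capture output to a log, and report a non-zero status on failure.
run_sql_script() {
    script="$1"
    log="${2:-/tmp/$(basename "$script").log}"
    # SQL_CLIENT is an assumed hook; defaults to a silent sqlplus session.
    client="${SQL_CLIENT:-sqlplus -s user/pass@db}"
    if [ ! -r "$script" ]; then
        echo "ERROR: cannot read $script" >&2
        return 1
    fi
    $client < "$script" > "$log" 2>&1
    status=$?
    if [ "$status" -ne 0 ]; then
        echo "ERROR: $script failed with status $status (see $log)" >&2
    fi
    return "$status"
}
```

A scheduler would typically call `run_sql_script remove_redundant.sql` and act on the exit status.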
Environment: Informatica MDM 9.5, Oracle 11g, E/R Studio 8.0, Putty, Linux, Subversion, Shell Scripts, Quest Toad, PL/SQL.
Confidential, NY
ETL Lead Developer
Responsibilities:
- Successfully completed customer- and product-centric Master Data Management initiatives using the Informatica Master Data Management product
- Interacted with data analysts to understand the requirements
- Utilized Informatica IDQ for initial data profiling and for matching/removing duplicate data.
- Identified and eliminated duplicates in datasets through the IDQ Edit Distance, Jaro Distance and Mixed Field Matcher components, enabling the creation of a single view and helping control mailing-list costs by preventing multiple pieces of mail.
- Created landing tables, base tables, staging tables according to the data model and number of source systems
- Defined the foreign key relationships among the base objects and defined look ups on the staging tables and packages, query groups and queries/custom queries
- Developed the mappings and used the necessary cleanse functions by analyzing the data using the data quality tool
- Concatenated the columns to get unique values loaded into the Pkey source of the staging tables
- Fixed issues encountered while loading data into stage and base tables, working with Informatica Support as needed.
- Defined trust scores for the source systems based on an understanding of the business process
- Created match rule sets for the base objects by defining the match path components, match columns and rules.
- Analyzed and profiled the data, came up with the initial match rules and went through several iterations of match tuning.
- Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area children, IDD display packages in the hub, and search queries
- Analyzed existing ETL jobs and identified the gaps.
- Worked on project estimation, space requirement, validation of ETL process.
- Create low level design and ETL design doc according to requirement.
- Led and mentored the efforts of rating staging area creation.
- Created Informatica jobs, sessions, workflows to load organization related fact and dimension tables.
- Implemented restart logic and error handling for the load process in Informatica.
- Created parameter files and parameterized all jobs.
- Created release notes and database scripts for production support and job scheduling.
- Created UNIX scripts to load flat files and parameter files.
- Reverse-engineered process flow diagrams from the ETL code in order to perform data quality measures.
- Performance-tuned heavy queries and optimized Informatica PowerCenter workflows and stored procedures.
- Worked with developers to ensure design and implementation conform to internal standards, make use of best practices and perform well.
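Generating per-run parameter files like those mentioned above is commonly scripted. The sketch below is a hedged illustration: the folder, workflow, connection and parameter names are placeholder assumptions, not the actual project values.

```shell
#!/bin/sh
# Illustrative generator for an Informatica-style parameter file keyed by
# load date. All names below are hypothetical.
gen_param_file() {
    out="$1"
    load_date="${2:-$(date +%Y%m%d)}"
    cat > "$out" <<EOF
[FOLDER_NAME.WF:wf_load_org_dims]
\$\$LOAD_DATE=$load_date
\$DBConnection_SRC=ORA_SRC
\$DBConnection_TGT=ORA_TGT
\$InputFile_Org=/data/in/org_$load_date.dat
EOF
}
```

The scheduler would regenerate the file before each workflow run so every job picks up the current load date and file paths.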
Environment: Informatica 9.5, Informatica MDM/IDQ 9.5, Oracle 11g/10g, Netezza 7.2, Embarcadero E/R Studio 8.0, Putty, Linux, Microsoft SharePoint, Autosys scheduling, Shell Scripts, Quest Toad, PL/SQL.
Confidential, MN
ETL Lead Developer
Responsibilities:
- Worked with the Enterprise Architecture team to find gaps between the Actimize system and overall enterprise architecture requirements.
- Led and mentored the data quality completeness initiative for the Informatica-based ETL architecture.
- Assessed the Actimize system to determine whether ETL best practices were followed and submitted recommendations for improvement.
- Designed architecture solutions for Hornet data warehouse requirements to develop the ETL solution.
- Initiated design of the ETL architecture solution from the existing ETL processes, which had to be analyzed as part of the quality assessments.
- Reverse-engineered process flow diagrams from the ETL code in order to perform data quality measures.
- Evaluated the current data warehouse/ETL environment for use in a spend analytics solution deployment and provided a gap assessment.
- Reverse-engineered the ETL database into physical data models using the E/R Studio tool.
- Created source-to-target mappings based on review of the existing code and data analysis.
- Created data flow diagrams for ETL processes scheduled to run on a daily basis.
- Performed a data quality assessment to identify hard-coded values, rejected records and reference data.
- Generated data profile reports using Data Explorer in Informatica 8.6.1 in order to understand the source and target data processed through the ETL processes for Actimize.
- Connected to the universe to get compliance data from the underlying tables.
- Created and analyzed reports using Actimize data and generated WebI reports.
- Created UNIX scripts to format and schedule the config files and to create the comparison files.
- Performance-tuned heavy queries and optimized Informatica PowerCenter workflows and stored procedures.
- Worked with developers to ensure design and implementation conform to internal standards, make use of best practices and perform well.
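A file-level check of the kind used in the data quality assessment above can be sketched in a few lines of shell. The pipe delimiter and column position are assumptions for illustration; the real profiling was done with Data Explorer.

```shell
#!/bin/sh
# Hypothetical quick profile of a pipe-delimited feed: counts total rows and
# rows with an empty value in a given column number.
profile_column() {
    file="$1"
    col="$2"
    awk -F'|' -v c="$col" '
        { total++; if ($c == "") empty++ }
        END { printf "rows=%d empty_col%d=%d\n", total, c, empty+0 }
    ' "$file"
}
```

Running such a check per feed gives a fast signal on completeness before the heavier profiling tools are brought in.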
Environment: Informatica PowerCenter/PowerExchange 9.1, Informatica 8.6.1, Oracle 11g/10g, Netezza 7.2, ER, SAP Business Objects XI R2, Embarcadero E/R Studio 8.0, Putty, Linux, Microsoft SharePoint, MS SQL Server 2008 R2, Autosys job scheduling, Informatica 7.1, Metadata Manager, Datanomics, Shell Scripts, Teradata, Quest Toad, PL/SQL.
Confidential, NY
ETL Architect
Responsibilities:
- Analyzed sources and targets and gathered the requirements from the users
- Created ETL design documents, source-to-target mapping documents and validation tables to log errors.
- Created jobs to implement business rules and load files to the target.
- Implemented DOMAIN validation to validate source column values, datatypes and formats, logging errors in the domain validation tables
- Implemented VALIDATION rules to check that trailer file column values match the actual data file column values, logging errors in the validation table.
- Coordinated with the ETL development team and fixed ETL issues.
- Tested the jobs extensively by hand and created unit and system test case documents.
- Created UNIX scripts to run the jobs and validate source files.
- Created SQL stored procedures to load the history tables.
- Provided Level 3 production support and handled service tickets for production support.
- Supported change migration and created the run book for support.
- Conducted code reviews and working sessions.
- Developed dataflows, workflows and jobs for loading data into Oracle tables.
- Created UNIX scripts to format and schedule the config files and to create the comparison files.
- Worked extensively with the Query, Map Operation, SQL, XML Pipeline, Validation, Merge, Data Transfer, Row Generation and Pivot transformations.
- Extensively worked with Data Services tools: Designer, Manager and Repository Manager.
- Tuned mappings for better performance.
- Implemented patches for Sybase and added adapters to read XLS sources.
- Retrieved data using universes, personal data files, stored procedures and SQL methods.
- Created complex and ad hoc reports using Business Objects.
- Designed and developed the ETL best practices document (development standards, file naming, metadata, design strategy, source formats, mapping design, security, session best practices, etc.)
- Developed UNIX Shell scripts to implement the ETL load and integration management process.
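The trailer-versus-data validation described above can be sketched as a small shell function. The trailer layout assumed here (a final `TRL|<record_count>` line) is illustrative, not the actual feed specification.

```shell
#!/bin/sh
# Hypothetical trailer check: compare the record count declared in the
# trailer line against the actual number of data rows in the file.
validate_trailer() {
    file="$1"
    expected=$(tail -1 "$file" | awk -F'|' '/^TRL/ { print $2 }')
    actual=$(( $(wc -l < "$file") - 1 ))   # data rows, excluding the trailer
    if [ -z "$expected" ] || [ "$expected" -ne "$actual" ]; then
        echo "VALIDATION ERROR: trailer=$expected actual=$actual" >&2
        return 1
    fi
    return 0
}
```

A failed check would typically log to the validation table and stop the load before any rows reach the target.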
Environment: SAP Data Services (Designer and Manager), Excel, Oracle, Sybase, UNIX Shell Scripting, SQL stored procedures to load HIST tables, BO R2 for reporting, job scheduling through UC4.
Confidential, NY
ETL Informatica Developer
Responsibilities:
- Gathered business user requirements and wrote technical design documentation for projects.
- Prepared ETL design specification documents with information on the implementation of business logic and specifications of the job flow.
- Coordinated with the ETL development team and fixed ETL issues.
- Conducted code reviews and working sessions.
- Provided Level 3 production support and handled tickets to support production.
- Developed mappings and workflows for loading data into flat files and Oracle tables.
- Created UNIX scripts to format and schedule the source files.
- Created and reviewed unit, system and integration test cases for all cycles.
- Worked extensively with the SAP IDoc Interpreter and SAP IDoc Prepare transformations.
- Created complex mappings and tuned them for better performance.
- Used the Informatica Workflow Manager to schedule daily and weekly jobs.
- Coordinated with the QA team and the Informatica development groups to support end-to-end testing of new release code from development to production.
- Used Oracle PL/SQL packages.
Environment: Informatica PowerExchange 8.1, PowerCenter 8.6 (Mapping Designer, Workflow Manager and Workflow Monitor), Oracle, SQL Server, SQL, PL/SQL, UNIX Shell Scripting
Confidential, NJ
Sr. Data Warehouse Developer
Responsibilities:
- Extensively involved in the development processes of creating the options position feed for downstream systems; gathered business user requirements and wrote technical design documentation for projects.
- Prepared ETL design specification documents with information on the implementation of business logic and specifications of the job flow.
- Coordinated with the ETL development team and fixed ETL issues.
- Conducted code reviews and working sessions.
- Provided Level 3 production support and handled tickets to support production.
- Developed mappings and workflows for loading data into flat files.
- Created UNIX scripts to format and schedule the source files.
- Created and reviewed unit, system and integration test cases for all cycles.
- Worked extensively with the Normalizer, XML, Expression, Lookup, Joiner, Sorter, Aggregator, Filter, Union and Router transformations.
- Extensively used mapping variables, mapping parameters and parameter files in the mappings.
- Created a BO universe to fetch data for WebI reports.
- Created various types of reports using Business Objects functionality such as cross tab, slice and dice, drill up, drill down and master/detail.
- Used Autosys for daily and weekly job scheduling.
- Coordinated with the QA team and the Informatica development groups to support end-to-end testing of new release code from development to production.
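The source-file formatting scripts mentioned above usually normalize feeds before load. The sketch below shows one plausible form, under the assumption that the feeds arrived with DOS line endings and trailing whitespace; the paths are placeholders.

```shell
#!/bin/sh
# Hypothetical pre-load formatter: strip carriage returns and trailing
# whitespace so records match the expected fixed layout.
format_source_file() {
    in="$1"
    out="$2"
    tr -d '\r' < "$in" | sed 's/[[:space:]]*$//' > "$out"
}
```

Such a step would run ahead of the workflow so the session reads a clean file.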
Environment: InformaticaPowerCenter8.1/8.6 (Mapping Designer, Workflow Manager and Workflow Manager),Informatica Power Exchange 8.1, DB2, Oracle, Teradata,SQL, PL/SQL, UNIX Shell Scripting, Sybase,BO XI R2,Autosys schedulers