Senior ETL & MDM Developer Resume
SUMMARY
- Experienced MDM & ETL Developer with 14+ years in the information technology industry and technical proficiency in data warehousing, business intelligence, business requirements analysis, design, data modeling, development, testing, and documentation.
- Domain experience in the financial, sales, manufacturing, insurance, education, recruiting, and oil industries.
- Expertise in Informatica MDM Hub match and merge rules, batch jobs, and batch groups.
- Expertise in Informatica MDM 9.x and 10.x, MDM Hub, and IDD.
- Hands-on experience with Informatica MDM Hub configuration, data mappings (landing, staging, and base objects), data validation, match and merge rules, and customizing/configuring Informatica Data Director (IDD) applications.
- Defined and configured landing tables, staging tables, base objects, lookups, query groups, queries, custom queries, packages, hierarchies, and foreign key relationships.
- Excellent knowledge of performance evaluation and change management principles.
- Experience integrating external business applications with the Informatica MDM Hub using batch processes, SIF, and message queues.
- Experience with basic supervised and unsupervised models using tools such as Python 3.6, Java, and IDQ.
- Experience with multiple ETL tools, including Informatica and Pervasive Data Integrator.
- Specific experience designing and developing ETL mappings, mapplets, transformations, sessions, and workflows in Informatica PowerCenter and Informatica Data Quality.
- Extensively worked with PowerExchange for Salesforce to pull data from and load data into Salesforce.
- Worked with various heterogeneous source systems, including Oracle, Teradata, MS SQL Server, flat files, and legacy systems.
- Expertise in exception-handling strategies that capture errors and referential-integrity violations during load processes and notify the source team of exception records.
- Practical understanding of data modeling (dimensional and relational concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables).
- Experience in performance tuning of databases and Informatica processing.
- Extensively created mapplets, common functions, reusable transformations, and lookups for better reusability.
- Proficient in SQL, PL/SQL, and UNIX shell scripts.
- Experience with Oracle, SQL Server, and Teradata databases.
- Knowledge of Teradata (V2R5, 12) utilities such as BTEQ, FastLoad, MultiLoad, TPump, and Queryman.
- Experience with reporting tools: Business Objects (BO XI R2) and Siebel Analytics (OBIEE).
- Expertise in Informatica and DAC scheduling tools.
- Expertise in best practices and development standards for Informatica implementations.
- Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
- Experience supporting the architecture and design of data warehouses, data marts, and BI reports.
- Delivered all projects/assignments within specified timelines.
- Able to work independently, in a team, or in a fast-paced environment.
- Strong interpersonal, communication, and presentation skills, with the ability to interact with people at all levels.
TECHNICAL SKILLS
Hardware: IBM PC, Intel Pentium, Apple Macintosh, HP, Sun Solaris workstation.
Operating Systems: Windows NT/2000/XP/10, UNIX.
Methodology: Waterfall, Agile.
Methodology Tools: Jira, Confluence, Bamboo, GitHub, TFS, VersionOne.
Languages: C, SQL, PL/SQL, BTEQ, UNIX shell scripting.
Databases: Oracle, SQL Server, Teradata 12, MongoDB.
Tools: Informatica PowerCenter, IDQ 10.x/9.x/8.x, Master Data Management (MDM) 10.1/10.2, BO XI R2, BODI, Pervasive Data Integrator, DAC, OBIEE, MultiLoad, FastLoad, TPump, Erwin, Eclipse IDE.
Software: SQL*Loader, Mainframes, TOAD, MicroStrategy, Remedy, Rational tools (ClearQuest, ClearCase), ServiceNow, Salesforce.com, Force.com, Workbench, Java, XML, Python, Machine Learning, AI.
PROFESSIONAL EXPERIENCE
Confidential
Senior ETL & MDM Developer
Responsibilities:
- Gathered requirements from business analysts.
- Analyzed source data through profiling with Informatica Analyst.
- Created rules based on source data patterns and included them in the quality-metric process.
- Involved in creating the data model for the Account/Customer/Person domain (Party model).
- Involved in tokenization and trust configuration of the system.
- Created staging and base object (BO) tables and defined relationships between entities.
- Created match rules and configured each base object based on business functionality.
- Extensively used segment matching and match tuning.
- Defined match rule sets per requirements, tuned matching, and analyzed match tokens.
- Created mappings to stage and load data into staging tables and base objects using IDQ.
- Created multiple batch groups to load data into base objects.
- Created IDD subject areas, child subject areas, and cleanse functions.
- Configured Business Entities, Business Entity views, and transformations for Entity 360 views.
- Reviewed server logs and cleanse logs regularly to monitor real-time and batch group jobs.
- Designed the Dun & Bradstreet data flow in real time for both processes: enrichment and monitoring.
- Developed the enrichment and monitoring processes using WSDL services and XML feeds from Dun & Bradstreet, ingesting the results into base objects.
- Worked on both the Siebel-to-MDM 10.1 migration (single domain) and the MDM 10.1-to-10.2 migration (Party model, multi-domain).
- Designed and developed the IDL extracts for the 10.1-to-10.2 upgrade using IDQ.
- Developed Java (1.7/1.8) classes per requirements, such as Hub user exits and IDD user exits.
- Developed custom Java classes to implement cleanse functions used in the project.
- Developed custom web applications for data stewards using Java and the Spring Framework.
- Used JSON payloads for inbound accounts from Salesforce and outbound data via SIF services.
- Extensively used IDQ (9.6.1/10.2) to develop mappings, mapplets, workflows, and applications for full-load and incremental-load extracts from source systems such as PeopleSoft, Siebel, and Salesforce.
- Created rules to cleanse party names and validate phone numbers and email addresses, used in both batch and real time.
- Created exception logging for cases where the D&B lookup service did not meet criteria such as matchDataProfileText or the match-percentage standard.
- Exposed exceptions through a GUI using Informatica Analyst so that appropriate actions could be taken.
- Worked extensively with Dun & Bradstreet enrichment and monitoring services, including Detailed Company Profile and Corporate Linkage with Upward Family Tree and Full Family Tree licenses.
- Designed and developed the enrichment and monitoring process, landing data into relational tables for debugging and for status tracking when something goes wrong.
- Developed services using Address Doctor to meet business needs.
- Developed Customer/Person data ingestion into the Party model.
- Used different models, such as probabilistic and classifier models in Informatica IDQ, to decide whether or not to merge records.
- Deployed applications to different environments and supported them in production.
- Followed Agile methodology, working with the product owner and Scrum master to meet business requirements.
Environment: Informatica PowerCenter 10.2, Informatica Data Quality, Oracle, UNIX, Salesforce.com, Force.com, Workbench, MDM 10.1/10.2, JSON, SIF services, WSDL, REST services, Java, Spring Framework, Dun & Bradstreet integration, Machine Learning, ServiceNow, VersionOne, JIRA, Agile methodology.
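The probabilistic merge decision described above can be sketched as a weighted field-similarity score compared against a threshold. This is an illustrative Python sketch only, not the Informatica IDQ implementation; the field names, weights, and threshold are hypothetical, and real MDM match rules are configured in the Hub rather than hand-coded.

```python
from difflib import SequenceMatcher

# Hypothetical field weights for a party-matching rule (illustrative only).
WEIGHTS = {"name": 0.5, "email": 0.3, "phone": 0.2}
MERGE_THRESHOLD = 0.85

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted match score across the configured fields."""
    return sum(w * similarity(rec_a[f], rec_b[f]) for f, w in WEIGHTS.items())

def should_merge(rec_a: dict, rec_b: dict) -> bool:
    """Merge decision: score at or above the threshold means merge."""
    return match_score(rec_a, rec_b) >= MERGE_THRESHOLD

a = {"name": "Acme Corp", "email": "info@acme.com", "phone": "555-0100"}
b = {"name": "ACME Corp.", "email": "info@acme.com", "phone": "555-0100"}
print(should_merge(a, b))  # prints True
```

In practice a classifier model would learn the weights from steward-labeled merge decisions instead of fixing them by hand.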
Confidential
Senior ETL Developer
Responsibilities:
- Gathered requirements from the business analyst.
- Converted functional requirements into technical specification documents.
- Created mappings and sessions to generate TBAR reports.
- Created mappings to stage and load data into dimension and fact tables.
- Developed mappings using Salesforce as both source and target.
- Used Informatica's Salesforce options, such as queryAll() and CDC, with the SOQL language to pull data from Salesforce.
- Used Salesforce transformation options to insert, update, and upsert data into Salesforce.
- Extensively used Force.com and Workbench to query Salesforce sources.
- Used reusable transformations and created shortcuts.
- Performed initial, incremental, and daily loads to ensure data was loaded into tables in a timely and appropriate manner.
- Worked extensively on performance tuning of SQL, ETL, and other processes to optimize session performance.
- Wrote stored procedures and functions based on requirements.
- Implemented SCD Type I and Type II based on requirements.
- Created sessions, other tasks, and workflows per requirements.
- Designed restartability functionality for each mapping.
- Worked extensively with caches such as index cache, data cache, lookup cache (static, dynamic, and persistent), and joiner cache while developing mappings.
- Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
- Extensively used mapping parameters and variables based on requirements.
- Created reports for each business unit using reporting tools such as SSRS.
- Wrote UNIX scripts and PL/SQL procedures based on business requirements.
- Wrote and tested unit test cases for mappings, sessions, and workflows.
- Effectively used the target load plan option in Informatica Designer.
- Created mappings using reusable Data Quality rules.
- Extensively analyzed source data patterns using data profiling.
- Gathered requirements for Accounts, Contacts, and Prospect data.
- Installed and configured IDQ within CRM environments.
- Applied prospect data cleansing and de-duplication techniques within IDQ for Accounts and Contacts.
- Effectively used IDQ transformations such as Match and Address Validator.
- Debugged and performance-tuned targets, sources, mappings, and sessions; optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
- Deployed objects from the development to the production environment.
- Resolved post-production issues and delivered all assignments/projects within specified timelines.
- Provided knowledge transfer to end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
Environment: Informatica PowerCenter 9.6.1/9.1, Informatica Data Quality, SQL Server 2008, Oracle, UNIX, Salesforce.com, Force.com, Workbench, Agile methodology.
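The SCD Type II handling mentioned above keeps history by expiring the current dimension row and inserting a new version when a tracked attribute changes. A minimal Python sketch of that logic follows; the table layout and column names are hypothetical (the actual work was done in Informatica mappings, not code).

```python
from datetime import date

# Sentinel end date marking the current (open-ended) version of a row.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension: list, incoming: dict, load_date: date) -> None:
    """Apply an SCD Type II change: expire the current row for the key
    and insert a new current version when a tracked attribute changes."""
    for row in dimension:
        if row["cust_id"] == incoming["cust_id"] and row["end_date"] == HIGH_DATE:
            if row["address"] == incoming["address"]:
                return  # no change: nothing to do
            row["end_date"] = load_date  # expire the old version
            break
    # New key, or changed attribute: insert a new current version.
    dimension.append({
        "cust_id": incoming["cust_id"],
        "address": incoming["address"],
        "start_date": load_date,
        "end_date": HIGH_DATE,
    })

dim = [{"cust_id": 1, "address": "12 Oak St",
        "start_date": date(2010, 1, 1), "end_date": HIGH_DATE}]
apply_scd2(dim, {"cust_id": 1, "address": "98 Elm Ave"}, date(2012, 6, 1))
print(len(dim))  # prints 2: old version expired, new current version added
```

SCD Type I, by contrast, would simply overwrite the address in place, keeping no history.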
Confidential
Software Engineer
Responsibilities:
- Responsible for impact analysis, including upstream/downstream impacts.
- Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
- Created detailed technical specifications for data warehouse and ETL processes.
- Created objects per business standards, following naming conventions.
- Created complex mappings using mainframe source files and loaded the data into relational tables.
- Extracted data from flat files and relational sources into relational targets by creating mappings.
- Performed initial, incremental, and daily loads to ensure data was loaded into tables in a timely and appropriate manner.
- Worked extensively on performance tuning of SQL, ETL, and other processes to optimize session performance.
- Wrote stored procedures and functions based on requirements.
- Implemented SCD Type I and Type II based on requirements.
- Created sessions, other tasks, and workflows per requirements.
- Designed restartability functionality for each mapping.
- Worked extensively with caches such as index cache, data cache, lookup cache (static, dynamic, and persistent), and joiner cache while developing mappings.
- Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
- Extensively used mapping parameters and variables based on requirements.
- Created reports for each business unit using reporting tools such as SSRS.
- Wrote UNIX scripts and PL/SQL procedures based on business requirements.
- Wrote and tested unit test cases for mappings, sessions, and workflows.
- Implemented workflow scheduling using Informatica.
- Debugged and performance-tuned targets, sources, mappings, and sessions; optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
- Deployed objects from the development to the production environment.
- Resolved post-production issues and delivered all assignments/projects within specified timelines.
- Provided knowledge transfer to end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
Environment: Informatica 8.6, SQL Server 2008, SSRS, Mainframes, Rational tools (ClearCase/ClearQuest), Waterfall methodology.