
CDW/MDM Support/Integration Lead Resume


Boston

SUMMARY

  • 10+ years of IT experience in Business Intelligence, ETL, Data Quality, Data Warehousing, Data Management, Project Management, Resource Planning and Management, and Solution Lead roles in the Pharma, Retail, and Energy & Utilities domains, using Informatica PowerCenter, Informatica Data Quality, Informatica MDM, PL/SQL, and Tidal. Key roles and responsibilities are below:
  • Roll out an enterprise-wide data governance framework, focused on improving data quality and protecting sensitive data through changes to organizational behavior: policies and standards, principles, governance metrics, processes, related tools, and data architecture
  • Serve as a liaison between Business and Functional areas and technology to ensure that data-related business requirements for protecting sensitive data are clearly defined, communicated, well understood, and considered as part of operational prioritization and planning
  • Facilitate the development and implementation of data quality standards, data protection standards, and adoption requirements across the enterprise
  • Define key performance indicators (KPIs) and quality metrics, and ensure compliance with data-related policies, standards, roles and responsibilities, and adoption requirements
  • Assist the team, comprising resources from the Business and Functional areas and from IT business and operations functions, in achieving their objectives; resolve issues escalated by Business and Functional area data governance representatives
  • Coordinate with external data sources to eliminate redundancy and streamline data quality services

TECHNICAL SKILLS

Big Data Technologies: Hadoop (HDFS, Hive & Pig; trained), AWS

Programming Languages: SQL, PL/SQL, Core JAVA, Shell Scripting

Development Tools: Toad, Putty, SQL data loader, SOAP UI

Database: Oracle, SQL Server, Teradata

ETL, DQ, MDM Tools: Informatica PowerCenter (Advanced), Informatica Data Quality (Developer), Informatica MDM (Developer)

Scheduling Tools: Tidal, Autosys

ITSM Tools: ServiceNow, Remedy

Servers: Linux, UNIX & Windows

PROFESSIONAL EXPERIENCE

Confidential - Boston

CDW/MDM support/Integration Lead

Responsibilities:

  • Strong consulting and system implementation skills, including requirements/process analysis, design, configuration, testing, training, change management, and production support (ETL, IDQ, data migration, and master data management)
  • Provide application support for 14 applications, including complex ones such as the Customer Master, Commercial Data Warehouse, and Aggspend systems
  • Participate in business user meetings with client IT leads to understand ongoing business challenges and propose the right solutions within the existing application support model
  • Perform impact analysis on data issues and data integration arising from new project deployments, and collaborate with third-party vendors for successful project go-lives
  • Contribute to platform and application software upgrades by performing end-to-end testing and validation with the project team
  • Participate in review meetings to validate support deliverable metrics and KPIs for parameters such as incidents, change requests, problem tickets, service requests, and configuration management
  • Work with multiple vendors on time-bound activities, and perform maintenance on critical production systems on which the entire business relies
  • Technical tools and technologies include Informatica PowerCenter, Informatica Data Quality, Informatica MDM, Oracle PL/SQL, Tidal scheduler, Linux, and SAS
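Support work like the above often relies on small shell checks over batch logs. A minimal sketch of one such check, assuming a generic log format in which failed sessions are marked with the word FAILED (the format and file names are illustrative, not from the actual systems):

```shell
#!/bin/sh
# Count lines that signal a failed session in a batch log.
# The "FAILED" marker is an assumed convention, not taken from the resume.
count_failures() {
    grep -c 'FAILED' "$1"
}

# Decide whether a log warrants opening an incident ticket.
needs_ticket() {
    if [ "$(count_failures "$1")" -gt 0 ]; then
        echo "OPEN_INCIDENT"
    else
        echo "OK"
    fi
}
```

A wrapper like this would typically run from the scheduler after each batch window and feed the result into the ITSM tool.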

Confidential

Data Migration Lead

Responsibilities:

  • Designed, developed, and tested the technical architecture, data quality solution, and batch framework for data migration to SAP using Informatica PowerCenter and Informatica Data Quality
  • Played the lead role and managed the team from offshore
  • Global implementation, rolled out to 46 countries
  • Prepared data migration templates with client team key users
  • Built the data quality solution using the IDQ tool
  • Responsible for custom data migration from the legacy SAP system to the GSI SAP system using Informatica PowerCenter
  • Requirements fitment analysis and gap analysis
  • Effort estimation calculations
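Migration templates prepared with key users are usually validated before load. A minimal sketch of a header check, where the expected field list (SAP material-master-style names) is a hypothetical example, not the actual template layout:

```shell
#!/bin/sh
# Check that a migration template's header row matches the agreed field list.
# EXPECTED_HEADER is an illustrative assumption, not the real SAP template.
EXPECTED_HEADER="MATNR,MAKTX,MEINS,WERKS"

validate_template() {
    header=$(head -n 1 "$1")
    if [ "$header" = "$EXPECTED_HEADER" ]; then
        echo "VALID"
    else
        echo "INVALID: got '$header'"
    fi
}
```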

Confidential

Information Management Sr. Developer

Responsibilities:

  • Review of FSD (Functional Specification Documents), TSD (Technical Specification Documents)
  • High and low-level design document preparation based on the requirement gathering document and review session information
  • ETL workflow creation, modification and testing
  • Code migration and code review for the team members and peers
  • Data Integration enhancement and production support activity.
  • Code migration and testing for platform and software version upgrade
  • UAT support and assistance
  • Post go-live support; primary point of contact (POC)
  • Managed the modules and touch points required to deliver and manage project deliverables within the track
  • Quantum warehouse data support, covering Customer, Sales, Call Planning, Alignment, and Targeting data deliverables to external vendors such as Merkle and Dendrite, and to the internal warehouse team
  • Our client Sanofi-Aventis acquired Genzyme and wanted to integrate its customer and sales data with the existing QUANTUM data warehouse, so we created explicit interfaces to integrate the sales and customer data into the Quantum warehouse
  • Sanofi-Aventis was running a large data warehouse on HP-UX, with Oracle 10g as the database and Informatica 8.5.1 as the ETL tool
  • To handle growing data volumes, improve process performance, and provide high availability, both hardware and software were upgraded (OS to Red Hat Linux, database to Oracle 11g, and Informatica to version 9.1)
  • Our job was mainly to migrate the code to the upgraded platform, then test it and apply code fixes where required, without affecting functional business cases

Confidential

Information Management Developer

Responsibilities:

  • Understand the functionality; develop and enhance PowerCenter mappings and workflows to load data from relational tables into fixed-width/delimited flat files
  • Design and implementation of AUTOSYS (JIL Scripts) jobs
  • ETL Framework and Metadata implementation
  • Involved in data extraction from XML using an Oracle stored procedure and loading into relational tables; also enhanced/modified UNIX shell scripts per process requirements
  • Understand the functionality; develop and enhance mappings and workflows to load data from relational tables into XML files
  • Understand the business requirements and prepare functional test scripts
  • Involved in the execution of ETL interfaces related to data purge during functional test script execution
  • Developed test scripts and performed testing in various phases, such as system testing, SIT, and UAT
  • Documented and managed defects and issues related to testing of the designed solutions; bug fixing and RCCA (Root Cause and Corrective Action)
  • Involved in DP interface execution for loading data into one of the sales tool applications; also worked in the Test/UAT phases for the new DP interface, with defect logging and fixing for the ETL interface
  • Understood the data model for Siebel EIM and base tables; performed data analysis in Siebel EIM and base tables during data issues
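The XML extraction above was done inside Oracle; shell scripts around such a process often do a quick pre-check on the incoming files. A minimal sketch that pulls one element's value with sed, assuming the element and its value sit on a single line (the `custId` tag name is hypothetical):

```shell
#!/bin/sh
# Pull the text of a single <custId> element from an XML extract.
# Assumes element and value are on one line; the tag name is illustrative.
extract_cust_id() {
    sed -n 's/.*<custId>\(.*\)<\/custId>.*/\1/p' "$1"
}
```

A check like this only suits flat, line-oriented extracts; anything more structured belongs in the stored procedure itself.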

Confidential

ETL Developer

Responsibilities:

  • Preparation of data dictionary and data mapping
  • Inbound interface implementation, including:
      • Development and testing of source-to-stage mappings
      • ABAP code generation for SAP sources in mappings
      • ETL framework and metadata implementation
      • STG2CORE loading using a generic procedure
  • Outbound interface implementation, including:
      • 'Cur' view implementation to bring in active records
      • Semantic-to-outbound (EDW staging) mapping implementation
  • Job scheduling implementation
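Outbound extracts like these are commonly sanity-checked before handoff. A minimal sketch that flags rows with the wrong field count; the pipe delimiter and the expected count of 4 are illustrative assumptions, not the actual interface layout:

```shell
#!/bin/sh
# Report how many rows of a pipe-delimited extract have the wrong field count.
# Delimiter ('|') and expected count (4) are assumptions for illustration.
check_row_widths() {
    awk -F'|' -v n=4 'NF != n { bad++ } END { print bad + 0 }' "$1"
}
```

A non-zero result would typically fail the scheduler job before the file is delivered downstream.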
