
Data Management Lead Resume

Cary, NC

SUMMARY:

  • 15+ years of consulting experience as a Data Architect, driving work streams focused on (but not limited to) data standards, data integration, data architecture, data quality management, metadata and data semantics, data loading, data analytics for BI, and data governance.
  • Well versed in the end-to-end data management cycle, from data governance, data quality, and master data management through the data integration and reporting layers.
  • Significant business knowledge in banking, manufacturing, healthcare and the service industry.
  • Experience in developing strategies for Extraction, Transformation and Loading (ETL) using Informatica PowerCenter, Confidential Information Server (DataStage), and MS SSIS.
  • Good knowledge of big data ecosystems, tools, and techniques.
  • Experience in migrating data from legacy systems onto data lake platforms.
  • Responsible for maintaining the overall data platform, working with administrators on patches and maintenance cycles.
  • Work with the delivery product manager to update user stories for the data services area and keep JIRA current for sprints.
  • Extensive experience with information management product suites spanning data governance, data quality, data integration, and reporting.
  • Strong knowledge of ETL methodology supporting data extraction, transformation, and loading (a minimal sketch follows this list).
  • 10+ years of experience with information management tools such as InfoSphere Data Architect, Information Analyzer, QualityStage, Information Server (DataStage), Business Glossary, Metadata Workbench, and Informatica PowerCenter.
  • Extensive experience as a solution architect for business information systems, focusing on database architecture, data modeling, data analysis, software design, programming and application integration.
  • Strong hands-on knowledge of application development, DW platforms, and databases (Oracle 10g, Teradata, MS SQL, Sybase, dBase): data warehousing, application software SDLC, and business reporting.
  • In-depth knowledge of large database design techniques; experience in data analysis, data cleansing, and data transformation and migration using ETL tools.
  • Experience in configuration management: identifying and controlling product releases and changes, ensuring completeness and consistency across components, managing build processes and tools, and ensuring every defect has traceability back to its source.
  • Extensive experience in writing SQL scripts and Unix scripts.
  • Expertise in data warehousing and data migration.
  • Proficient in the collation and analysis of business requirements, feasibility studies, estimation, development, implementation, and testing of deliverables.
  • Optimize resource utilization and manage day-to-day project activities in compliance with delivery schedules.
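
A minimal sketch of the extract/transform/load pattern referenced in the summary above, written in Python. The source file, column names, and target table are hypothetical placeholders; the actual engagements used Informatica PowerCenter, DataStage, and MS SSIS against Oracle, Teradata, and SQL Server rather than the SQLite stand-in below.

    # etl_sketch.py - minimal extract/transform/load pattern (hypothetical example).
    import csv
    import sqlite3

    def extract(path):
        # Read raw rows from the source extract file.
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Apply simple cleansing rules: reject rows missing the business key,
        # trim text, and standardize codes.
        cleaned = []
        for r in rows:
            if not r.get("policy_id"):
                continue
            cleaned.append({
                "policy_id": r["policy_id"].strip(),
                "premium": round(float(r.get("premium") or 0), 2),
                "state": (r.get("state") or "").strip().upper(),
            })
        return cleaned

    def load(rows, conn):
        # Load cleansed rows into a staging table in the target.
        conn.execute("CREATE TABLE IF NOT EXISTS stg_policy "
                     "(policy_id TEXT, premium REAL, state TEXT)")
        conn.executemany(
            "INSERT INTO stg_policy (policy_id, premium, state) "
            "VALUES (:policy_id, :premium, :state)", rows)
        conn.commit()

    if __name__ == "__main__":
        conn = sqlite3.connect("warehouse.db")
        load(transform(extract("policies.csv")), conn)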

SKILLS INCLUDE:

  • Data Governance / Data Quality
  • ETL / Data Integration
  • Data Modelling
  • Data Migration
  • Reporting & Analytics
  • Test Data Management
  • Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, HBase, ZooKeeper, Spark, JIRA
  • QlikView, Tableau, Cognos
  • DataStage, Informatica, MS SSIS
  • Confidential Optim, DataMaker
  • Agile Methodology
  • Cross Functional Team Coordination
  • COBOL, C, PL/SQL, XML, Java, Unix, Python, Perl
  • Oracle, SQL Server, Teradata, DB2
  • Information Analyzer, QualityStage

EXPERIENCE:

Confidential, Cary, NC

Data Management Lead

Responsibilities:

  • Responsible for ingesting data into Hadoop environments from admin systems and core finance systems.
  • Validate data from source systems through the landing zone, the Hadoop cluster, and the Oracle data marts before it is consumed by the analytics layer (see the reconciliation sketch after this list).
  • Work with AD and QA teams to validate transformation rules and migrate the data.
  • Work with data quality teams before pushing data into the staging area and the analytics platform.
  • Collaborate with the reporting team to ensure the required data is available in the Oracle data marts for QlikView and Tableau reports.
  • Responsible for integration efforts for the General American Life Insurance Company (GALIC) merger into the Confidential application landscape.
  • Worked with business, IT, and program management to derive the data flows from GALIC applications to Confidential to facilitate the data integration.
  • Handled a complex application landscape of 120+ applications spanning retail, data warehouses, compensation, and more.
  • Worked on data migration using Informatica from admin systems to the data warehouse used by Confidential applications.
  • Managed a team of onsite and offshore resources for this effort and served in the program command center, working with other teams during the critical migration.
  • Drive tasks from the project plan and update stakeholders on progress and on best practices for the data migration.
  • Work with the key functional teams responsible for the business flow and handle dependencies on environments and infrastructure.
  • Working on a key data management solution using Informatica, Confidential Optim, and iDAq to automate the data provisioning and loading process for application development teams.
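
A hypothetical sketch of the hop-by-hop validation described above (source system to landing zone to Hadoop cluster to Oracle data marts): compare a row count and a numeric checksum at each hop before the analytics layer consumes the data. Connection targets and table names are placeholders, and SQLite stands in for the Oracle and Hive connections used in practice.

    # reconcile_counts.py - compare a row count and a SUM "checksum" between
    # corresponding tables at two hops of the pipeline (hypothetical names).
    import sqlite3

    def profile(conn, table, amount_col):
        # One round trip per table: row count plus a sum over a key measure.
        cur = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        count, total = cur.fetchone()
        return count, round(total, 2)

    def reconcile(source, target, table_pairs):
        # Return the (source, target) table pairs whose profiles disagree.
        return [
            (src, tgt)
            for src, tgt, col in table_pairs
            if profile(source, src, col) != profile(target, tgt, col)
        ]

    if __name__ == "__main__":
        src = sqlite3.connect("landing.db")   # placeholder for the landing zone
        tgt = sqlite3.connect("mart.db")      # placeholder for the Oracle mart
        failures = reconcile(src, tgt, [("policies", "stg_policy", "premium")])
        print("reconciliation failures:", failures or "none")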

Confidential, Greensboro, NC

Data Architect /ETL Architect

Responsibilities:

  • Performed data ingestion into the data lake from various sources.
  • Worked with enterprise architects in planning application integration between Confidential and Confidential.
  • Responsible for overall data integration efforts, including enhancements to the current Confidential EDW design.
  • Delivered a sales data mart for daily sales reporting by integrating East data into West systems.
  • Delivered the primary premiums fact and the originations dimension on the consolidated data.
  • Planned and enhanced the data warehouse design to accommodate the consolidated design.
  • Responsible for data integration efforts from source systems into the EDW.
  • Delivered the ETL solution for HR as part of the separation from Confidential and the integration with Confidential.
  • Led a large data migration effort to move from legacy to next-generation systems.
  • Worked with the Hortonworks team to set up the Hadoop cluster and ingested data from legacy systems into the data lake (a Sqoop-based ingestion sketch follows this list).
  • Collaborated with the solution delivery team to implement a solution in Information Server that picks up source files from legacy systems and produces daily, monthly, and quarterly reports for business consumption.
  • Collaborated with the data governance team to build a data quality dashboard for the business, based on key indicators for the business areas across the completeness, conformity, validity, and accuracy dimensions (see the metrics sketch after this list).
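
A sketch of the legacy-to-data-lake ingestion mentioned above, shelling out to Sqoop on a Hortonworks cluster from Python. The sqoop import flags are standard CLI options, but the JDBC URL, table, HDFS path, and check column are placeholders, and credential handling is omitted.

    # ingest_to_lake.py - wrap an incremental Sqoop import (hypothetical names).
    import subprocess

    def sqoop_import(jdbc_url, table, target_dir, check_column, last_value):
        cmd = [
            "sqoop", "import",
            "--connect", jdbc_url,
            "--table", table,
            "--target-dir", target_dir,
            "--incremental", "append",       # only pull rows newer than last run
            "--check-column", check_column,
            "--last-value", str(last_value),
            "--num-mappers", "4",
        ]
        subprocess.run(cmd, check=True)      # raises if the import fails

    if __name__ == "__main__":
        sqoop_import(
            "jdbc:oracle:thin:@dbhost:1521/ORCL",   # placeholder source system
            "POLICY_TXN",
            "/data/lake/raw/policy_txn",
            "TXN_ID",
            0,
        )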
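
A sketch of the kind of key-indicator metrics that can sit behind a data quality dashboard like the one above, scoring the completeness, conformity, validity, and accuracy dimensions. The rules and field names are illustrative, not the actual business rules.

    # dq_dimensions.py - score one record set on four DQ dimensions.
    import re

    VALID_STATES = {"NC", "NY", "CA"}   # placeholder reference data

    def dq_metrics(rows):
        n = len(rows) or 1
        complete = sum(1 for r in rows
                       if r.get("policy_id") and r.get("premium") is not None)
        conform = sum(1 for r in rows
                      if re.fullmatch(r"P\d{6}", r.get("policy_id") or ""))
        valid = sum(1 for r in rows if r.get("state") in VALID_STATES)
        accurate = sum(1 for r in rows
                       if isinstance(r.get("premium"), (int, float))
                       and r["premium"] >= 0)
        return {"completeness": complete / n, "conformity": conform / n,
                "validity": valid / n, "accuracy": accurate / n}

    if __name__ == "__main__":
        sample = [
            {"policy_id": "P123456", "premium": 250.0, "state": "NC"},
            {"policy_id": "", "premium": None, "state": "ZZ"},
        ]
        print(dq_metrics(sample))   # each dimension scores 0.5 on this sample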

Confidential

Engineering Manager/Technical Architect

Responsibilities:

  • Wrote complex queries to facilitate the supply of data to other teams.
  • Liaised with business subject matter experts to analyze business requirements (productivity-related KPIs) and understand the various data sources.
  • Architected the productivity data mart; reviewed and maintained the schema, its tables, indexes, views, and PL/SQL procedures in Oracle 10g.
  • Developed strategies for use in a high-volume, high-performance heterogeneous environment.
  • Mapped source data elements from systems such as Confidential, CMS (Customer Management System), outbound dialers, SBR, and IVR to CCPM, and developed, tested, and supported extraction, transformation, and load processes.
  • Analyzed source data related to workforce, call stats/productivity, contact, contact quality, training, hierarchy final, and coaching; applied business rules to the data in the raw data mart and loaded it into the reporting data repository.
  • Defined and captured metadata and rules associated with ETL processes in Information Server.
  • Designed and developed technical/functional specifications for ETL development and implemented them using Information Server; analyzed workflow dependencies for the various ETL processes, handled exceptions, and maintained logs.
  • Managed artifact versions, including software code, documents, design models, and the directory structure itself.
  • Liaised with business subject matter experts to analyze business requirements and translate them into detailed conceptual, process, logical, and physical models and schema development in the database.
  • Modeled and architected the DW design; directed and/or executed the technical characteristics of the overall data warehouse strategy and ETL process in Information Server.
  • Architected the database schema and implemented the dimensional model (star schema); reviewed and maintained the schema, its tables, indexes, views, and PL/SQL procedures in Oracle 10g.
  • Mapped source system data elements to the target system and developed, tested, and supported extraction, transformation, and load processes.
  • Developed DataStage mappings and created sessions and workflows to load and process received data according to business rules.
  • Designed simple and complex data flows for incremental loads across different ETL interfaces (see the incremental-load sketch after this list).
  • Used multiple stages, including SAP R/3 Packs, Sequential File, Pivot, Transformer, Aggregator, Join, Lookup, Sort, and Filter, during ETL development.
  • Resolved performance issues while extracting and loading data to and from GWM SAP using the ABAP R/3 and BAPI R/3 stages.
  • Debugged and resolved loading failures by verifying the log files.
  • Created and maintained sequencer jobs.
  • Performed data cleansing in key processes.
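
A sketch of the incremental-load pattern referenced above: split incoming rows into inserts and updates by comparing business keys against what is already in the target, skipping unchanged rows. The key and row shapes are hypothetical; in practice this logic lived in DataStage job designs.

    # incremental_load.py - classify incoming rows for an incremental load.
    def split_incremental(incoming, existing):
        """incoming/existing: dicts mapping business key -> row attributes."""
        inserts, updates = [], []
        for key, row in incoming.items():
            if key not in existing:
                inserts.append(row)          # brand-new business key
            elif row != existing[key]:
                updates.append(row)          # changed attributes
            # unchanged rows are skipped entirely
        return inserts, updates

    if __name__ == "__main__":
        existing = {"P1": {"premium": 100}, "P2": {"premium": 200}}
        incoming = {"P1": {"premium": 100},      # unchanged -> skipped
                    "P2": {"premium": 250},      # changed   -> update
                    "P3": {"premium": 300}}      # new       -> insert
        ins, upd = split_incremental(incoming, existing)
        print(len(ins), "insert(s),", len(upd), "update(s)")  # 1 insert(s), 1 update(s)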

Confidential

Engineering Manager

Responsibilities:

  • Work with WW program leaders to finalize the roadmap for product fix packs and execute releases according to plan.
  • Manage WW teams for each fix pack release to ensure all testing is performed.
  • Interact with customers to understand and resolve issues, addressing them with fix packs or rollup patches.
  • Work with all FVT teams to collate product feedback and improve product quality.
  • Manage functional verification testing for the connectivity components of Information Server.
  • Define and implement a test automation strategy for various components of Information Server (a sketch follows this list).
  • Drove down key customer pain points, reducing turnaround time for customer issues by 30%, and established a fix pack schedule that helps customers proactively.
  • Configured the InfoSphere Information Server tools.
  • Validated the InfoSphere Information Server tools by designing and configuring jobs, reports, column analysis, features, functionality, etc.
  • Directed and/or executed the technical characteristics of the overall data warehouse strategy and ETL process.
  • Worked closely with IT and the business group to understand business reporting requirements, analyze the logical model, and develop subject matter expertise in a short time.
  • Facilitated Joint Application Development (JAD) sessions to gather requirements for building the data warehouse and data marts.
  • Participated in the development and execution of tactics and strategies to optimize data quality in the data warehouse and OLAP environments.
  • Led the application development team in providing support and maintenance of data marts and data warehouses; maintained the integrity of data movement into the data warehouse and other environments.
  • Developed DataStage mappings and sessions based on user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
  • Mapped source system data elements to target systems and developed, tested, and supported extraction, transformation, and load processes, defining mappings, sessions, and workflows.
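
A sketch of the test automation idea mentioned above: an automated smoke check run against each fix pack build, written here as a pytest-style test over a build manifest. The component names and manifest format are placeholders, not the real FVT suite.

    # test_fixpack_smoke.py - run with pytest against each fix pack build.
    EXPECTED_COMPONENTS = {"datastage-engine", "information-analyzer",
                           "metadata-repository"}   # placeholder component list

    def installed_components(manifest_lines):
        # Parse "name=version" lines from a hypothetical build manifest.
        return {line.split("=", 1)[0].strip()
                for line in manifest_lines if "=" in line}

    def test_all_components_present():
        manifest = ["datastage-engine=11.7.1",
                    "information-analyzer=11.7.1",
                    "metadata-repository=11.7.1"]
        assert EXPECTED_COMPONENTS <= installed_components(manifest)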
