
Senior Data Management Resume


Livingston, NJ

PROFESSIONAL SUMMARY

  • Results-oriented technology professional specializing in Enterprise Data, Integration and Solution Architecture, Data Modeling, and BI
  • Diverse hands-on experience leading strategic and tactical efforts in Enterprise Data Architecture, integration, road mapping, current-state analysis through reverse engineering, and future-state creation, with modeling experience spanning ER (transactional/3NF, ODS) and dimensional models for warehouses, cubes, the Hadoop ecosystem, data lakes, and BI applications
  • Successful data architecture leadership and delivery track record across the following practices:
  • Enterprise Data Warehouse / Data Marts / Data Hubs / Data Integration / ETL / Virtualization
  • Marketing Data and Campaign Management / Product Information Management / BI / Data Visualization / Metadata Management
  • Big Data / Data Lake / Hadoop HDFS Integration, AWS Cloud

TECHNOLOGY SKILLS

  • AWS Cloud: AWS Architecture Best Practices, AWS Security (IAM, Subnets, Security Groups, Policies), AWS S3, AWS Glacier, AWS Redshift, AWS RDS, AWS EC2, AWS VPC, AWS SQS, AWS SNS, DNS and Route 53
  • Architecture: Data Architecture Design Patterns (Kimball Data Warehouse, OLAP, OLTP), Big Data, Integration Architecture Design Patterns, Domain Design Modeling, Dimensional Modeling, ER Modeling, Design/Development Best Practices (Scalability, Reusability and Efficiency), BI Architecture, Database Optimization
  • Hadoop/Big Data: Hadoop HDFS 1.0, 2.x, Ambari, HUE, Pig, Hive, HQL, HBase, Impala, Sqoop, Flume, NoSQL, Avro, Parquet, Oozie, Cloudera CDH, Hortonworks Sandbox
  • Software: ER/Studio Data Architect 10.0, Ataccama Data Quality, Denodo Data Virtualization, Collibra, OBIEE, Tableau, Informatica PowerCenter (7.1 through 10), PowerExchange, B2B Data Transformation, AutoSys, Control-M, TOAD, SQL Developer, JDBC, ODBC, APIs
  • Database/Files: AWS S3, AWS Redshift, AWS RDS, HBase, NoSQL, Hive, Avro, Parquet, XML, JSON, CSV, TXT, Oracle 9i through 12c, DB2 LUW v7 through v10.5, SQL Server
  • Pgm/Scripting: HQL, Pig, Sqoop, Flume, PL/SQL (Oracle), UNIX scripting, SQL, SQL PL (DB2), T-SQL (MS)
  • ETL: Integration Architecture Design Patterns, Dimensional Modeling, Pipeline Partitions, High Availability/GRID, Session and Workflow log review and debugging, Performance Tuning at Mapping, Session and Workflow levels, Caching, Worklets, Mapplets, Expression functions, Source Qualifier filtering, Query Overrides, XML sources/targets, CDC, SCD Types, Parameter files

PROFESSIONAL EXPERIENCE

Confidential

Senior Data Management

Responsibilities:

  • Provide Data Management and Architecture Consultancy Services in the development of strategic roadmap, data standards, data quality, data security, governance, integration and architectural design of structured and unstructured enterprise assets
  • Deliver high quality Conceptual, Logical and Physical data models through collaboration with business stakeholders and technologists for Enterprise Data Warehouse Redesign
  • Collaborate with client to ensure correct requirement capture and successful outcomes of all assigned efforts
  • Design HDFS directory structure creating directory paths, partitioning and Hive Tables
  • Design of Conceptual and Logical models of Hive tables to facilitate various business subject areas/data
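The HDFS directory design and Hive partitioning described above can be sketched roughly as follows. All paths, table and column names here are hypothetical illustrations, not taken from the engagement.

```python
# Sketch of a date-partitioned HDFS layout and the matching Hive DDL.
# All names (claims_raw, /data/lake/...) are hypothetical examples.

def partition_path(base, table, year, month):
    """Build the HDFS directory path for one partition."""
    return f"{base}/{table}/year={year}/month={month:02d}"

def hive_ddl(table, location):
    """External Hive table whose partitions map onto the directory tree."""
    return (
        f"CREATE EXTERNAL TABLE {table} (claim_id STRING, amount DOUBLE)\n"
        f"PARTITIONED BY (year INT, month INT)\n"
        f"STORED AS PARQUET\n"
        f"LOCATION '{location}'"
    )

path = partition_path("/data/lake", "claims_raw", 2016, 3)
ddl = hive_ddl("claims_raw", "/data/lake/claims_raw")
print(path)  # -> /data/lake/claims_raw/year=2016/month=03
```

Keeping the partition columns in the directory names this way lets Hive prune whole directories at query time instead of scanning the full table.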

Confidential, Livingston, NJ

Enterprise Data and Integration Solution Architect

Responsibilities:

  • Provide Data Management and Architecture leadership in the development of strategic roadmap, data standards, data quality, data security, governance, integration and architectural design of structured and unstructured enterprise assets
  • Strategically aligned with Data Governance Office, Business Consumers, Stakeholders to drive Data Stewardship, Standards, Cross-reference and Meta Data Management
  • Deliver high-quality data models through collaboration with business stakeholders and integration technologists, ensuring early stakeholder involvement, accurate requirements capture, and end-user value and satisfaction
  • Drive and lead Data Architecture solutions within Virtualization, Data Warehouse, Data Hub, Data Mart and Business Intelligence layers within a multi-faceted technology landscape (Oracle, Microsoft, Denodo, Informatica, etc.)
  • Design Corporate SOX Compliance Data and Integration Architecture, ensuring successful capture, audit and reporting of data that conforms to correctness, validity and accuracy requirements, built on efficient data structures

Confidential, New Jersey

Enterprise Data Management

Responsibilities:

  • Provide Data Management and Architecture leadership in the development of strategic roadmap, data standards, data quality, data security, governance, integration and architectural design of structured and unstructured enterprise assets
  • Developed, through collaboration with GAF Product Managers and the Digital Marketing Manager, the Enterprise Marketing Product Hierarchy and GAF Product Master Architecture, with all associated attributes for every GAF-manufactured product, for the newly redesigned GAF corporate website built on SITECORE. This effort enables seamless data relationships across associated products and surfaces suggested products for contractors, consumers and downstream data consumers (mobile apps, etc.)
  • Designed and developed the JSON data structures that deliver data to SITECORE (web) for ingestion into a NoSQL (key/value) database
  • Collaborate with the GAF Data Scientist to deliver high-quality Marketing BI data models (logical/physical) and insights to the marketing department through analysis, design and architecture of third-party vendor data (XML, OLTP systems such as Eloqua, MMC, Oxford Economics and Dodge Market Data) consumed by marketing through Tableau
  • Design and architect data structures and the integration landscape for third-party marketing data (e.g., Eloqua, Dodge) from source through the Enterprise Data Warehouse and Business Intelligence reporting for GAF marketing professionals
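A product-master document of the kind delivered to a key/value store might look like the following sketch. The field names and values are purely illustrative; the actual SITECORE/NoSQL schema is not shown in this resume.

```python
import json

# Hypothetical product-master document keyed by product id. The real
# schema (Sitecore ingestion format) is an assumption, not the source's.
product = {
    "productId": "SHNGL-001",
    "name": "Architectural Shingle",
    "hierarchy": ["Roofing", "Shingles", "Architectural"],
    "suggestedProducts": ["UNDRL-002", "VENT-003"],
}

key = product["productId"]      # key/value store key
value = json.dumps(product)     # serialized JSON value
restored = json.loads(value)    # round-trip check
```

Modeling the hierarchy and suggested-product links inside one document is what lets downstream consumers (web, mobile apps) resolve related products with a single key lookup.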

Confidential

Solution Architect-Enterprise Information Management

Responsibilities:

  • Responsible for closing $1.8 million financial gap, entailing expired authorizations and non-payment from Insurance Payers. Proactive authorization data capture and reporting allowed business to identify and act upon near future expirations to initiate timely authorization requests
  • Partner with Revenue Cycle and Managed Care Business Unit Directors and team members to effectively understand, plan and act on their challenges, desires and needs concerning enterprise data, metrics and use cases
  • Assigned as Primary IT resource leading a team of data management professionals, defining technology roadmap and conversion from current architecture to BI centric data architecture (Dimensional Modeling, Integration, Scheduling) and reporting (OBIEE)
  • Effectively perform Data Profiling and Data Quality exercises to identify enterprise data assets, anomalies, conformity gaps as well as solutions to various business questions.
  • Create conceptual, logical and physical (dimensional) data models, ensuring core dimension table reusability across the various business unit Data Marts while providing each business unit with distinct fact tables carrying its valued metrics and measures

Confidential

Big Data Integration Solution Architect/Engineer

Responsibilities:

  • Architected, Designed, Developed and Implemented Big Data Architecture and Integration Applications, creating a HealthCare Data Lake leveraged within the Hadoop ecosystem along with full MDM and Analytics Solution (GreenPlum and Tableau) for Member, Provider and Pharmacy Claims data
  • Architected HDFS storage, data directory structures, and Hive and HBase schema design for Landing, Raw and Refined zone tables
  • Architected the MDM outbound data layer consumption process (XML) to load the Healthcare Data Lake within the Hadoop ecosystem (Hive tables: Landing, Raw and Refined zones)
  • Created ETL mappings to extract from Healthcare Data Lake (Hadoop) to bulk load GREENPLUM database utilizing GP LOADER
  • Perform RDBMS data import and extraction building and executing Sqoop scripts, to and from HDFS
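An RDBMS-to-HDFS import of the kind described can be sketched as a Sqoop invocation. The connection string, table name and target directory below are placeholders, and the command is only assembled, not executed.

```python
# Assemble a hypothetical Sqoop import command line; nothing here is
# the engagement's actual connection info.
def sqoop_import(jdbc_url, table, target_dir, mappers=4):
    """Build the argument list for a parallel Sqoop import into HDFS."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,        # source RDBMS JDBC URL
        "--table", table,             # source table to pull
        "--target-dir", target_dir,   # HDFS landing directory
        "--num-mappers", str(mappers),  # degree of import parallelism
        "--as-parquetfile",           # write Parquet for Hive consumption
    ]

cmd = sqoop_import("jdbc:oracle:thin:@db:1521/ORCL", "CLAIMS",
                   "/data/lake/claims_raw")
```

In practice the list would be handed to a scheduler or shell; landing the files under the Hive-partitioned directory tree keeps the imported data immediately queryable.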

Confidential, Roseland, NJ

Data Integration Solution Architect/Engineer

Responsibilities:

  • Architected and modeled Star Schema Data Marts with metric-based fact tables and various SCD-type dimensions
  • Led a team of Integration Engineers to architect, design, develop and deliver over 100 Informatica ETL workflows (History, CDC, SCD), DB2 stage, dimension and fact tables, metric-based fact-load stored procedures, views, functions and associated Unix scripts, ensuring development best practices, scalability, reusability and optimal performance for all objects
  • Provide expert-level consultation and documentation of ETL standards, Informatica PowerCenter tool best practices concerning various transformations, mapping designs, sessions and workflows.
  • Effectively collaborate with COGNOS Reporting Development Team to address data quality, process burdens, bottlenecks and performance issues concerning data retrieval
  • Increased project role responsibilities by redesigning, developing and optimizing DB2 views used by COGNOS reports to improve undesirable view runtimes (e.g., query tuning, use of Global Temporary tables, pre-staging view data)
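The SCD-type dimension loads mentioned above follow a standard pattern; a minimal Type 2 sketch (expire the current row, append a new version) is shown below. The table and column names are illustrative, not the client's.

```python
from datetime import date

# Minimal in-memory SCD Type 2 sketch: each dimension row carries
# effective/end dates plus a current flag. All names are hypothetical.
def scd2_apply(rows, key, new_attrs, as_of):
    """Expire the current row for `key` (if its attributes changed)
    and append a new current version effective `as_of`."""
    for row in rows:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return rows              # no change: nothing to do
            row["current"] = False
            row["end_date"] = as_of      # close out the old version
    rows.append({"key": key, "attrs": new_attrs,
                 "start_date": as_of, "end_date": None, "current": True})
    return rows

dim = [{"key": 1, "attrs": {"city": "Roseland"},
        "start_date": date(2010, 1, 1), "end_date": None, "current": True}]
dim = scd2_apply(dim, 1, {"city": "Hackensack"}, date(2012, 6, 1))
```

In a warehouse this same logic runs as an ETL workflow against the dimension table; fact rows then join to whichever dimension version was current on the transaction date.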

Confidential

Data Architect

Responsibilities:

  • Provided Data Architecture for the Credit Risk group to create new business unit Data Marts (Star Schema), freeing business analysts from spreadsheet-based workflows
  • Designed and modeled the Anti-Money Laundering data model to conform to AML data usage/use cases established by AML vendors and Business Unit staff. Worked closely on requirements gathering as well as metadata repository creation
  • Successfully architected, modeled and developed initial project customer based Denodo views sourcing from four disparate source systems, thus creating a unified Enterprise customer view

Confidential, New York, NY

Senior Informatica ETL Developer Lead

Responsibilities:

  • Senior Informatica ETL Development Lead for Financial Capital Markets, Primary Brokerage and Derivatives organization’s mission critical BI Data Warehouse project utilizing Informatica Power Center 8.6.1 ensuring peak performance and optimization of ETL’s and Oracle database objects
  • Re-engineered and Delivered an existing portfolio of Informatica mappings and workflows, which were in need of process corrections and optimization. Enhancements entailed appropriate source filtering, joiner, lookup transformation corrections ensuring efficient use of applicable transformations thus removing session revealed bottlenecks
  • Provide expert-level consultation and documentation of ETL standards, Informatica PowerCenter tool best practices concerning various transformations, mapping designs, sessions and workflows.
  • Effectively support daily and weekly trade/activity/fails reporting to the Federal Reserve Bank of NY, generated by Informatica mapping and workflow loads into an Oracle database and presented via OBIEE

Confidential, Hackensack, NJ

Senior Informatica ETL Developer

Responsibilities:

  • Senior Informatica ETL Developer successfully delivering quality solutions for S&P Data Warehouse/BI through client facing requirements gathering, design, development, testing and support of numerous ETL deliverables utilizing Informatica Power Center 8.6.1, Unix scripts and PL/SQL procedures, packages and functions
  • Provide expert-level consultation by optimizing existing Informatica mapping and workflows, create documentation of ETL standards, Informatica PowerCenter tool best practices concerning various transformations, mapping designs, sessions and workflows.

Confidential, Albany, NY

Senior Technical Engineer/ Analyst

Responsibilities:

  • Senior Informatica ETL Developer for NYSDTF Taxation System Data Warehouse
  • Design, code, test and implementation of Informatica Mappings, Mapplets, Workflows utilizing Power Center 8.6
  • Senior Informatica ETL Developer for an organization-wide mission-critical Transformation project, entailing EDS (OLTP), Data Mart and Data Warehouse (OLAP) ETL creation, implementation, maintenance and support
  • Assigned responsibility for high quality software solutions, which entails design, code, test and implementation of Informatica Mappings, Mapplets, Worklets and Workflows utilizing Power Center (v. 7.1 thru 8.6 utilized) retiring legacy systems (i.e. claims, provider payments)
  • Recipient of 2008 Top IT Performer Award resulting from creating and managing high quality levels of software applications, above and beyond efforts and service levels, team building, performance, dedication and mentoring

Confidential, Albany, NY

Big Data Experience

Responsibilities:

  • Experienced with Pig interactive and script modes within the Grunt shell for development and testing, as well as authoring Pig scripts (LOAD, STORE, DUMP, parameterization, processing relations, maps, tuples) and Grunt shell and file system commands
  • Experienced in architecting, developing and navigating the Hive environment, creating Hive tables and writing HQL to perform query analysis of HDFS files within the ecosystem
  • Experienced in utilizing and architecting NoSQL databases (key/value, document (RavenDB) and Big Table)
