Senior Data Management Resume
SUMMARY
- Results-oriented technology professional specializing in Enterprise Data, Integration and Solution Architecture, Data Modeling and BI
- Diverse hands-on experience leading Strategic and Tactical efforts in Enterprise Data Architecture, Integration, Road Mapping, Current-State Analysis through Reverse Engineering, and Future-State Creation, with modeling experience spanning ER (Transactional (3NF), ODS) and Dimensional Models for Warehouses, Cubes, the Hadoop Ecosystem, Data Lakes and BI applications
TECHNICAL SKILLS
AWS Cloud: AWS Architecture Best Practices, AWS Security (IAM, Subnets, Security Groups, Policies), AWS S3, AWS Glacier, AWS Redshift, AWS RDS, AWS EC2, AWS VPC, AWS SQS, AWS SNS, DNS and Route 53
Architecture: Data Architecture Design Patterns (Kimball Data Warehouse, OLAP, OLTP), Big Data, Integration Architecture Design Patterns, Domain Design Modeling, Dimensional Modeling, ER Modeling, Design/Development Best Practices (Scalability, Reusability and Efficiency), BI Architecture, Database Optimization
Hadoop/Big Data: Hadoop HDFS 1.0, 2.*, Ambari, HUE, Pig, Hive, HQL, HBase, Impala, Sqoop, Flume, NoSQL, Avro, Parquet, Oozie, Cloudera CDH, Hortonworks Sandbox
Software: ER/Studio Data Architect 10.0, Ataccama Data Quality, Denodo Data Virtualization, Collibra, OBIEE, Tableau, Informatica PowerCenter (7.1 through 10), PowerExchange, B2B Data Transformation, AutoSys, Control-M, TOAD, SQL Developer, JDBC, ODBC, APIs
Database/Files: AWS S3, AWS Redshift, AWS RDS, HBase, NoSQL, Hive, Avro, Parquet, XML, JSON, CSV, TXT, Oracle 9i through 12c, DB2 LUW v7 through v10.5, SQL Server
Pgm/Scripting: HQL, Pig, Sqoop, Flume, PL/SQL (Oracle), UNIX Shell Scripting, SQL, SQL PL (DB2), T-SQL (MS)
ETL Integration: Architecture Design Patterns, Dimensional Modeling, Pipeline Partitions, High Availability/Grid, Session and Workflow log review and debugging, Performance Tuning at the Mapping, Session and Workflow levels, Caching, Worklets, Mapplets, Expression functions, Source Qualifier filtering, Query Overrides, XML sources/targets, CDC, SCD Types, Parameter files
PROFESSIONAL EXPERIENCE
Confidential
Senior Data Management
Responsibilities:
- Provide Data Management and Architecture consulting services in the development of strategic roadmaps, data standards, data quality, data security, governance, integration and architectural design of structured and unstructured enterprise assets
- Deliver high quality Conceptual, Logical and Physical data models through collaboration with business stakeholders and technologists for Enterprise Data Warehouse Redesign
- Collaborate with client to ensure correct requirement capture and successful outcomes of all assigned efforts
Confidential
Data Architect Consultant
Responsibilities:
- Design HDFS directory structure creating directory paths, partitioning and Hive Tables
- Design of Conceptual and Logical models of Hive tables to facilitate various business subject areas/data
- Collaborate with client to ensure correct requirement capture and successful outcomes of all assigned efforts
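The Hive/HDFS design work above can be illustrated with a minimal HiveQL sketch; all table, column and path names here are hypothetical, chosen only to show the pattern of an external table laid over a partitioned HDFS directory path:

```sql
-- Illustrative only: an external Hive table over a business-subject
-- directory, partitioned by load date so each day lands in its own path
CREATE EXTERNAL TABLE orders_raw (
    order_id   STRING,
    account_id STRING,
    amount     DECIMAL(12,2)
)
PARTITIONED BY (load_dt STRING)
STORED AS PARQUET
LOCATION '/data/raw/orders';

-- Register a daily partition against its HDFS directory path
ALTER TABLE orders_raw ADD IF NOT EXISTS
    PARTITION (load_dt = '2016-01-15')
    LOCATION '/data/raw/orders/load_dt=2016-01-15';
```

Partitioning by load date keeps each ingest cycle in a separate directory, so queries filtered on `load_dt` prune to a single path rather than scanning the full data set.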
Confidential, Livingston, NJ
Enterprise Data and Integration Solution Architect
Responsibilities:
- Provide Data Management and Architecture leadership in the development of strategic roadmaps, data standards, data quality, data security, governance, integration and architectural design of structured and unstructured enterprise assets
- Align strategically with the Data Governance Office, business consumers and stakeholders to drive Data Stewardship, Standards, Cross-Reference and Metadata Management
- Deliver high-quality data models through collaboration with business stakeholders and integration technologists, ensuring early stakeholder involvement, accurate requirements capture, and end-user value and satisfaction
- Drive and lead Data Architecture solutions within Virtualization, Data Warehouse, Data Hub, Data Mart and Business Intelligence layers in a multi-faceted technology landscape (Oracle, Microsoft, Denodo, Informatica, etc.)
- Design corporate SOX-compliance Data and Integration Architecture, ensuring successful capture, audit and reporting of data that conforms to correctness, validity and accuracy requirements, supported by efficient data structures
Confidential - Parsippany, New Jersey
Enterprise Data & Solution Architect
Responsibilities:
- Provide Data Management and Architecture leadership in the development of strategic roadmaps, data standards, data quality, data security, governance, integration and architectural design of structured and unstructured enterprise assets
- Developed, in collaboration with Confidential Product Managers and the Digital Marketing Manager, the Enterprise Marketing Product Hierarchy and Confidential Product Master Architecture, with all associated attributes for all Confidential-manufactured products, for the newly redesigned Confidential corporate website built on Sitecore. This effort enables seamless data relationships across all associated products and surfaces suggested related products for contractors, consumers and downstream data consumers (mobile apps, etc.)
- Designed and developed the JSON data structures that deliver data to Sitecore (web) for ingestion into a NoSQL (key/value) database
- Collaborate with Confidential Data Scientists to deliver high-quality Marketing BI data models (Logical/Physical) and insights to the marketing department through analysis, design and architecture of 3rd-party vendor data (XML, OLTP systems such as Eloqua, MMC, Oxford Economics and Dodge Market Data) consumed by marketing through Tableau
- Design and architect data structures and the integration landscape for 3rd-party marketing data (e.g., Eloqua, Dodge) from source through the Enterprise Data Warehouse and Business Intelligence reporting for Confidential marketing professionals
Confidential
Enterprise Data & Solution Architect
Responsibilities:
- Responsible for closing a $1.8 million financial gap caused by expired authorizations and non-payment from insurance payers; proactive authorization data capture and reporting allowed the business to identify upcoming expirations and initiate timely authorization requests
- Partner with Revenue Cycle and Managed Care business unit directors and team members to effectively understand, plan and act on their challenges and needs concerning enterprise data, metrics and use cases
- Assigned as Primary IT resource leading a team of data management professionals, defining technology roadmap and conversion from current architecture to BI centric data architecture (Dimensional Modeling, Integration, Scheduling) and reporting (OBIEE)
- Effectively perform Data Profiling and Data Quality exercises to identify enterprise data assets, anomalies, conformity gaps as well as solutions to various business questions.
- Create conceptual, logical and physical (dimensional) data models, ensuring core dimension table reusability across business unit Data Marts while providing distinct business unit fact tables carrying valued metrics and measures
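The conformed-dimension pattern described above can be sketched in generic SQL; every name here is hypothetical, meant only to show a shared dimension reused across marts alongside a business-unit-specific fact table with its metrics:

```sql
-- Illustrative star-schema sketch (names hypothetical):
-- a conformed date dimension shared by all business unit marts
CREATE TABLE dim_date (
    date_key     INTEGER     PRIMARY KEY,  -- surrogate key, e.g. 20160115
    calendar_dt  DATE        NOT NULL,
    fiscal_month VARCHAR(7)  NOT NULL
);

-- a distinct fact table for one business unit, keyed to the
-- shared dimension and carrying that unit's metrics
CREATE TABLE fact_authorization (
    date_key       INTEGER NOT NULL REFERENCES dim_date (date_key),
    payer_key      INTEGER NOT NULL,  -- FK to a payer dimension
    auth_count     INTEGER NOT NULL,  -- metric: authorizations issued
    days_to_expiry INTEGER            -- metric: flags near-term expirations
);
```

Because `dim_date` is conformed, every mart's facts can be joined and compared on the same keys, while each fact table stays specific to its business unit.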
Confidential
Big Data Integration Solution Architect/Engineer
Responsibilities:
- Architected, designed, developed and implemented Big Data architecture and integration applications, creating a Healthcare Data Lake within the Hadoop ecosystem along with a full MDM and analytics solution (Greenplum and Tableau) for Member, Provider and Pharmacy Claims data
- Architected HDFS storage, data directory structures, and Hive and HBase schema design for Landing, Raw and Refined zone tables
- Architected the MDM outbound data-layer consumption process (XML) to load the Healthcare Data Lake within the Hadoop ecosystem (Hive Landing, Raw and Refined zone tables)
- Created ETL mappings to extract from the Healthcare Data Lake (Hadoop) and bulk-load a Greenplum database utilizing the Greenplum loader (gpload)
- Perform RDBMS data import and export to and from HDFS by building and executing Sqoop scripts
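The landing-to-refined zone promotion described above can be sketched in HiveQL; all table, column and path names are hypothetical, shown only to illustrate promoting delimited landing data into a partitioned Parquet refined table:

```sql
-- Illustrative only: delimited text lands as-is in the landing zone
CREATE EXTERNAL TABLE member_landing (
    member_id STRING,
    dob       STRING,
    plan_cd   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/lake/landing/member';

-- refined zone: typed, partitioned, columnar storage
CREATE TABLE member_refined (
    member_id STRING,
    dob       DATE,
    plan_cd   STRING
)
PARTITIONED BY (load_dt STRING)
STORED AS PARQUET;

-- promote one load cycle, conforming types on the way in
INSERT OVERWRITE TABLE member_refined PARTITION (load_dt = '2016-01-15')
SELECT member_id,
       CAST(dob AS DATE),  -- string-to-DATE cast (Hive 0.12+)
       plan_cd
FROM member_landing;
```

The landing table stays a faithful copy of the source feed, while the refined table carries the typed, query-optimized form consumed downstream.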
Confidential
Data Integration Solution Architect/Engineer
Responsibilities:
- Architected and modeled Star Schema Data Marts with metric-based fact tables consisting of various SCD-type dimensions
- Architected, designed, developed and delivered, leading a team of integration engineers, 100+ Informatica ETL workflows (History, CDC, SCD), DB2 stage, dimension and fact tables, metric-based fact-load stored procedures, views, functions and associated Unix scripts, ensuring development best practices, scalability, reusability and optimal performance for all objects
- Provide expert-level consultation and documentation of ETL standards, Informatica PowerCenter tool best practices concerning various transformations, mapping designs, sessions and workflows.
- Effectively collaborate with COGNOS Reporting Development Team to address data quality, process burdens, bottlenecks and performance issues concerning data retrieval
- Increased project role responsibilities by redesigning, developing and optimizing DB2 views used by COGNOS reports, improving undesirable view runtimes (e.g., query tuning, use of Global Temporary tables, pre-staging view data)
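The global-temporary-table pre-staging pattern mentioned above can be sketched in DB2 SQL; all object names are hypothetical, illustrating how an expensive aggregate is materialized once per session so report queries read staged rows instead of re-running the aggregation:

```sql
-- Illustrative DB2 sketch: a declared global temporary table holding
-- a pre-staged aggregate for the duration of the reporting session
DECLARE GLOBAL TEMPORARY TABLE session.fact_summary (
    account_id INTEGER,
    period_end DATE,
    total_amt  DECIMAL(15,2)
) ON COMMIT PRESERVE ROWS NOT LOGGED;

-- run the expensive aggregation once, into the temp table
INSERT INTO session.fact_summary (account_id, period_end, total_amt)
SELECT account_id, period_end, SUM(amount)
FROM   stage.transactions
GROUP BY account_id, period_end;

-- subsequent report queries hit the small staged result set
SELECT account_id, total_amt
FROM   session.fact_summary
WHERE  period_end = CURRENT DATE;
```

Because the temporary table is session-scoped and not logged, repeated report drill-downs avoid both the aggregation cost and logging overhead.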
Confidential, Livingston, NJ
Asst. VP- IT Corporate Technologies Data Architect
Responsibilities:
- Provided Data Architecture for the Credit Risk group, creating new business unit Data Marts (Star Schema) that moved business analysts off spreadsheet-based analysis
- Designed and modeled the Anti-Money Laundering data model to conform to AML data usage/use cases established by AML vendors and business unit staff; worked closely on requirements gathering as well as metadata repository creation
- Successfully architected, modeled and developed the initial project's customer-based Denodo views, sourcing from four disparate source systems to create a unified enterprise customer view
Confidential, New York, NY
Senior Informatica ETL Developer Lead
Responsibilities:
- Senior Informatica ETL Development Lead for the Financial Capital Markets, Prime Brokerage and Derivatives organization's mission-critical BI Data Warehouse project utilizing Informatica PowerCenter 8.6.1, ensuring peak performance and optimization of ETLs and Oracle database objects
- Re-engineered and delivered an existing portfolio of Informatica mappings and workflows requiring process corrections and optimization; enhancements included appropriate source filtering and Joiner and Lookup transformation corrections, ensuring efficient use of applicable transformations and removing bottlenecks revealed in session logs
- Provide expert-level consultation and documentation of ETL standards, Informatica PowerCenter tool best practices concerning various transformations, mapping designs, sessions and workflows.
- Effectively support daily and weekly trade/activity/fails reporting to the Federal Reserve Bank of NY, generated by Informatica mappings and workflow loads to an Oracle database and presented by OBIEE
Confidential, Hackensack, NJ
Senior Informatica ETL Developer
Responsibilities:
- Senior Informatica ETL Developer successfully delivering quality solutions for the S&P Data Warehouse/BI through client-facing requirements gathering, design, development, testing and support of numerous ETL deliverables utilizing Informatica PowerCenter 8.6.1, Unix scripts and PL/SQL procedures, packages and functions
- Provide expert-level consultation by optimizing existing Informatica mappings and workflows, and create documentation of ETL standards and Informatica PowerCenter best practices concerning various transformations, mapping designs, sessions and workflows