- Accomplished IT Leader with more than 13 years of extensive experience in Business Management, Information Technology Management and Data Management at both small companies and large Fortune 500 enterprises.
- Experienced in establishing robust technology strategies, roadmaps and best-in-class operational platforms aligned with unique business needs, serving as leader and liaison to global cross-functional teams.
- Strong influencing/negotiation skills, with experience selling concepts and investment to C-level executives.
- Extensive leadership experience within data management, data analytics, data governance, business intelligence, business analytics, and architecture.
- Strong project management skills and demonstrated ability to drive large and complex cross-functional projects.
- Excellent people management skills: planning schedules, coordinating staff and allocating resources to maximize efficiency and productivity. Mentors junior staff members, developing teams and leaders to ensure that business value is maximized and goals are achieved.
- Highly experienced in developing, implementing and maintaining policies, procedures, standards, guidelines and governance processes that promote the effective and productive use of data in corporate information systems.
- Expert in the overall information architecture design across multiple platform types (operational, analytical, reporting) balancing the need for access against security, compliance, analytics and performance requirements.
- Strong data analytics and business analytics skills. Experienced with customer analytics, including direct marketing, product selection, pricing strategy and customer relationship management (CRM).
- Experience in big data technologies such as Hadoop, Hive, Impala, SOLR, Sqoop and NoSQL databases. Currently functioning as technical lead for high-visibility big data engineering efforts (Enterprise Data Lake), responsible for defining all technical aspects of the solution, including the approach, architecture, analyses, technologies and execution.
- Executed multiple end-to-end implementations of Enterprise Data Warehouse, Data Lake and BI strategy roadmaps (current state to future state), including analysis, architecture, data and process models, requirements, design, business rules and metadata for business and IT iterative review.
- Highly experienced in both SMP and MPP databases such as Teradata, Netezza, Oracle and MS SQL Server. Expert in database performance tuning and best practices.
- Strong system integration architecture experience using Informatica PowerCenter, custom ETL coding (stored procedures, functions) and MS SQL Server Integration Services (SSIS).
- Business Intelligence and data visualization experience using Business Objects, OBIEE, Tableau, QlikSense, Crystal Reports and Essbase.
Confidential, Denver, CO
Sr. Enterprise Architect/ Lead
- Leading a global team of architects, analysts, and ETL and BI developers across the Americas, EMEA and APAC. Responsible for providing overall direction and oversight of the strategy, architecture, governance, development, implementation and administration of Arrow's strategic data assets.
- Created a roadmap to drive the future state of Arrow data platforms, including the Enterprise Data Lake, Data Warehouse and data marts. Presented the future-state architecture to C-level executives.
- Accountable for managing and overseeing the initiatives of the data management program including prioritizing work, resource planning/resource utilization, setting milestones, reviewing progress, and reporting status to upper management.
- Establish methods and procedures for Data Governance and Master Data Management, including tracking data quality, completeness, redundancy and improvement. Engage Global Process Owners to define master data and process health metrics, agree on targets, and create plans to manage toward those targets.
- Part of the enterprise Data Governance Committee responsible for policies, standards, requirements, guidelines and data definitions. Worked with Information Security to put the data security model (authorization and authentication) in place, and with the Infrastructure team to set up best-in-class infrastructure (application servers, databases, etc.).
- Leading and overseeing the design and development of Arrow's Enterprise Data Lake. Worked with the information security team to define the Data Lake security framework using Kerberos and Sentry. Set up the guidelines, policies, standards and governance model around the Data Lake.
- Ingested various data types (relational, files, logs and web) in batch (Sqoop) or real time into HDFS, Hive/Impala or HBase for operational reporting and predictive analytics using QlikSense, SOLR and Java applications.
- Partnered with the business on multiple analytical/predictive modeling projects on the Data Lake to determine global parts pricing strategy, the best parts price quoting strategy, and planning and backlog data insights.
- Worked with key business stakeholders to facilitate an evolution in information consumption, moving from a printed medium to thoughtfully planned analytical QlikView dashboards focused on the key performance indicators that business leaders use to drive the business.
- Conducts research into new technologies, including tools, components and frameworks, presenting findings to the management team and peer groups as requested. Proofs of concept (POCs) included the Oracle Big Data Discovery tool, Cloudera Optimizer and Azure Data Catalog for enterprise metadata management/repository.
- Leading a team of global analysts, architects and developers to design and develop data marts such as AR, AP, GL, Sales and Assets for analytical reporting.
- Led the effort to implement a Teradata Disaster Recovery (DR) solution, working with cross-functional teams from Infrastructure, Networking and Network Security as well as outside vendors.
- Work closely with the program management office (PMO) to ensure alignment of plans (including migration plans and road maps) with what is being delivered (projects).
- Worked with the development team to remove operational inefficiencies in the current process. Led the effort that reduced ETL load cycle time by more than 50% while establishing and maintaining data access, security and integrity controls; participates in quality control audits to ensure standards are maintained.
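The Sqoop-based batch ingestion into HDFS/Hive described above follows a command pattern like the sketch below. This is an illustrative example only; the JDBC URL, table name and HDFS paths are hypothetical placeholders, not details from the actual environment.

```python
# Minimal sketch of assembling a Sqoop batch-import invocation.
# All connection details and names below are hypothetical.

def build_sqoop_import(jdbc_url, table, target_dir, num_mappers=4):
    """Assemble a Sqoop CLI invocation as an argument list."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,              # source RDBMS connection
        "--table", table,                   # table to ingest
        "--target-dir", target_dir,         # HDFS landing path
        "--num-mappers", str(num_mappers),  # parallel map tasks
        "--hive-import",                    # also register the data in Hive
    ]

cmd = build_sqoop_import(
    "jdbc:oracle:thin:@//dbhost:1521/ORCL",  # hypothetical source
    "SALES_ORDERS",
    "/data/lake/raw/sales_orders",
)
print(" ".join(cmd))
```

In practice the generated command line would be handed to a scheduler; the degree of parallelism (`--num-mappers`) is tuned per source table.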
Environment: Erwin 9.0, Teradata 14.10, Oracle 12c/11g, QlikSense, OBIEE, Golden Gate, Cloudera Hadoop Distribution, Cloudera Optimizer, Informatica Power Center 9.6, MS Visio
Confidential, Denver, CO
- Establish and maintain data management and governance best practices and standards, and ensure consistent application of practices across the organization.
- Defines data/information architecture standards including, but not limited to, data modeling, data integrity, database standards, policies and procedures for the organization, and the structure, attributes and nomenclature of data elements, and applies accepted data content standards to technology projects. Also developed data architecture in support of the strategy for operational data, business rules management and metadata management.
- Facilitate consistent business analysis, data acquisition and access analysis and design, Database Management Systems optimization, archiving and recovery strategy, load strategy design and implementation, security and change management at the enterprise level.
- Actively facilitate/participate in business meetings and JAD sessions to finalize requirements. Interacted heavily with business users and prepared a bus matrix for reporting needs.
- Designed the Conceptual and Logical model based on the requirements and presented the report and the model to users. Designed the Physical model in compliance with naming standards, generated the DDL and performed the capacity planning for the production database.
- Led the team that designed and developed enterprise customer campaign management for the marketing department. Multiple campaigns were successfully implemented, including postal mail, email and pay-per-view event campaigns. Complete US household data was used to run the campaigns, with plans to integrate weather forecast data into future campaigns.
- Designed the Product Data Mart to give the business analytics on customer preferences across different channels. The data mart was used for product analytics, supporting better business decisions such as bundling of different channels, premium channel offerings and pay-per-view events.
- Led the effort to design the structure and contents of the Enterprise Information Management Portal (SharePoint), which acts as a one-stop shop for business users and internal IT users to get complete information (data dictionary, metadata, KPIs, production alerts, upcoming/current project details, etc.).
- Work with the Data Governance Team to maintain global data definitions used in the data warehouse. Document, maintain and enforce data standards for the Data Warehouse, including security, archiving, backup and recovery.
- Spearheaded the effort to migrate ER/Studio models to Erwin. Responsible for installation and setup of the Erwin repository (on a VM server). Created policies and standard guidelines for Erwin, including a standard template with a naming-standard file and domains, customized model metadata generation/reports (using Crystal Reports), and a standard template for DDL generation. Also created Erwin macros to automate DDL generation.
- Supervise the development of Confidential's EDW (Teradata) and ODS (Netezza), providing guidance to the ETL developers. Assessed data quality requirements in terms of completeness, consistency, conformity and accuracy; measured data quality; and designed and suggested data profiling and data quality improvement solutions to analyze, match, cleanse and consolidate data before loading to the warehouse.
Environment: Erwin 9.0, Embarcadero ER/Studio 7.6.1, Teradata 12/13.10, Netezza 6.0, Oracle 11g, Tableau, OBIEE, Aprimo, Essbase, Informatica Power Center 9.1, JIRA, Golden Gate, BTEQ, Nzsql, MS Visio, PVCS.
Confidential, Westborough, MA
- Actively participated in business meetings and presentations to understand the requirements and the overall business.
- Designed the logical model for the highly complex CBR core functionality, using a hybrid design approach to facilitate update, cancel and "as-of" transactions.
- Converted the logical model into a physical model with proper naming standards and best practices. Used Oracle partitioning to implement parallel processing. Also set up Oracle database best practices and the data quality process.
- Designed the semantic layer as Oracle views/materialized views and summary tables for the Business Intelligence reporting layer. Identified performance bottlenecks and tuned SQL queries.
Environment: Toad Data Modeler 3.4.1, Oracle 10g, SQL Loader, Toad, Subversion (svn), MS Visio, CA7.
Confidential, Mclean, VA
- Involved in the early phase of the project. Actively participated in business meetings and JAD sessions to finalize requirements.
- Developed and improved data modeling and data integrity standards; proposed and encouraged new domains for enterprise-wide data consistency.
- Designed the logical model based on the requirements and presented the report and the model to the customer and the internal peer review committee.
- Designed the physical model in compliance with MISMO and Freddie Mac naming standards, generated the DDL, and performed capacity planning for the production database. Also worked on loading the productionized data model into the centralized Metadata Repository (MDR).
- Identified and proposed the security framework to handle sensitive PPI data in XML format.
Environment: ER/Studio 7.6.1, Oracle 11g, DB2 UDB, OBIEE, DBArtisan, Rational ClearCase, MS Visio
Confidential, Beaverton, OR
Sr. Data Warehouse Developer / Data Modeler
- Initiated the BC (Business Conception) phase of the project for needs analysis and to define business and functional specifications. Interacted heavily with business users and prepared the reporting bus matrix.
- Architected the Rubicon Mart, performing conceptual, logical and physical modeling. Translated the high-level design requirements for reliability indicators into detailed design and ETL mapping specifications. Also created SSIS packages and stored procedures to move data from the Data Warehouse to the Mart.
- Reconciled discrepancies in current state logical and physical data models, and collaborated with other team members to model additional entities and relationships. Resolved changes in logical data model into physical database change requests.
- As the subject matter expert (SME) for the Rubicon Data Warehouse/Marts, was responsible for production support and resolved many production issues by deploying hot fixes.
Environment: ER/Studio 7, MS SQL Server 2005, T-SQL, SQL Server Integration Services (SSIS).
Confidential, Charlotte, NC
Data Modeler/ ETL Architect
- Involved in the SD (System Design) phase of the project; proposed the overall project design and prepared the technical design document.
- The metadata repository contains details about the database, reports and their attributes, as well as ClearCase data; future plans include integrating ETL metadata and LDAP server data.
- Designed the KB (knowledge base) schema, performing conceptual, logical and physical modeling.
- Developed and improved data modeling and data integrity standards, as well as data architecture policies and procedures. Used macros and a standard abbreviation file for naming standards in Erwin.
Environment: Erwin 4.1, MS SQL Server 2005, Oracle 10g, Informatica 7.3, PL/SQL, VISIO, UNIX, Clear Case.
Confidential, Grand Rapids, MI
Sr. Data Warehouse Developer
- Architected the Class III ODS (Operational Data Store), performing conceptual, logical and physical modeling. Also worked on the design of the staging schema to handle and format (data cleansing and data profiling) unstructured source data.
- Models were designed in Erwin using subject areas to tailor manageable sub-models for the technical and business communities. Created an "as-is" model of current information assets and a "to-be" model of future information assets.
- Created new procedures, packages, functions and views, and modified similar objects to meet changed system requirements. Analyzed code and views to make them more efficient and maintainable. Created new procedures and packages to load Slowly Changing Dimension (SCD) type 1 and type 2 data into the ODS tables.
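The SCD type 2 loading mentioned above can be sketched as follows. This is a minimal illustration of the pattern (expire the current row and insert a new version when a tracked attribute changes), shown in Python with SQLite rather than the PL/SQL used in the role; the table and column names are hypothetical.

```python
# Illustrative SCD Type 2 sketch: hypothetical dim_customer table.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,       -- NULL means still-current version
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Denver', '2020-01-01', NULL, 1)")

def apply_scd2(cur, customer_id, new_city, as_of):
    """If the tracked attribute changed, close the old row and open a new one."""
    cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[0] != new_city:
        # Expire the previous version...
        cur.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (as_of, customer_id),
        )
        # ...and insert the new current version.
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, as_of),
        )

apply_scd2(cur, 1, "Boulder", "2021-06-01")
```

An SCD type 1 load would instead simply overwrite `city` in place, keeping no history.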
Environment: Erwin 4.1, Oracle 10g/9i, SQL, PL/SQL, SQL*Plus, Sql*Loader, TOAD, MS VISIO, UNIX.
Confidential, Lansing, MI
- Developed, documented and implemented standards across Development, QA and Production environments. Developed and maintained standards for ETL administration and operation, including scheduling, running and monitoring all ETL jobs, event logging, error management, recovery from failures, and validation of outputs against source systems.
- Participated in regular Informatica (v8.1) administration activities such as taking regular repository backups, applying patches, creating users with different privileges, and maintaining the repository.
- Involved in the data loading sequence; populated the staging area and warehouse per business rules. Loaded data to the staging area (Oracle 10g) using Informatica PowerCenter 8.1, worked with pre- and post-sessions, and extracted data from the transaction system into the staging area.
Environment: Informatica PowerCenter 8.1, Oracle 10g/9.1/8i, PL/SQL, SQL*Loader, Erwin 4.1, TOAD, PL/SQL Developer, Autosys, Windows, MS VISIO, UNIX, Crystal Reports 11, Clear Case.
Confidential, Rochester Hills, MI
Data warehouse Developer
- Involved in the data loading sequence; populated the staging area and warehouse per business rules.
- Developed Informatica mappings and mapplets, tuning them for optimal performance, dependencies and batch design.
- Responsible for managing the Business Objects reporting environment, including repository management; report specification, creation and development (reports, templates, universe creation); production maintenance; and distribution strategy.
- Created reports using functionality such as queries, slice and dice, drill down, drill by, and drill through. Introduced aggregate awareness (@Aggregate_Aware) functionality in the universe.
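The idea behind aggregate awareness in a universe is that a query is routed to a pre-aggregated summary table when the summary covers every column the query needs, and falls back to the detail fact table otherwise. The sketch below illustrates that routing decision; the table and column names are hypothetical, not from the actual universe.

```python
# Hedged illustration of aggregate-aware table selection.
# All table/column names are hypothetical placeholders.

SUMMARY_COLUMNS = {"region", "year", "total_sales"}   # grain of the summary table
SUMMARY_TABLE = "agg_sales_by_region_year"
DETAIL_TABLE = "fact_sales_detail"

def choose_table(requested_columns):
    """Pick the cheapest table that can answer the query."""
    if set(requested_columns) <= SUMMARY_COLUMNS:
        return SUMMARY_TABLE   # summary satisfies the whole request
    return DETAIL_TABLE        # query needs detail-level columns

# A region/total_sales query can use the summary; adding customer_id forces detail.
print(choose_table({"region", "total_sales"}))
print(choose_table({"region", "customer_id", "total_sales"}))
```

In Business Objects this routing is declared in the universe rather than coded; the designer wraps each measure in @Aggregate_Aware with candidate tables listed from most to least aggregated.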
Environment: Informatica PowerCenter 8.1, Business Objects 6.0, Oracle 10g/9.1/8i, VISIO, UNIX, Clear Case.