BI Architect Resume

New York City, NY

SUMMARY

  • 18 years of IT experience in ETL and data warehousing, specializing in Pentaho implementations.
  • Integration of complex enterprise-level dashboards.
  • Bridging the gap between business owners/drivers and technical teams.
  • Identifying bottlenecks in processes, tools, and technologies.
  • Optimizing resource utilization across OS, ETL, databases, reporting/BI, and batch-window availability.
  • Integration of multiple sources and targets using Pentaho.
  • Performance testing, grid optimization, and performance benchmarking.
  • Dynamic SQL scripting and data generation; data partitioning and archival.

PROFESSIONAL EXPERIENCE

Confidential, New York City, NY

BI Architect

Responsibilities:

  • Requirement Gathering and data analysis for the new Clinical Cloud database
  • Creating an optimal Data Model with proper operational hierarchy.
  • Designing a dynamic ETL approach that remains scalable as the data model grows and new hierarchies are added.
  • Creating scalable ETL processes from sources in Oracle, XML, APIs, External Files, Confidential Docs.
  • Establishing proper dependencies and error processing.
  • Implementing dynamic metadata for ETL to ensure scalability and ease of troubleshooting (sketched below).
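A minimal sketch of the dynamic-metadata idea described above, assuming a hypothetical mapping list; the tables, columns, and connection are illustrative, not the actual Clinical Cloud model:

    # Metadata-driven load sketch: each entry describes one source-to-target mapping,
    # so adding a row extends the ETL without writing a new job. Names are hypothetical.
    MAPPINGS = [
        {"source": "stg_patient", "target": "dim_patient", "columns": ["patient_id", "name", "dob"]},
        {"source": "stg_visit",   "target": "fact_visit",  "columns": ["visit_id", "patient_id", "visit_dt"]},
    ]

    def load_all(conn):
        """Run one generic INSERT..SELECT per mapping over any DB-API connection."""
        for m in MAPPINGS:
            cols = ", ".join(m["columns"])
            conn.execute(f'INSERT INTO {m["target"]} ({cols}) SELECT {cols} FROM {m["source"]}')
        conn.commit()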

Environment: AWS, Oracle, XML, APIs, Pentaho ETL

Confidential, New York City, NY

BI Architect

Responsibilities:

  • Requirement Gathering and data analysis.
  • Creating an optimal data model with proper naming conventions to facilitate automated validation (see the validation sketch after this list).
  • Creating scalable ETL processes from sources in Dremel, BigQuery, external files, and Confidential Docs.
  • Establishing proper dependencies and error processing.
  • Optimizing ETL performance and generating high-performance extracts to be consumed in Tableau.
  • Implementing dynamic metadata for ETL to ensure scalability and ease of troubleshooting.
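A rough illustration of how such naming conventions can drive automated validation: the suffix of a column name selects a generic check. The suffixes, tables, and rules below are assumptions for the sketch, not the project's actual conventions.

    # Convention-based validation sketch: the suffix of a column name decides which
    # generic check is generated for it. All names and rules are illustrative.
    CHECKS_BY_SUFFIX = {
        "_id":  "SELECT COUNT(*) FROM {t} WHERE {c} IS NULL",          # keys must be populated
        "_dt":  "SELECT COUNT(*) FROM {t} WHERE {c} > CURRENT_DATE",   # dates must not be in the future
        "_amt": "SELECT COUNT(*) FROM {t} WHERE {c} < 0",              # amounts must be non-negative
    }

    def validation_queries(table, columns):
        """Yield (column, sql) pairs for every column whose suffix has a rule."""
        for col in columns:
            for suffix, template in CHECKS_BY_SUFFIX.items():
                if col.endswith(suffix):
                    yield col, template.format(t=table, c=col)

    for col, sql in validation_queries("fact_orders", ["order_id", "order_dt", "net_amt", "notes"]):
        print(col, "->", sql)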

Environment: AWS Redshift, Salesforce, Gainsight, Pentaho ETL, Tableau

Confidential, Mountain View, CA

BI Architect

Responsibilities:

  • Requirement Gathering and data analysis.
  • Creating an optimal data model with proper naming conventions to facilitate automated validation.
  • Creating scalable ETL processes from sources in Dremel, BigQuery, external files, and Confidential Docs.
  • Establishing proper dependencies and error processing.
  • Optimizing ETL performance and generating high-performance extracts to be consumed in Tableau (see the extract sketch after this list).
  • Implementing dynamic metadata for ETL to ensure scalability and ease of troubleshooting.
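A sketch of producing such a pre-aggregated extract with the google-cloud-bigquery client; the project, dataset, table, and columns are made up, and credentials/project configuration are assumed to be in place:

    # Extract sketch: aggregate in BigQuery, then write a small flat file for Tableau.
    # Dataset, table, and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
        SELECT usage_date, product, SUM(events) AS events
        FROM `my_project.analytics.daily_usage`
        GROUP BY usage_date, product
    """
    rows = client.query(sql).result()            # run the aggregation in BigQuery
    with open("usage_extract.csv", "w") as out:  # flat, pre-aggregated extract for Tableau
        out.write("usage_date,product,events\n")
        for r in rows:
            out.write(f"{r.usage_date},{r.product},{r.events}\n")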

Environment: Dremel, BigQuery, Pentaho ETL, Pantheon, Tableau

Confidential, New York, NY

ETL Architect

Responsibilities:

  • Requirement Gathering from the Analytics team and Centers for Medicare & Medicaid Services (CMS)
  • Data warehouse modeling to create the source data model and star schema for the data warehouse.
  • Creating the ETL architecture for large data volumes using Pentaho Data Integration.
  • Used FastJSON, Java, and MySQL scripts to retrieve source data (see the retrieval sketch below).
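A simplified retrieval sketch, written in Python with mysql-connector-python rather than the Java/FastJSON used on the project; the connection details, table, and fields are placeholders:

    # Source-retrieval sketch (placeholder connection, table, and column names):
    # read staged rows from MySQL and parse the JSON payload each row carries.
    import json
    import mysql.connector

    conn = mysql.connector.connect(host="localhost", user="etl", password="...", database="claims_src")
    cur = conn.cursor(dictionary=True)
    cur.execute("SELECT claim_id, payload FROM claim_staging WHERE processed = 0")
    for row in cur:
        doc = json.loads(row["payload"])      # the source stores each claim as a JSON document
        print(row["claim_id"], doc.get("provider"), doc.get("amount"))
    conn.close()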

Environment: Pentaho BI Suite, MySQL, MongoDB

Confidential, New York, NY

Data Warehouse Architect/ETL Engineer

Responsibilities:

  • Requirement Gathering from the Business Users in Advertising and Reporting
  • Defining source-to-target mappings for ETL in Pentaho and the dependency matrix for ETL and reporting.
  • Incorporating reconciliation using Confidential Analytics and source/target data profiling.
  • Dynamic ETL in Pentaho using SugarCRM metadata to keep pace with the numerous changes in the CRM analytics, ensuring that ETL development is no longer a bottleneck for business users.
  • Incorporating data from Redshift and web logs in Hadoop.
  • Scripting in Pig to create rollups of weblogs that feed the data warehouse (see the rollup sketch below).
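The rollup itself was done in Pig; the Python sketch below mirrors the same grouping logic against a hypothetical tab-separated log layout (date, page, bytes):

    # Weblog rollup sketch: aggregate hits and bytes per (date, page) before loading
    # into the warehouse. The log layout and file name are illustrative.
    import csv
    from collections import defaultdict

    def rollup(log_path):
        totals = defaultdict(lambda: {"hits": 0, "bytes": 0})
        with open(log_path, newline="") as f:
            for date, page, size in csv.reader(f, delimiter="\t"):
                key = (date, page)
                totals[key]["hits"] += 1
                totals[key]["bytes"] += int(size)
        return totals

    for (date, page), agg in sorted(rollup("weblog.tsv").items()):
        print(date, page, agg["hits"], agg["bytes"])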

Environment: Pentaho BI Suite, Redshift, Oracle, Confidential Analytics.

Confidential

Product Technical Architect

Responsibilities:

  • Source system analysis: the source systems were mainly Oracle databases along with flat-file or XML feeds from non-Finacle systems. The systems were analyzed to bring all data sources into a de-normalized, generic format that could be loaded to target dimensions and facts through a pre-staging and staging database. Source-to-target mappings were defined for 200 targets.
  • Target data warehouse design: the data warehouse was designed as a Ralph Kimball star schema with minor snowflaking and global dimensions. The data model was created in Erwin and the tables were optimized for downstream reporting.
  • Reporting design: designing 50 off-the-shelf reports as part of the product. These reports were created in Tableau format for easier deployment.
  • Performance benchmarking: using bulk data generation, all sources were populated and the product was performance-tested using Confidential Labs, analyzing Oracle AWR and nmon reports (see the data-generation sketch after this list).
  • A key feature of this product was dynamic ETL: product metadata and semantic layers were used to create the ETL and all related objects dynamically.
  • Presenting solutions to clients and incorporating new features.
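A toy version of the bulk data generation used for benchmarking; the row layout, volumes, and file name are illustrative, and the output is a flat file suitable for a bulk loader:

    # Bulk data generation sketch for performance benchmarking; layout and volumes
    # are illustrative only.
    import csv
    import random
    from datetime import date, timedelta

    ROWS = 1_000_000
    START = date(2015, 1, 1)

    with open("txn_benchmark.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["txn_id", "account_id", "txn_date", "amount"])
        for i in range(ROWS):
            w.writerow([
                i,
                random.randint(1, 50_000),                        # synthetic account key
                START + timedelta(days=random.randint(0, 364)),   # spread over one year
                round(random.uniform(1, 10_000), 2),              # synthetic amount
            ])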

Environment: Confidential Infosphere Information Server, Oracle, Tableau

Confidential

Data Warehouse Architect

Responsibilities:

  • Source system analysis: the source systems consisted of many relational databases, equipment-generated log files, and XML files. Typically, any equipment connected to the network creates 2-5 types of data with different metadata. The systems were analyzed to bring all data sources into a pre-staging database.
  • Target data warehouse design: the data warehouse was designed as a Ralph Kimball star schema with major snowflaking and global dimensions. The tables were optimized for downstream reporting.
  • ETL architecture design: implementing the source-to-target mappings using transformations created in Pentaho, Perl, and Python; implementing dependencies, SCDs, and partitioning using ETL and Oracle.
  • Reporting design: designing 100 off-the-shelf reports as part of the product in Tableau.
  • Shell scripting to control ETL steps and reporting refreshes, with monitoring and feedback to support teams.
  • KPI incorporation: the telecom domain has thousands of key performance indicators, and not all of them can be shipped as part of the product. An application interface was created to provide a drag-and-drop framework that enabled KPIs to be created directly as ETL components. This enabled in-memory reporting from Tableau to connect directly to the data warehouse, bypassing the Cognos framework.
  • A key feature of this product was the Excel source-to-target mapping workbook being used directly to create ETL source-target mappings through semantic layers (see the mapping sketch after this list).
  • Big data processing of network logs using Pig and Hive.
  • Presenting product demos to clients and incorporating new features.
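A rough sketch of reading an Excel source-to-target mapping sheet into mapping records that a semantic layer could turn into transformations; the workbook layout (one row per mapped column: source table, source column, target table, target column) is an assumption, and openpyxl stands in for whatever parser was actually used:

    # Excel STM sketch using openpyxl; the workbook layout and file name are assumptions.
    from collections import defaultdict
    from openpyxl import load_workbook

    wb = load_workbook("source_target_mapping.xlsx", read_only=True)
    sheet = wb.active

    # Group mapped columns by (source table, target table) so each group can feed
    # one generated transformation.
    by_target = defaultdict(list)
    for src_tab, src_col, tgt_tab, tgt_col in sheet.iter_rows(min_row=2, values_only=True):
        by_target[(src_tab, tgt_tab)].append((src_col, tgt_col))

    for (src_tab, tgt_tab), cols in sorted(by_target.items()):
        print(f"{src_tab} -> {tgt_tab}: {len(cols)} mapped columns")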

Environment: Confidential Infosphere Information Server, Oracle, Tableau

Confidential

Data Warehouse Architect

Responsibilities:

  • Datamart design: a couple of datamarts were designed as star schemas and optimized for ETL and reporting.
  • Dynamic SQL generation: the generation process used metadata taken directly from the ETL requirements, which considerably reduced development effort (see the SQL-generation sketch below).
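A minimal sketch of generating a load statement from such metadata; the target, source, expressions, and filter below are illustrative (T-SQL flavored, to match the SQL Server environment), not the project's actual metadata:

    # Dynamic SQL generation sketch: one metadata record per target column, including
    # an optional transformation expression. All names and expressions are illustrative.
    METADATA = {
        "target": "dim_customer",
        "source": "stg_customer",
        "columns": [
            {"target_col": "customer_key", "expr": "cust_id"},
            {"target_col": "full_name",    "expr": "UPPER(first_nm + ' ' + last_nm)"},
            {"target_col": "load_dt",      "expr": "GETDATE()"},
        ],
        "filter": "cust_id IS NOT NULL",
    }

    def build_insert(meta):
        tgt_cols = ", ".join(c["target_col"] for c in meta["columns"])
        exprs = ", ".join(c["expr"] for c in meta["columns"])
        return (f"INSERT INTO {meta['target']} ({tgt_cols})\n"
                f"SELECT {exprs}\nFROM {meta['source']}\nWHERE {meta['filter']};")

    print(build_insert(METADATA))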

Environment: Microsoft SSIS/SSRS, Unix, SQL Server, Erwin

Confidential

Data Warehouse Architect

Responsibilities:

  • Requirement analysis: analyzing site analytics and performance data, defining site performance metrics, and comparing multiple ETL and reporting tools.
  • Data warehouse design: creating the data warehouse star schema model and OLAP cubes for reporting.
  • ETL design: creating proof-of-concept ETL components in Pentaho.
  • OLAP cube design: creating OLAP cubes and the refresh strategy.
  • Tableau report and dashboard design: in-memory reporting was used to design 20+ reports and 5 dashboards, including report bursting and email delivery (see the delivery sketch after this list).
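A stripped-down sketch of the bursting and email-delivery step: one message per recipient with that recipient's extract attached; the SMTP host, addresses, and file names are placeholders:

    # Report bursting sketch: each recipient receives their own extract as an attachment.
    # SMTP host, addresses, and file names are placeholders.
    import smtplib
    from email.message import EmailMessage

    BURST_LIST = [
        ("northeast@example.com", "sales_northeast.csv"),
        ("west@example.com",      "sales_west.csv"),
    ]

    with smtplib.SMTP("smtp.example.com") as smtp:
        for recipient, report_file in BURST_LIST:
            msg = EmailMessage()
            msg["Subject"] = "Weekly sales report"
            msg["From"] = "bi-reports@example.com"
            msg["To"] = recipient
            msg.set_content("Attached is this week's report for your region.")
            with open(report_file, "rb") as f:
                msg.add_attachment(f.read(), maintype="text", subtype="csv", filename=report_file)
            smtp.send_message(msg)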

Environment: Pentaho BI, Tableau

Confidential, Houston, TX

ETL Architect

Responsibilities:

  • Requirement analysis and database design: reconciliation requirements were translated into a database design using data from flat files and ERCOT Oracle databases. Most of the reporting requirements involved reconciling retail usage data against ERCOT data (see the reconciliation sketch after this list).
  • Two datamarts were created with 40-50 tables and around 50 PL/SQL packages.
  • Packages were tuned repeatedly as data volumes increased to more than 10 million records/day.
  • DataStage ETL was used to connect to flat files and secondary data sources.
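A toy version of the retail-vs-ERCOT reconciliation: compare usage totals per meter and flag variances above a tolerance. The keys, figures, and tolerance are illustrative, not project data.

    # Reconciliation sketch: flag meters whose retail total and ERCOT-reported total
    # differ by more than a tolerance. All values are illustrative.
    TOLERANCE_KWH = 1.0

    retail = {"meter_001": 1520.0, "meter_002": 980.5, "meter_003": 410.0}
    ercot  = {"meter_001": 1520.0, "meter_002": 975.0, "meter_004": 55.0}

    for meter in sorted(set(retail) | set(ercot)):
        r, e = retail.get(meter, 0.0), ercot.get(meter, 0.0)
        diff = r - e
        if abs(diff) > TOLERANCE_KWH:
            print(f"{meter}: retail={r} ercot={e} diff={diff:+.1f} kWh")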

Environment: Erwin, Confidential Information Server, Oracle, Unix

Confidential

Oracle DBA

Responsibilities:

  • Database Installation and Configuration for Manugistics
  • Database support: post-installation support, monitoring, and single-point reconciliation of all databases; troubleshooting issues using Enterprise Manager, Statspack, and AWR reports (see the monitoring sketch after this list).
  • Optimizing database processes such as indexing, partitioning, defragmentation, SQL tuning, and database tuning.
  • Critical Production Support
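A small monitoring sketch in the spirit of that troubleshooting: list the top SQL by cumulative elapsed time from V$SQLAREA. It assumes the python-oracledb driver and placeholder connection details; the actual work relied on Enterprise Manager, Statspack, and AWR rather than a script like this.

    # Monitoring sketch using python-oracledb (placeholder connection details):
    # list the top SQL statements by cumulative elapsed time from V$SQLAREA.
    import oracledb

    conn = oracledb.connect(user="perfmon", password="...", dsn="dbhost/ORCL")
    cur = conn.cursor()
    cur.execute("""
        SELECT * FROM (
            SELECT sql_id, executions, ROUND(elapsed_time / 1e6, 1) AS elapsed_sec, sql_text
            FROM v$sqlarea
            ORDER BY elapsed_time DESC
        ) WHERE ROWNUM <= 10""")
    for sql_id, executions, elapsed_sec, sql_text in cur:
        print(sql_id, executions, elapsed_sec, sql_text[:60])
    conn.close()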

Environment: Oracle, PL/SQL, Enterprise Manager, Unix
