
Azure Data Architect Resume


SUMMARY

  • A seasoned Azure Data / DWH / Analytics Architect with close to 17 years of experience in IT, with a strong capability to provide high-quality Cloud/Data/BI/ETL Solutions, Data Strategies and Architecture across a varied Technology Landscape, helping organizations maximize the utilization of their existing data investments.
  • Delivered several Data Strategy Implementations across multiple client verticals that are high in performance throughput and data availability, time- and cost-efficient, and scalable.
  • Expertise in Data Transformation, Data Governance, Data Management, Data Analytics and Data Visualization using various Data Tools and frameworks.
  • Facilitated several organizations through their digital transformation journeys by helping them migrate from On-Prem to Cloud solutions using the MS Azure Stack.
  • Acted as a Trusted Liaison between Business Stakeholders and IT members to help deliver many BI projects.
  • Worked across Microsoft Azure Data Lake, Databricks, Data Factory, Cognitive Search, and Azure SQL (ASQL) Database.
  • Extensively worked with BI tools such as MS Power BI, Tableau, BI/BW/HANA/Business Objects, Crystal Reports, Web, MicroStrategy, and ETL tools such as SAP BO Data Services, Alteryx and Informatica.
  • Designed POV Prototypes for Enterprise search pipelines using HealthCare data with Elasticsearch, Logstash and Kibana (ELK Stack) setup on AWS and GCP clusters.
  • Exhibited seasoned judgment to arbitrate approaches to solving customer needs, business requirements and technical pragmatism.
  • Provided several Training Sessions and Demo Workshops for End Users / Power Users.
  • Worked extensively with companies following Agile methodologies.
  • Provided Technical Leadership, Project & Product Management guidance to team members.
  • Presented Data Strategy and Future Roadmaps to C-level Executives and Technical Audiences.

TECHNICAL SKILLS

Database(s): SAP HANA 1.0 SPS 09, SAP BW 7.4/BW 3.5, ORACLE 7.3/8/8i/9i/10g/11i, Sybase ASE 12.5, DB2 UDB, MS SQL Server 2012/2019, Teradata 13/14, Postgres, Hyperion Essbase, Sybase IQ 16

Database Admin Tools: Oracle DBA Studio, TOAD, SQL * Plus, MS SQL Server 2012/2008 SSMS, Teradata SQL Assistant

ETL Tools: Informatica Power Center 6.2, SAP Business Objects Data Services 4.2, Discovery Hub

Data modeling Tool: Star Schema, Snowflake Schema Modeling, ERWIN 6.0, HANA Modeler 1.0 SPS 09

Reporting Tools: BI 4.1 SP2 (Webi RC), BOBJ 4 Mobi 4.x, Business Objects Explorer, Business Objects XI R3.1 WebI InfoView, Crystal Reports 2016 (BI 4.2) / 2008 (XI R3.1), Actuate, Domo, SiSense, MS Power BI Desktop, QlikView 11, Tableau 10.1 Desktop / Server, Hyperion Interactive Reporting, Microstrategy 9.4.1 Architect / Desktop / Web

Big Data: Apache Hadoop, HDFS, Hive QL, MapReduce, HBase, ELK Stack

Programming Languages: C, C++, XML, Oracle PL/SQL, MS SQL Server T-SQL, UNIX Shell Script, SAS and R

Cloud Platforms: MS Azure DataLake, DataFactory, DataBricks, Cognitive Search, ASQL Database.

PROFESSIONAL EXPERIENCE

Confidential

Azure Data Architect

Responsibilities:

  • Held continual interactive Data Working Sessions, including whiteboarding, with the Confidential Technical Team and the HES Business Stakeholders to gather requirements.
  • Laid down the Azure foundational framework in the HES Data Lake across folders such as Produced (consumption) and Refined (write-back of delta files).
  • Architected the Azure solution framework leveraging ADB, ASQL, ACS and ADF to implement the Confidential HES Platform CSS Tool across several disparate data sources, including Stature, Impact, BowTie, VnV Assurance and ADS Equipment.
  • Harmonized the data from the HES Data Lake for Stature, Impact, BowTie, VnV Assurance and ADS Equipment via Azure Databricks Notebook using PySpark /PY/SQL interchangeably.
  • Set up the Delta Lake within Azure Databricks to write only the incremental delta records into the Refined folder of the HES Data Lake, to be consumed by the Azure Cognitive Search (ACS) Canonical Model.
  • Set up several Master / Child Azure Databricks Notebooks for performance-efficient write-back of JSON files feeding the ACS Canonical Model, on which Indexes and the Facetable, Filterable, Sortable and Searchable options were configured.
  • Developed the Azure Data Factory Master / Child pipelines for orchestrating the data transformation activities across Stature Risk Studies, Safeguards, Impact - Incidents / Investigations, BowTie - Risk Studies, VnV Assurance - Verifications and Validations, and ADS Equipment, eventually loading into Azure SQL Database.
  • Set up the Go-Live Run Book for the Daily, Weekly and Ad-Hoc orchestration of Azure Data Factory pipelines and Azure Databricks Notebooks.
  • Conducted several iterative QA and UAT sessions to obtain sign-off and push code across the Data, API and UI teams prior to migrating into Production.
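The incremental delta pattern described above (writing only new or changed records as JSON for downstream consumption) can be sketched in plain Python; this is a hypothetical stand-in for the Databricks Delta Lake setup, and the record fields and keys are illustrative, not the actual HES schemas:

```python
import json

def extract_delta(previous, current, key="id"):
    """Return only records that are new or changed since the last run,
    mimicking an incremental delta write into a Refined folder."""
    prev_by_key = {r[key]: r for r in previous}
    return [r for r in current if prev_by_key.get(r[key]) != r]

# Illustrative snapshots of a source table (e.g. risk studies)
previous = [{"id": 1, "risk": "low"}, {"id": 2, "risk": "high"}]
current = [
    {"id": 1, "risk": "low"},
    {"id": 2, "risk": "medium"},
    {"id": 3, "risk": "high"},
]

delta = extract_delta(previous, current)
# JSON payload that would be written back for the search canonical model
payload = json.dumps(delta, indent=2)
print(payload)
```

In the actual Databricks setup this comparison would be expressed as a Delta Lake `MERGE`/change feed rather than a Python dictionary diff, but the contract is the same: only the changed rows reach the Refined folder.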

Confidential

BI Architect

Responsibilities:

  • Interacted with Business Stakeholders with whiteboarding sessions to gather requirements.
  • Designed and Developed Universes using IDT 4.2 against MS SQL Server DB tables.
  • Built SSAS (Tabular) Models, deployed them onto Azure Analysis Services, and set up security.
  • Built complex metrics using DAX calculations and KPI validations using SQL.
  • Designed and Developed dashboards and analytics using MS Power BI Desktop/Service sourcing data from Azure Analysis Service Tabular Models.
  • Ingested data from SAP ECC 7.4 via Azure Data Factory pipelines using Switch and ForEach loop activities and loaded it into Azure Data Lake.
  • Set up incremental loads to merge separate blob partition files using Delta within an Azure Databricks notebook.
  • Performed data transformations using PySpark and Spark SQL within Azure Databricks notebooks and set up loads into the target.
  • Performance-tuned and optimized Databricks notebooks and clusters by tweaking the Python code.
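The transformation step above — cleansing staged rows with SQL before loading a target — can be sketched with an in-memory SQLite database standing in for a Spark SQL session; the table and column names are illustrative, not the actual SAP ECC source schema:

```python
import sqlite3

# In-memory SQLite stands in for a Spark SQL session in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE staged_orders (order_id INTEGER, amount REAL, status TEXT)"
)
conn.executemany(
    "INSERT INTO staged_orders VALUES (?, ?, ?)",
    [(1, 100.0, "OPEN"), (2, 250.0, "CLOSED"), (3, 75.0, "open")],
)

# A typical harmonization step: normalize inconsistent source codes
# before the rows are loaded into the target table.
rows = conn.execute(
    "SELECT order_id, amount, UPPER(status) AS status "
    "FROM staged_orders ORDER BY order_id"
).fetchall()
print(rows)
```

In Databricks the same statement would run as `spark.sql(...)` against a staged DataFrame registered as a temp view; SQLite is used here only so the sketch is self-contained.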

Confidential

BI Architect/Developer

Responsibilities:

  • Interacted with Business Stakeholders with interactive working sessions to gather requirements.
  • Ingested Health Networks client portal data extracts (CSV) via Azure Data Factory, first into Azure Data Lake and then into Azure SQL Database.
  • Set up delta incremental loads within Azure Data Factory to load into the target Azure SQL DB.
  • Built the Azure Analysis Services Models using the tables from Azure SQL DB.
  • Developed dashboards using MS Power BI Desktop with complex DAX calculations using data from Azure Analysis Services Models.
  • Architected, Designed and Developed solutions using MS Power BI Desktop/Service, sourcing data from Azure SQL DW and embedding them into the ABC-AIO (React JS) application using components such as .NET Server, HTML/JS, Azure Key Vault and the Power BI REST API.
  • Involved in designing the LDM and PDM for the ABC-AIO based OLAP reporting needs.
  • Built an index.html with JavaScript to mimic the Filter and Row-Level Security functionality of the ABC-AIO ReactJS front-end application within the MSPBI Embedded report.
  • Built the PAR Optimization Summary page showing a filtered card view by Hospital, based on the user logged in from the ABC-AIO React JS page, and displaying the filtered list of medications with Current PAR, Suggested PAR, Total Inventory On-Hand and Safety Stock metrics using DAX calculations.
  • Built an alternate solution for the PAR Optimization Detail page with Python/matplotlib, showing an 8-week rolling Actual versus Forecast based on Vend and Refill transaction types, Average Daily Usage, and Stockouts over the last 7/30 days, and integrated it into MSPBI visuals.
  • Worked on the Inventory Tracking Levels by Drugs, Location and Machines displaying Current On Hand, Days on Hand, Vend Velocity and Average Daily Usage involving complex DAX calculations.
  • Familiar with the latest integrations of MSPBI into MS Flow and MS Teams.
  • Worked on a POC for ReactJS to embed an MSPBI Report by installing Node.js and configuring the required MSPBI Report parameters within App.js.
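The Power BI embedding flow above hinges on the back end generating an embed token that the React front-end passes to the report. A minimal sketch of the server-side step is shown below; the `GenerateToken` endpoint is the real Power BI REST API route, but the workspace ID, report ID and AAD token are placeholders, and the request is built without being sent:

```python
import json
import urllib.request

def build_embed_token_request(workspace_id, report_id, aad_token):
    """Build (but do not send) the Power BI REST API request that
    generates a View-level embed token for a report. The resulting
    token is what a front-end (e.g. a React app using powerbi-client)
    uses to render the embedded report."""
    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
        f"/reports/{report_id}/GenerateToken"
    )
    body = json.dumps({"accessLevel": "View"}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {aad_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder IDs; in practice these come from the Power BI workspace
# and an Azure AD token obtained via MSAL or Azure Key Vault.
req = build_embed_token_request("ws-1234", "rep-5678", "<aad-token>")
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) returns a JSON body whose `token` field is handed to the embed configuration on the client side.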
