
Sr. Azure Data Architect/ETL Resume


Allendale, NJ

SUMMARY:

  • Nick Kletnoi brings over 19 years of experience in data warehousing development, with a strong background across related areas of this technology: designing complex POCs on multi-fact-table star schemas that collect important BI KPIs; developing industry-standard SSIS and ADF ETL packages that perform intricate data pulls and populate fact and dimension tables from a variety of source systems; and building complex, data-driven production-engineering constructs that dynamically partition large corporate cubes with DSO and AMO, alongside Azure cloud integration solutions. Experienced in setting up and migrating to Azure SQL Data Warehouse;
  • Precision-crafted front-end reporting solutions with Microsoft products, utilizing expert knowledge of SQL, MDX, and DAX with the latest Power BI offerings, including cloud-based Azure SQL and Power Pivot.
  • Extensive IT experience across industries, gained through hands-on implementation of very large-scale BI projects and by scaling complex BI pilots using current development paradigms.
  • Expert in major, complex financial data migrations and major master data efforts.
  • Expert in PySpark jobs and in improving the performance of Databricks notebooks (a representative tuning sketch follows this summary);
  • Expert in modeling star and snowflake data warehousing solutions with SQL Server 2019 and Analysis Services 2019 (SSAS multidimensional and Tabular) as well as earlier versions (SSAS 2016, SSAS 2014, and prior) and SSRS, translating complex business requirements into valid, scalable schemas; expert in Power BI and Power BI Embedded; Python; AWS Glue;
  • Expert in highly complex SQL, with extensive experience using cloud technologies such as Azure Data Lake Analytics;
  • Azure SQL and cloud-based data warehousing solutions with SSAS Tabular, including enablement of complex DAX metrics;
  • Data ETL with SSIS 2017: expert in complex data migrations built as multi-level, multi-step automated data movement with multiple fail-safe flow mechanisms; authored an ETL framework;
  • Extensive experience developing advanced data movement with complex SSIS packages and ADF.
  • Architected cubes and data-mart loads for both SSAS 2014 and SSAS 2017; data modeling with Erwin 10.1.
  • Expert in production engineering for 50+ TB cubes and author of data-driven partitioning constructs; Azure Data Lake and Hadoop (Hortonworks HDC in Azure); Azure SQL Data Warehouse in the cloud; Azure Data Lake Analytics;
  • Migrated SSIS packages into the cloud with Azure Data Factory and designed cloud machine-learning components and U-SQL;
  • Expert knowledge of financial reporting, financial close procedures, and revenue recognition with major ERP packages (Navision, SAP ECC, and PeopleSoft), using SQL and Python for data movement; Azure streaming;
  • Large database design, both OLAP and OLTP (data marts and subject areas), with experience migrating to the cloud.
  • Extensive experience programming T-SQL and PL/SQL and writing complex stored procedures; complex relational and dimensional data modeling; DB2 SQL, Teradata SQL, and Oracle PL/SQL, including Oracle Cloud migrations.
  • Expert in complex solutions and multidimensional mathematical modeling with MDX and DAX.
  • Extensive experience translating complex business requirements into multidimensional and tabular models.
  • Hands-on maintenance of large data warehouses in Redshift; scaling complex pilots to Azure data warehouses.
  • Expert-level understanding of and experience with intricate, industry-specific data warehouse design patterns; expert at redesigning ‘unworkable’ designs into models that work; improving Databricks performance;
  • AWS S3 storage setup and orchestration of AWS environments with the CloudFormation component of AWS; JSON scripting expert with extensive knowledge of MongoDB as a repository for JSON documents; Hive programming and scripting.
  • Pig for ETL, including Pig UDFs written in Python to extend Pig processing; DevOps; TFS server.
  • Microsoft Reporting Services reports with SQL, MDX, and DAX; Power BI and Azure data warehousing.
  • Auxiliary accounting: GAAP, GL, AR, AP, cost accounting, financial reporting, cost allocation methodologies, fixed and variable cost calculations, cost drivers, profitability analysis, forecast modeling, and operational budgeting.
  • Expert in Azure DevOps: heavy experience setting up AWS environments using Terraform and AWS CloudFormation tools for the web; expert in narrowing agile code-to-production deployment windows by fostering collaboration between developer teams and using the most up-to-date TFS package-deployment facilities for code operations, with effective use of these AWS cloud solutions for DevOps; Azure DevOps and Databricks expert;
  • More than 19 years of OLAP experience across industries: financial, healthcare, food, mortgage, manufacturing, entertainment, retail, software consulting, government, distribution, high-tech, and subscription businesses.
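
As a concrete illustration of the Databricks notebook tuning referenced in this summary, the following is a minimal PySpark sketch; the table names, columns, and join shape are hypothetical stand-ins rather than client code.

```python
# Hypothetical sketch of common Databricks/PySpark tuning steps:
# broadcast the small dimension, repartition the large fact table on
# its join key, and cache a result that later notebook cells reuse.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

fact = spark.table("sales_fact")    # large fact table (hypothetical name)
dim = spark.table("product_dim")    # small dimension (hypothetical name)

joined = (
    fact.repartition("product_key")             # align partitions with the join key
        .join(F.broadcast(dim), "product_key")  # broadcast join avoids shuffling the dimension
        .groupBy("product_category")
        .agg(F.sum("sales_amount").alias("total_sales"))
)
joined.cache()   # reused by downstream cells, so cache once
joined.count()   # materialize the cache eagerly
```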

WORK EXPERIENCE:

Confidential, Allendale, NJ

Sr. Azure Data Architect/ETL

Responsibilities:

  • Architecting and directing Power BI reports, visuals, and strategic deployment, including Power BI dashboards and the Azure SQL Server strategy for the Craniomaxillofacial (CMF) division of this global client; designed a data-centric, special ‘case analysis’ framework POC to help streamline component CMF costs, along with a self-serve SSAS cube for this purpose; designed PowerShell scripts that call the Power BI Refresh API to trigger dataset refreshes (a Python sketch of the same call follows this list); provided direction for backend tables and backend design components; Power BI Embedded; developed the ad-hoc SSAS Azure service and cloud components; SSRS-to-Power BI migrations; InfoPath-to-PowerApps migrations; used Azure DevOps for code rollout and stress-testing of Power BI reports;
  • ADF for advanced data movement with U-SQL; Azure SQL Database or SSAS for the main data storage; migration of SSAS from on-premises to Azure Analysis Services; Azure Blob Storage for Power BI reports;
  • Azure Data Lake Storage for archival and long-term comprehensive storage, plus enablement of Azure Data Lake Analytics; hands-on U-SQL coding for movement of data from the Data Lake into Azure SQL; Databricks notebooks;
  • Developed ADF packages by converting SSIS packages to the Azure-SSIS runtime; set up Azure machine-learning models for predicting component malfunctions and evaluated algorithm effectiveness; ETL and enablement of Azure Search; Spark jobs;
  • Azure Automation scripting for security and VM deployments using ARM templates in PowerShell; Power BI refreshes with PowerShell; setting up Azure HDInsight clusters with PowerShell; processing REST APIs with PowerShell;
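
The refresh triggers above were written in PowerShell; here is a minimal Python sketch of the same Power BI REST API call, with placeholder IDs and an already-acquired AAD token assumed.

```python
# Hypothetical sketch: queue a Power BI dataset refresh via the
# Power BI REST API. Token acquisition (e.g., through MSAL) is assumed
# to have happened already; all IDs below are placeholders.
import requests

GROUP_ID = "<workspace-guid>"    # placeholder workspace (group) id
DATASET_ID = "<dataset-guid>"    # placeholder dataset id
TOKEN = "<aad-access-token>"     # placeholder AAD bearer token

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()  # the service answers 202 Accepted when the refresh is queued
print("Refresh queued:", resp.status_code)
```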

Confidential, Washington, DC

Sr. Data Architect/ETL - Cloud and Big Data

Responsibilities:

  • Designed, architected, and implemented a revenue recognition data-load engine satisfying regulatory compliance requirements (ASC 605 and 606) for Confidential, meeting complex technical requirements; designed Power BI reports, effective visual Power BI dashboards, and Power BI Embedded components; PowerShell;
  • Provided source code for balance data migration from the accounting system of record (Navision) to Oracle ERP Cloud; modules covered were GL, Fixed Assets, Payables, and Receivables, using SQL for ETL; automation;
  • Architected a comprehensive ‘Product SKU mix’ (Type II history) data engine in Python that defined complex sets of product mixes (15k variations) sold over time and calculated Fair Value (FV) for those mixes over time for revenue recognition and reporting purposes; this was the crucial, pivotal part of the entire $3 million project. Created and maintained master tables for those mixes to perform lookups and assign new Mix IDs each day (see the Mix ID sketch after this list), a complex operation manipulating data with SQL and Python; architected the Power BI approach; improved Databricks performance;
  • Designed an ETL framework for efficient, automatic data loading into the Rev-Rec system from Navision and resolved complex hierarchy issues to facilitate these loads, including straight-line revenue recognition.
  • Designed and delivered a Finance cube with Power BI reports, all uploaded to the cloud, which merged two very complex source systems; uploaded source data to AWS S3 storage, used SQL, wrote Python functions to process and modify data, and architected a Power BI strategy; Databricks notebooks;
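
A minimal pandas sketch of the daily Mix ID assignment described above; the canonical key (a sorted, comma-joined SKU list) and all table and column names are illustrative assumptions, not the production design.

```python
# Hypothetical sketch of the daily product-mix lookup: canonicalize each
# order's SKU set, match it against the mix master table, and allocate
# new Mix IDs for combinations never seen before.
import pandas as pd

def assign_mix_ids(daily_orders: pd.DataFrame, master: pd.DataFrame):
    # Canonical key per order: the sorted, comma-joined list of SKUs.
    mixes = (daily_orders.groupby("order_id")["sku"]
             .apply(lambda s: ",".join(sorted(s)))
             .rename("mix_key")
             .reset_index())

    known = dict(zip(master["mix_key"], master["mix_id"]))
    next_id = int(master["mix_id"].max()) + 1 if len(master) else 1

    # Register any mix key not yet present in the master table.
    new_rows = []
    for key in mixes["mix_key"].unique():
        if key not in known:
            known[key] = next_id
            new_rows.append({"mix_key": key, "mix_id": next_id})
            next_id += 1

    mixes["mix_id"] = mixes["mix_key"].map(known)
    master = pd.concat([master, pd.DataFrame(new_rows)], ignore_index=True)
    return mixes, master
```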

Confidential, Sterling, VA

Sr. Financial MSBI Consultant/ETL - Cloud and Big Data

Responsibilities:

  • Architected, developed, and deployed an SSRS reporting framework against the SAP ECC source system of Confidential; involved in modeling and prototyping a total SSAS cube solution (analytics engine); Power BI dashboards;
  • Delivered 37 financial reports as part of a major data migration effort to Microsoft Dynamics NAV, covering GL accounting, financial facilities, long-term loans, and other financial products; facilitated front- and back-office report integration into one cohesive solution using Microsoft Power BI components; involved with dashboard prototyping in Power BI using Navision as the source system of record; designed custom components for Power BI reports for Dynamics NAV and Navision master data; configured a Type II dimension for use in the SSAS cube;
  • Owned the BRD (business requirements document) along with a BDG (business data glossary) reflecting business and technical definitions of all required elements from the old and new systems as part of the data migration effort, plus the requirements for the entire ETL process; used SQL; coded Type II history for the Product dimension (a minimal sketch follows this list);
  • Used a cloud-based Hadoop solution for batch processing and developed machine-learning solutions by integrating code from several systems; backend data tables were modeled and kept in AWS Redshift, with Python;
  • Prototyped an incremental data warehouse ETL architecture for backend nightly data pulls from the source system and integrated a Power BI executive dashboard solution; used SQL with Python UDFs to enable the load into the data-warehouse backend that ran this dashboard; also used SQL for Power BI report development;
  • Developed an Azure Machine Learning algorithm for a ‘Most Likely Financial Product Cross-Sell’ prediction model and paired it with an Azure Data Factory movement package for uploading data to the cloud and exposing reports;
  • Used Informatica PowerCenter 9.5 to create data-movement packages and then schedule the ETL.
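
A minimal pandas sketch of the Type II (history-preserving) Product dimension load mentioned above; the tracked attribute, column names, and high-date convention are assumptions for illustration.

```python
# Hypothetical Type II slowly-changing-dimension step: when a tracked
# attribute changes, close the current row and insert a new version.
from datetime import date
import pandas as pd

HIGH_DATE = date(9999, 12, 31)  # open-ended rows carry this end date

def scd2_apply(dim: pd.DataFrame, staged: pd.DataFrame, today: date) -> pd.DataFrame:
    current = dim[dim["end_date"] == HIGH_DATE]
    merged = staged.merge(current, on="product_id", how="left",
                          suffixes=("", "_cur"))
    # Rows whose tracked attribute changed, plus brand-new products.
    changed = merged[merged["category"] != merged["category_cur"]]

    # Expire the superseded current rows.
    expire = dim["product_id"].isin(changed["product_id"]) & (dim["end_date"] == HIGH_DATE)
    dim.loc[expire, "end_date"] = today

    # Insert the new versions, open-ended until the next change.
    inserts = changed[["product_id", "category"]].assign(
        start_date=today, end_date=HIGH_DATE)
    return pd.concat([dim, inserts], ignore_index=True)
```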

Confidential, Washington, DC

Sr. Budget Solutions Consultant/ETL

Responsibilities:

  • Single-handedly produced a full-functionality POC (proof-of-concept) solution on a high-profile SAP-to-MSBI conversion project in eight weeks flat. This involved producing a complex financial/budgeting SSAS 2014 prototype cube and a Power BI front-end dashboard to demonstrate effective replacement of a portion of existing SAP ECC reporting for a large government organization (the 8th largest county in the US by revenue). The existing solution, based on SAP BW, had been developed over a long period; the POC covered one module only, FM (Funds Management). The client was thrilled with the proposed POC solution and decided to expand it to all of the other ECC modules (GM, etc.); configured and coded a Type II historical dimension for the main government cost-center dimension;
  • Designed, reviewed, implemented, and optimized data transformation processes in the Hadoop ecosystem, including developing Python scripts to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW (see the sketch after this list); migrated existing SSIS packages to Data Factory in the cloud; designed Power BI reports;
  • Served as chief ETL architect, developer, and technical lead for the SQL side of the ETL processing that populated the data warehouse and the SSAS cube, authoring all of the stored procedures that populated the cube. Served as chief cube designer and worked with the Budget Reporting Group users to enhance and advance the usability of the SSAS cube from POC to full production, which involved collecting requirements directly in real time and devising ways to implement new features; created apps and registered them in Azure Active Directory (AAD); created Power BI dashboards and advanced Power BI automations;
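
A minimal PySpark sketch of the raw-to-partitioned-staging flow described above; the landing path, schema, column names, and target table are illustrative placeholders.

```python
# Hypothetical sketch: parse raw delimited files, derive a partition
# column, and persist the refined data as a partitioned EDW staging table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("staging-load").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .option("sep", "|")
       .csv("/landing/budget/raw/"))   # placeholder landing path

refined = (raw
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .withColumn("load_month",
                       F.date_format(F.col("posting_date"), "yyyy-MM")))

(refined.write
        .mode("overwrite")
        .partitionBy("load_month")               # partition the staging table by month
        .saveAsTable("edw_stage.budget_facts"))  # placeholder schema.table
```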

Confidential, Philadelphia, PA

Sr. MSBI Consultant - Healthcare Costing

Responsibilities:

  • Helping Confidential move its existing physical data warehouses into the cloud (Azure).
  • Authoring the SSAS cube and backend table design for a comprehensive Member Analytical Engine to be used for advanced member/prospect segmentation and a full ad-hoc costing analytics engine, with capabilities including advanced analysis of patient utilization, prospect lift, treatment effectiveness, and provider rankings, with special concentration on costing; coded Type II historical analysis for the main cost structures in the Provider dimension;
