
Senior Azure Data Engineer / ETL Developer Resume


Irvine, CA

SUMMARY

  • Data Engineer with 6+ years of experience in building data applications using Microsoft BI platform and Microsoft Azure
  • 2+ years of experience in Azure Cloud, Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, Azure Analysis Services, Azure Cosmos DB (NoSQL), and Databricks
  • Strong development background in creating pipelines, data flows, and complex data transformations and manipulations using ADF with Databricks
  • Experience in on-premises to cloud implementation projects using ADF and Python scripting for data extraction from relational databases and files
  • Well versed in creating pipelines in Azure ADF v2 using activities such as Move & Transform, Copy, Filter, ForEach, and Databricks
  • Proficient in Microsoft BI platform technologies including SQL Server, SSIS, SSRS, Azure Data Factory, and Power BI
  • Experience in designing and developing Azure Stream Analytics jobs to process real-time data using Azure Event Hubs and Service Bus queues
  • Extensive experience in reporting objects like Hierarchies, Filters, Calculated fields, Sets, Groups, and Parameters in Tableau
  • Experience in creating different visualizations using Bars, Lines, Pies, Maps, Scatter plots, Gantts, Bubbles, Histograms, Bullets, Heat maps and Highlight tables
  • Highly proficient in the use of T-SQL for developing complex stored procedures, triggers, tables, user-defined functions, user profiles, relational database models, data integrity constraints, SQL joins, and queries
  • Proficient in Logical and Physical database design & development (using Erwin, normalization, dimension modeling, Data Modeling, and SQL Server Enterprise manager)
  • Excellent knowledge in designing and developing Data Warehouses, Data marts and Business Intelligence using multi-dimensional models for developing SSAS Cubes using MDX
  • Experience writing Shell and Python scripts to automate deployments

TECHNICAL SKILLS

Database: SQL Server, Oracle, MySQL, Cosmos, Azure SQL Data Warehouse

Reporting: Power BI, SSRS, Tableau

ETL: SSIS, Azure Data Factory

Analytics: SSAS, Azure Data Lake, Databricks

Programming: SQL, Python, C#

PROFESSIONAL EXPERIENCE

Confidential, Irvine, CA

Senior Azure Data Engineer/ ETL Developer

Responsibilities:

  • Worked on Azure Data Factory pipelines covering different load patterns, i.e. truncate-and-load, incremental load, and insert/update (upsert) load, and automated them per business requirements
  • Worked on handling/automating data pipeline failures by using Logic Apps and configuring SQL logic
  • Optimized all tables and created data models based on business logic
  • Automated data validation for all tables by creating dynamic stored procedures and views
  • Leveraged Azure Cloud resources - Azure Data Lake Storage Gen2, Azure Data Factory, StreamSets Data Collector (SDC), and Azure SQL Data Warehouse - to build and operate a centralized cross-functional data analytics platform
  • Responsible for managing data coming from different sources and loading structured and unstructured data
  • Worked with different file formats such as CSV, JSON, flat, and fixed-width files to load data from source to raw tables
  • Used Python for performing Data cleaning and preparation on structured and unstructured datasets
  • Created ETL pipelines in Python and PySpark to load data into Hive tables under Databricks (a minimal sketch follows this list)
  • Created automation and deployment templates for relational and NoSQL databases, including MS SQL and Cosmos DB in Azure, using Python
  • Developed and maintained application code in Databricks using appropriate data structures and algorithms for optimal performance and storage, increasing the speed and consistency of the application
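The PySpark sketch referenced in the list above illustrates the watermark-based incremental-load pattern described in this role; the table names, columns, and mount paths below are hypothetical placeholders, not the client's actual objects.

    # Minimal PySpark sketch of a watermark-based incremental load into a Hive
    # table under Databricks; names and paths are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # Highest watermark already loaded into the target Hive table
    last_loaded = spark.sql(
        "SELECT MAX(modified_date) AS wm FROM curated.orders"
    ).collect()[0]["wm"]

    # Read the raw landing files (CSV here) and keep only new or changed rows
    raw = (
        spark.read.option("header", True).csv("/mnt/raw/orders/")
        .withColumn("modified_date", F.to_timestamp("modified_date"))
    )
    incremental = raw if last_loaded is None else raw.filter(
        F.col("modified_date") > F.lit(last_loaded)
    )

    # Light cleaning/preparation before loading
    prepared = incremental.dropDuplicates(["order_id"]).na.fill({"status": "UNKNOWN"})

    # Append the new slice to the curated Hive table
    prepared.write.mode("append").saveAsTable("curated.orders")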

Confidential, Pittsburgh, PA

Azure Data Engineer

Responsibilities:

  • Utilize Azure's ETL service, Azure Data Factory (ADF), to ingest data from disparate legacy data stores - SAP HANA, SFTP servers, and Cloudera Hadoop HDFS - into Azure Data Lake Storage (Gen2)
  • Build complex ETL jobs that transform data visually with data flows or by using compute services such as Azure Databricks and Azure SQL Database
  • Develop and maintain various data ingestion pipelines as per the design architecture and processes: source to landing, landing to curated, and curated to processed
  • Utilize the Databricks Delta Lake storage layer to create versioned Apache Parquet (delta) files with a transaction log and audit history (a minimal sketch follows this list)
  • Build the Delta Lake for the curated layer, maintaining high-quality data for downstream teams: data scientists, finance, etc.
  • Work with various file formats: flat-file TXT & CSV; parquet & other compressed formats
  • Use various types of activities: data movement activities, transformations, and control activities; Copy data, Data flow, Get Metadata, Lookup, Stored procedure, Execute Pipeline
  • Work in liaison with the business teams to gather their requirements
  • Responsible for developing ETL pipelines to meet business use cases by using data flows, Azure Data Factory (ADF), Data Lake, and Azure Data Warehouse
  • Applied complex data transformations and manipulations to business use cases/requirements with data flows and Databricks
  • Working with different data storage options including Azure Blob, ADLS Gen-1, Gen-2.
  • Developed JSON scripts for deploying pipelines in Azure Data Factory (ADF) that process data using the Cosmos activity
  • Worked on complex T-SQL queries and SSIS jobs to process high-volume data
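The sketch referenced in the Delta Lake bullet above shows the versioned write/read pattern with time travel; the mount paths are hypothetical placeholders.

    # Minimal sketch: write a curated Delta table and read an earlier version
    # via time travel; every write is recorded in the _delta_log transaction
    # log, which provides the audit history. Paths are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    landing = spark.read.parquet("/mnt/landing/sales/")

    # Write (or overwrite) the curated layer as a Delta table
    (landing.write
        .format("delta")
        .mode("overwrite")
        .save("/mnt/curated/sales"))

    # Current state of the curated table
    current = spark.read.format("delta").load("/mnt/curated/sales")

    # Time travel: read the table as of an earlier version for auditing
    previous = (spark.read.format("delta")
                .option("versionAsOf", 0)
                .load("/mnt/curated/sales"))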

Confidential

Data Engineer

Responsibilities:

  • Involved in designing the Data Warehouse and creating fact and dimension tables with Star Schema and Snowflake Schema
  • Implemented normalization rules in database development and maintained referential integrity using primary and foreign keys
  • Created SSIS packages to load data into the Data Warehouse using various SSIS tasks such as Execute SQL Task, Bulk Insert Task, Data Flow Task, File System Task, Send Mail Task, ActiveX Script Task, and XML Task, along with various transformations
  • Extracted data from heterogeneous sources and transformed and processed the data using SSIS
  • Scheduled SSIS packages to run monthly/weekly/daily feeds from and to various vendors using SQL Server Agent
  • Involved in error handling of SSIS packages by evaluating error logs.
  • Successfully extracted, transformed and loaded data into data warehouse.
  • Created reports using SSRS from OLTP and OLAP data sources and deployed on report server.
  • Created tabular, matrix, chart, drill-down, parameterized, and cascaded reports, dashboards, and scorecard reports (SSRS) according to the business requirements
  • Created reports by dragging data from cubes and wrote MDX scripts.
  • Used SSIS to create ETL packages (.dtsx files) to validate, extract, transform and load data to data warehouse, data mart databases, and process SSAS cubes to store data into OLAP databases
  • Improved the performance of long-running views and stored procedures
  • Used SQL Server Integration Services (SSIS) to populate data from various data sources; developed web-based front-end screens using MS FrontPage, HTML, and JavaScript; and actively tuned the database design to speed up certain daily jobs and stored procedures
  • Set up connection strings and connected Azure SQL databases from locally installed SQL Server Management Studio (SSMS) for developers (a minimal connection sketch follows this list)
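The connection sketch referenced in the list above shows the same style of Azure SQL connection string used from Python via pyodbc; the server name, database, and credentials are hypothetical placeholders.

    # Minimal sketch: connect to an Azure SQL database with pyodbc using a
    # standard Azure SQL connection string. All identifiers and credentials
    # below are hypothetical placeholders.
    import pyodbc

    conn_str = (
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=tcp:myserver.database.windows.net,1433;"
        "Database=mydatabase;"
        "Uid=myuser;"
        "Pwd=mypassword;"
        "Encrypt=yes;"
        "TrustServerCertificate=no;"
        "Connection Timeout=30;"
    )

    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT TOP 5 name FROM sys.tables")
        for row in cursor.fetchall():
            print(row.name)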

Confidential

Database/ ETL Developer

Responsibilities:

  • Designed SSIS packages to efficiently handle heterogeneous and unusual data from various external sources
  • Designed and developed a metadata database to store metadata such as execution logs and exceptions
  • Created a Data Dictionary to derive definitions and understand data elements within the warehouse
  • Automated Exception reports generation for Customer and Products data and sending them out to source systems
  • Used various data flow components for performing Data profiling, cleansing, and auditing
  • Used Checkpoints to save the state of the package at failure and rerun the package from the point of failure
  • Implemented handling of Type 2 dimensions and inferred members using the Slowly Changing Dimension transformation (a minimal Type 2 sketch follows this list)
  • Secured and Configured SSIS packages for deployment to production using Package Configurations and Deployment Wizard
  • Created Event Handlers for runtime events and created Custom Log Provider in SQL Server to log those events for auditing
  • Designed and maintained cubes in MS Analysis Services in Star and Snowflake schemas
  • Created reports on top of SSAS cubes using SQL Server Reporting Services
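The sketch referenced in the Type 2 bullet above illustrates the expire-and-insert pattern behind a Type 2 slowly changing dimension. The original work used the SSIS Slowly Changing Dimension transformation; the pandas code below is only an illustration of the same logic, and every table and column name is a hypothetical placeholder.

    # Illustrative sketch of Type 2 SCD logic: expire the current row for a
    # changed business key, then insert a new current row. Names are
    # hypothetical; the actual work used the SSIS SCD transformation.
    from datetime import date
    import pandas as pd

    dim = pd.DataFrame({
        "customer_id": [1], "city": ["Irvine"],
        "valid_from": [date(2015, 1, 1)], "valid_to": [None], "is_current": [True],
    })
    incoming = pd.DataFrame({"customer_id": [1], "city": ["Tustin"]})

    today = date.today()
    merged = incoming.merge(dim[dim["is_current"]], on="customer_id",
                            how="left", suffixes=("", "_dim"))
    changed = merged[merged["city"] != merged["city_dim"]]

    # Expire the current row for each changed business key...
    dim.loc[dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"],
            ["valid_to", "is_current"]] = [today, False]

    # ...and insert a new current row carrying the changed attribute values
    new_rows = changed[["customer_id", "city"]].assign(
        valid_from=today, valid_to=None, is_current=True)
    dim = pd.concat([dim, new_rows], ignore_index=True)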
Technologies Used: Azure Data Factory, Azure Data Flows, Azure Blob Storage Gen2, Data Warehouse (dedicated SQL pool), Logic Apps, Databricks
