Azure Data Engineer Resume

Chicago

SUMMARY

  • 12+ years of experience in the IT industry, with strong perseverance and diligence in pursuing challenging goals.
  • Extensive experience working closely with clients, project teams, and senior management; efficient in managing the onsite-offshore model, with solid knowledge of data warehousing applications.
  • Onshore experience in Paris, France as an Implementation Specialist for a major manufacturing company (Confidential), performing rollouts of core business applications.
  • Design and implement end-to-end data solutions (storage, integration, processing, visualization) in Azure.
  • Design and implement medium- to large-scale BI solutions on Azure using Azure Data Platform services (Azure Data Lake, Data Factory, Data Lake Analytics, Stream Analytics, Azure SQL DW, Databricks).
  • Experience in developing notebooks and reusing data engineering routines written in Python.
  • Experience in creating pipelines in ADF using Linked Services/Datasets/Pipelines to extract and load data from various source systems (flat files, Avro, JSON, Parquet, and CSV files, plus relational tables from heterogeneous databases such as SQL Server and DB2) from on-premises systems into Azure PaaS services such as Azure SQL, ADLS, Blob Storage, and Azure SQL Data Warehouse, handling both full and incremental data loads (a PySpark sketch of the watermark-based incremental pattern follows this list).
  • Designing and implementing Microsoft Business Intelligence solutions using the BI Semantic Model and the SQL Server BI suite (ETL, reporting, analytics, dashboards) with SSIS, SSAS, SSRS, and Power BI; experienced with SQL tools such as TOAD and SQL Server to run SQL and T-SQL queries.
  • Worked in different client-specific environments in the health and insurance domains, supporting both data warehousing and data integration systems.
  • Core competency in designing conceptual, logical, and physical data models while interacting with business users to gather requirements.
  • Experience in real-time data ingestion using Azure Event Hub.
  • Worked with different database systems such as Oracle, SQL Server, DB2, SharePoint, and Azure SQL Database.
  • Develop dashboards and visualizations to help business users analyze data and to provide data insights to upper management, with a focus on Microsoft products such as SQL Server Reporting Services (SSRS) and Power BI.
  • Successfully completed several SSRS reporting and Power BI projects.
  • Knowledgeable of DevOps fundamentals (development, testing, integration, deployment, and monitoring of the software throughout the lifecycle).
  • Instrumental in gathering user requirements and converting them into technical solutions that meet business expectations.
  • Built complex distributed systems involving large-scale data handling, metrics collection, data pipeline construction, and analytics.
  • Worked extensively with Agile, Scrum, and Software Development Life Cycle (SDLC) methodologies and related environments.
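
The incremental loads mentioned above are driven by a stored watermark. In ADF the lookup and copy steps are configured as pipeline activities rather than code; the sketch below shows the equivalent watermark pattern in PySpark. It is a minimal illustration only, and every path, table, and column name in it is a hypothetical placeholder.

    # Minimal PySpark sketch of a watermark-based incremental load.
    # All paths, table names, and columns are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("incremental-load").getOrCreate()

    # 1. Look up the watermark persisted by the previous successful run.
    last_wm = (spark.read.format("delta").load("/mnt/control/watermarks")
               .filter(F.col("table_name") == "dbo.Claims")
               .agg(F.max("watermark_value"))
               .first()[0])

    # 2. Extract only the source rows modified since that watermark.
    changed = (spark.read.parquet("/mnt/raw/claims/")
               .filter(F.col("ModifiedDate") > F.lit(last_wm)))

    # 3. Append the delta to the curated zone; a one-time full load would
    #    instead use mode("overwrite") with no watermark filter.
    changed.write.format("delta").mode("append").save("/mnt/curated/claims/")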

TECHNICAL SKILLS

Azure Cloud: Azure Data Factory, Azure Data Lake, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, Active Directory, Azure Monitoring, Azure Search, Azure Event Hub, Azure Service Bus, Key Vault, Azure Analysis Services, Spark, Azure Stream Analytics, Azure Storage.

ETL Tools: Azure Data Factory (ADF), SSIS

Databases: Cosmos DB, Azure SQL Database, Azure SQL Data Warehouse, Netezza, DB2 and Lotus Notes

BI Tools: BIDS, Report Builder, Power BI

Scripting Languages: Python, Scala, PowerShell, VB Scripting, HTML, C#, SQL, PySpark, T-SQL, Lotus Script

ITSM Tools: ServiceNow and BMC Remedy

Tools: Git, TFS, Azure DevOps

PROFESSIONAL EXPERIENCE

Confidential, Chicago

Azure Data Engineer

Responsibilities:

  • Analyzed the tables to be directly extracted and loaded into staging tables, and designed the staging and target databases in a hybrid environment.
  • Designed and implemented the data migration framework with appropriate data load processes and sequencing (one-time and incremental loads) using Azure Data Factory and Azure Blob Storage, orchestrating data from on-premises systems by creating Azure Data Factory pipelines and data flows.
  • Worked with Azure Data Factory data transformations such as mapping data flows and Databricks (PySpark + SQL).
  • Worked with Azure Data Factory control flow activities such as ForEach, Lookup, Until, Web, Wait, and If Condition.
  • Worked closely with stakeholders to recommend and design Azure SQL Database and Azure Data Lake storage solutions.
  • Good experience working with Azure Blob and Azure Data Lake storage and loading data into Azure Synapse Analytics (SQL DW).
  • Wrote SQL stored procedures to transform staged data into the target data model and load it into target tables.
  • Called Notebook activities from Azure Data Factory.
  • Experienced in loading and transforming data using the Apache Spark Python API (PySpark) in Azure Databricks; used PySpark to build ETL pipelines for large datasets.
  • Configured secret scopes in Databricks clusters using Azure Key Vault (see the notebook sketch after this list).
  • Understood the business processes and application functionality relevant to the area, as well as related applications in adjacent areas.
  • Performed code reviews for application developers and ensured that all programming standards and policies were adhered to.
  • Served as Application Development Lead across the entire life cycle: requirements analysis, design, coding, testing, implementation, and performance tuning.
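
As context for the Databricks bullets above, here is a minimal notebook sketch that combines the Key Vault-backed secret scope with a simple PySpark load/transform. The scope, key, storage account, container, and column names are all hypothetical placeholders; dbutils and spark are provided implicitly by the Databricks notebook runtime.

    # Minimal Databricks notebook sketch: pull a storage credential from a
    # Key Vault-backed secret scope, then load, transform, and write data.
    # Scope, key, account, container, and column names are hypothetical.
    from pyspark.sql import functions as F

    # `dbutils` and `spark` exist implicitly inside a Databricks notebook.
    storage_key = dbutils.secrets.get(scope="kv-backed-scope", key="adls-account-key")
    spark.conf.set("fs.azure.account.key.mystorageacct.dfs.core.windows.net",
                   storage_key)

    # Load staged Parquet files, de-duplicate, stamp the load date, and
    # write the result to the curated zone as Delta.
    staged = spark.read.parquet("abfss://staging@mystorageacct.dfs.core.windows.net/claims/")
    curated = (staged
               .dropDuplicates(["claim_id"])
               .withColumn("load_date", F.current_date()))
    (curated.write.format("delta")
     .mode("overwrite")
     .save("abfss://curated@mystorageacct.dfs.core.windows.net/claims/"))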

Environment: Azure Data Factory, Azure Blob Storage, Azure SQL DB, Azure Synapse, Azure Databricks, ADLS Gen2, PySpark, and Power BI.

Confidential, West Des Moines

Azure Data Engineer

Responsibilities:

  • Responsible for the planning, development, delivery, and operations phases of environments and applications in the Azure cloud using Azure DevOps tools.
  • Responsible for ingesting data from various source systems (RDBMS, flat files, big data) on on-premises systems into Azure Blob Storage using a framework model.
  • Scheduled jobs to populate claims data into the data warehouse (star, snowflake, and hybrid schemas).
  • Performed data quality analyses and applied business rules in all layers of the data extraction, transformation, and loading process. Analyzed, designed, and built modern data solutions using Azure PaaS services to support data visualization. Understood the current production state of the application and determined the impact of new implementations on existing business processes.
  • Created pipelines, data flows, and complex data transformations and manipulations using Azure Data Factory (ADF) with Databricks.
  • Experienced in creating tables dynamically and adding columns to Delta tables using PySpark (a schema evolution sketch follows this list).
  • Deployed pipelines and SQL objects from Dev to UAT and from UAT to Prod environments using DevOps.
  • Held working sessions with SMEs to develop small POCs using Power BI.
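
For the dynamic Delta tables point above, the usual mechanism for adding columns on write is Delta's mergeSchema option, sketched below. This is a minimal illustration; the database, table, and column names are hypothetical.

    # Minimal sketch of creating a Delta table dynamically and letting a
    # later load add new columns via mergeSchema. Names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    spark.sql("CREATE DATABASE IF NOT EXISTS curated")

    # First load: the table is created from whatever schema the frame carries.
    first = spark.createDataFrame([(1, "open")], ["claim_id", "status"])
    first.write.format("delta").mode("overwrite").saveAsTable("curated.claims")

    # A later load arrives with an extra column; mergeSchema appends it to
    # the table schema instead of failing, and existing rows read NULL for it.
    later = spark.createDataFrame([(2, "closed", "IA")],
                                  ["claim_id", "status", "state"])
    (later.write.format("delta")
     .mode("append")
     .option("mergeSchema", "true")
     .saveAsTable("curated.claims"))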

Environment: Azure Data Factory, Azure Data Lake, Azure SQL Database, Azure Synapse Analytics, Active Directory, Application Insights, Azure Monitoring, Azure Search, Key Vault, and Power BI.

Confidential, Boston

Data Analyst

Responsibilities:

  • Developed the database design for agency data, user information, eligibility information, application process data, document/notice information, and payment information.
  • Implemented SSIS packages to extract data from the legacy (on-premises) system into the new systems.
  • Involved in system analysis and design of the enterprise data warehouse implementation, requirements gathering, and understanding the business flows.
  • Designed the functional requirements and the mapping/technical specifications based on those functional requirements.
  • Migrated the data from legacy systems to SQL Server 2017.
  • Worked with Product Owner to understand functional requirements and interact with other cross-functional teams to design, develop, test, and release features.

Environment: SSIS, SQL Server 2017

Confidential, Atlanta, GA

Senior Software Engineer

Responsibilities:

  • Experienced in migrating on-premises ETL (SSIS) processes to Azure.
  • Designed ETL packages with on-premises Netezza/SharePoint databases and flat files as data sources, and loaded the data into target data stores after applying various transformations using Azure Data Factory.
  • Implemented pipelines in ADF using Linked Services/Datasets/Pipelines to extract and load data across sources such as Azure SQL, Azure Synapse, ADLS, Blob Storage, and Azure SQL Data Warehouse.
  • Designed dynamic SSIS packages by passing project/package parameters to transfer data across different platforms, validate data during transfer, and archive data files for different DBMSs.
  • Streamed sensor data to the data lake using Azure Stream Analytics (a small ingestion sketch follows this list).
  • Integrated Power BI with Power Apps.
  • Migrated QlikView reports to Power BI.
  • Created U-SQL scripts for transformation activities and developed complex queries to transform data from multiple sources and output it into the Azure data warehouse.
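
Stream Analytics jobs themselves are authored as SQL-like queries in the portal rather than as application code, so as an illustration of the ingestion side only, the sketch below reads a sensor stream from an Azure Event Hub (the usual Stream Analytics input) using the azure-eventhub Python SDK. The connection string and event hub name are hypothetical placeholders.

    # Minimal sensor-stream consumer using the azure-eventhub SDK (v5).
    # The connection string and event hub name are placeholders; in the
    # actual pipeline this stream fed an Azure Stream Analytics job.
    from azure.eventhub import EventHubConsumerClient

    client = EventHubConsumerClient.from_connection_string(
        conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;"
                 "SharedAccessKeyName=...;SharedAccessKey=...",
        consumer_group="$Default",
        eventhub_name="sensor-telemetry",
    )

    def on_event(partition_context, event):
        # Each event body carries one JSON sensor reading bound for the lake.
        print(partition_context.partition_id, event.body_as_str())

    with client:
        # "-1" starts from the beginning of each partition; receive() blocks
        # until interrupted.
        client.receive(on_event=on_event, starting_position="-1")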

Confidential

Software Developer

Responsibilities:

  • Involved from requirements gathering onward: providing estimates, preparing the SRS, walking through the SRS with the customer and obtaining approval, preparing the design document, coding, unit testing, supporting system testing and UAT, sending the UAT-approved template to the deployment team, and providing post-deployment support.
  • Duties involved designing forms, views, agents, actions, outlines, pages, and framesets, and using JavaScript, HTML, Ajax, XML, LotusScript, and Formula Language for automation in Lotus Notes client and web-based applications. Lotus Notes to Microsoft 365 global rollouts (Power Platform setup):
  • Provided cloud migration solutions for migrating Lotus Notes applications.
  • Designed and developed canvas Power Apps with complexities such as multiple approval levels and screens, and integrations with SharePoint, Power Automate, Excel Online, etc.
  • As part of the Wipro COE, implemented a medium-complexity canvas app integrated with Dataverse, SharePoint, and Power Automate.
  • Worked on simple and medium-complexity canvas apps and model-driven apps with different data sources, reusable components, default connectors, solutions, Power BI models, etc.
  • Created SharePoint list form customizations using Power Apps with multiple approval levels and Export to Excel and Print to PDF functionality.
  • Migrated Lotus Notes documents to SharePoint Online lists/libraries and Microsoft Dataverse.
  • Adopted Microsoft 365 out-of-the-box features for the solution implementation.
  • Set up the Power Platform: redeveloped Lotus Notes forms as canvas and model-driven Power Apps based on application complexity.
    i. High-complexity app development: raw material request and DDM code request forms with multiple levels of approval, integrated with Power Automate, with a high number of controls and calls to multiple forms.
    ii. Medium-complexity app development: requisition forms with lookup lists/cascading controls, integrated with Power Automate.
    iii. Simple app development: standard canvas apps that submit data to a SharePoint list.
  • Performed unit testing on the migrated data and executed PowerShell scripts for any data transformations.
