Azure Data Engineer Resume

SUMMARY:

  • 11 years of IT experience in the design and development of data warehousing projects, with involvement in all phases of the Software Development Life Cycle as a Big Data Engineer in the Finance, Sales, and Marketing industries.
  • Proficient in building pipelines and transforming data using the Azure Data Factory (ADF V1/V2) GUI, and in monitoring data sets produced and consumed by ADF.
  • Proficient in creating linked services on the source as well as destination servers in Azure.
  • Proficient in creating stored procedures and automating them in the Azure environment.
  • Proficient in creating Data Factories and dependencies of activities in Azure Data Factory.
  • Proficient in U-SQL, automating SCOPE scripts to generate daily/weekly/monthly/one-time streams using XFLOW jobs, and SQLizing Cosmos stream data into Azure SQL Server in Cosmos Big Data.
  • Proficient in creating pipelines using DBAmp and PySpark for importing data from SFDC and MS Dynamics into MS SQL Server.
  • Expertise in Microsoft Business Intelligence technologies such as SSIS, SSRS, SSAS, SCOPE Studio, SQL Server Management Studio 2005/2008/R2/2012/2014, and SQL Server Data Tools (2012), with strong knowledge of SQL and T-SQL concepts.
  • Expertise in handling various database activities such as data modeling and database design.
  • Technical expertise in OLTP/OLAP/ODS system study, analysis, and E-R modeling; developing OLTP database schemas using 3NF and OLAP database schemas using Star Schema and Snowflake Schema (dimensional modeling).
  • Expertise in the design and development of an Operational Data Store (ODS) with implementation of the Change Data Capture (CDC) concept (a minimal sketch follows this list).
  • SSAS cube development in both SQL 2008R2 and 2014 for financial planning and operational performance metrics.
  • Implementation of performance dashboards using Tableau 10.3 to analyze and measure the performance of applications supporting large financial functions.
  • Used SQL Profiler to monitor server performance and debug T-SQL and slow-running queries.
  • Expertise in creating flow charts for conceptual, logical, and physical models.
  • Expertise in turning business data requirements into visual data models using Erwin.
  • Possess excellent Communication skills and Leadership qualities working in a team.
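
The ODS/CDC bullet above is illustrated by the minimal T-SQL sketch below. It assumes SQL Server's built-in Change Data Capture feature, a source database named SourceDB, and a dbo.Orders table with a primary key; all of these names are placeholders.

    -- Enable CDC at the database level, then on the source table (placeholder names).
    USE SourceDB;
    GO
    EXEC sys.sp_cdc_enable_db;
    GO
    EXEC sys.sp_cdc_enable_table
        @source_schema        = N'dbo',
        @source_name          = N'Orders',
        @role_name            = NULL,   -- no gating role in this sketch
        @supports_net_changes = 1;      -- requires a primary key or unique index
    GO
    -- An ODS load can then pull only the net changes since the previous run.
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT *
    FROM cdc.fn_cdc_get_net_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');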

TECHNICAL SKILLS:

Database: SQL Server 2005/ 2008/ R2/ 2012/2014, Oracle 10g/11

Big Data: Cosmos, Hadoop, Azure Data Lake

Reporting: SSRS, Tableau, Crystal Reports, Power BI

ETL Tools: SSIS, DTS, ADF

Analysis Tools: SSAS, Alteryx

Data Modelling Tools: Erwin, Excel

Languages: MS-SQL, T-SQL, HQL, U-SQL, C#, PySpark

Tools/Methodologies: Erwin, TOAD, AQT, TFS, GitHub, Hive, Jenkins

WORK EXPERIENCE:

Confidential

Azure Data Engineer

Responsibilities:

  • Designing and configuring Azure cloud relational servers and databases based on analysis of current and future business requirements.
  • Designing and developing CI/CD processes using Azure DevOps.
  • Designing and creating Data Marts, Databases, Indexes, Views, Aggregations, Stored Procedures, Partitions and Data Integrity.
  • Migrating data from on-premises SQL Server to the cloud databases Azure Synapse Analytics (DW) and Azure SQL DB.
  • Performing research to identify source and nature of data required for ETL solutions.
  • Developing tabular models on Azure Analysis Services.
  • Developing Azure Data Factory pipelines to extract and manipulate data from Azure Blob storage/Azure Data Lake/ SQL Server on cloud.
  • Extracting data from different ERP source systems, OLTP servers, and cloud storage using SQL Server Integration Services and Azure Data Factory.
  • Transforming extracted data into multi-dimensional cubes and Azure Synapse to build data marts for reporting in Power BI.
  • Developing custom stored procedures for delta loads, functions, and triggers using SQL and T-SQL on cloud SQL Server/Azure Synapse (an illustrative sketch follows this list).
  • Performing research to identify source and nature of data required for ETL solutions using Azure Databricks.
  • Migrating the entire CRM database from IBM DB2 and Informix servers to the Snowflake cloud data warehouse using DataStage ETL.
  • Performing Data Validations of the DW using Power BI.
  • Identifying data quality issues and designing processes and procedures to improve/maintain quality.
  • Performance tuning of SQL queries, Data Pipelines, Tableau and Power BI Dashboards.
  • Maintaining version control of code using Azure Devops and GIT repository.
  • Performing code releases from one environment to another using release management in Azure DevOps.
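
For the delta-load bullet above, the sketch below shows one common shape such a procedure can take; all object and column names are placeholders, and the syntax targets Azure SQL Database (on a Synapse dedicated SQL pool the same logic is often written as separate UPDATE and INSERT statements).

    -- Hypothetical delta-load procedure: upserts changed rows from a staging
    -- table into a target table based on a ModifiedDate watermark.
    CREATE PROCEDURE dbo.usp_Load_DimCustomer_Delta
        @LastWatermark datetime2
    AS
    BEGIN
        SET NOCOUNT ON;

        MERGE dbo.DimCustomer AS tgt
        USING (
            SELECT CustomerID, CustomerName, City, ModifiedDate
            FROM stg.Customer
            WHERE ModifiedDate > @LastWatermark   -- only rows changed since the last run
        ) AS src
            ON tgt.CustomerID = src.CustomerID
        WHEN MATCHED AND src.ModifiedDate > tgt.ModifiedDate THEN
            UPDATE SET tgt.CustomerName = src.CustomerName,
                       tgt.City         = src.City,
                       tgt.ModifiedDate = src.ModifiedDate
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (CustomerID, CustomerName, City, ModifiedDate)
            VALUES (src.CustomerID, src.CustomerName, src.City, src.ModifiedDate);
    END;

In ADF, a procedure like this is typically invoked from a Stored Procedure activity once the copy into the staging table has completed.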

Environment: Azure Data Factory, Azure Data Lake, Azure Synapse Analytics (DW), Azure DevOps, Snowflake, Power BI, SharePoint, Windows 10

Confidential

Azure Data Engineer

Responsibilities:

  • Building ADF pipelines to extract and manipulate data from Azure Blob storage/Azure Data Lake/CosmosDB/SQL Server on cloud.
  • Building pipelines using Azure Logic Apps to extract and manipulate data from SharePoint.
  • Developing SCOPE scripts using U-SQL to generate daily and monthly structured streams/ORC files in Cosmos.
  • Enhancing the multi-dimensional Revenue data cube from ORC files generated in Cosmos using TITAN for business solutions.
  • Maintaining and troubleshooting existing Power BI dashboards in case of data refresh failures which are owned by our team.
  • Responsible for Windows VM patching on Azure.
  • Performing research to identify source and nature of data required for ETL solutions using Azure Databricks.
  • Extensively used Azure Databricks for data validations and analysis on Cosmos structured streams.
  • Developing job monitoring alerts for job failures and latency using scope script.
  • Extensively used Dragonfly Job Scheduler for scheduling, monitoring the Scope jobs on daily/monthly basis.
  • Experience working with Talend ETL and Snowflake cloud database systems.
  • Performing data validation between the HDInsight cluster & target SQL Server using Hive SQL (a target-side validation sketch follows this list).
  • Highly knowledgeable in concepts related to Data Warehouses schemas, Star Schema modeling and Snowflake modeling and processes.
  • Developing custom stored procedures for delta loads, functions, triggers using SQL, T-SQL on cloud SQL server.
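
For the HDInsight-to-SQL validation bullet above, a minimal target-side check might look like the sketch below; the fact table and columns are placeholders, and the matching aggregation would be run as Hive SQL on the HDInsight side so the two result sets can be compared.

    -- Hypothetical target-side validation: row counts and a control total per load date.
    SELECT CAST(LoadDate AS date)                 AS LoadDate,
           COUNT_BIG(*)                           AS RowCnt,
           SUM(CAST(Revenue AS decimal(18, 2)))   AS TotalRevenue
    FROM dbo.FactRevenue
    WHERE LoadDate >= DATEADD(DAY, -1, CAST(GETDATE() AS date))
    GROUP BY CAST(LoadDate AS date);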

Environment: Cosmos, Scope Studio, U-SQL, C#, Azure Data Factory, Azure Data Lake, Azure Databricks, GitHub, Snowflake, Iris Studio, Cauce, Kensho, SharePoint, Windows 10

Confidential

Data Engineer

Responsibilities:

  • Responsible for designing and creating a technical specification document by decoding existing EDC (Enterprise Data Cube) system.
  • Performed research to identify source and nature of data required for ETL solutions.
  • Extracted data from the SFDC (Salesforce.com) source system and loaded it into SQL Server 2017 using DBAmp (an illustrative sketch follows this list).
  • Added new dimension attributes & KPIs to existing Multi-Dimensional Enterprise Data Cube as requested by business stakeholders.
  • Extracted code from existing Tableau and Power BI reports that are frequently used across the organization, for the EDC-to-UDA migration.
  • Worked with data warehouse architect to identify data quality issues and design processes and procedures to improve/maintain quality.
  • Extensively used Cisco Tidal Job Scheduler for scheduling, monitoring the SQL jobs on daily/weekly/Monthly basis.
  • Performed Data Validation between Hadoop Source systems & Target SQL Server using Hive SQL.
  • Designed and created Data Marts, Databases, Indexes, Views, Aggregations, Stored Procedures, Partitions and Data Integrity.
  • Developed custom stored procedures for delta loads, functions, triggers using SQL, T-SQL.
  • Performed performance tuning of stored procedures using SQL Server Profiler.
  • Used Query Analyzer, Profiler, Index Wizard, and Performance Monitor for performance tuning on SQL Server.
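
The SFDC extract above was done with DBAmp; as one possible sketch, the query below assumes DBAmp's usual configuration as a linked server named SALESFORCE and uses placeholder object names, so it is illustrative rather than the exact production code.

    -- Hypothetical pull of Salesforce Account data through a DBAmp linked server
    -- into a local staging table (names are placeholders).
    IF OBJECT_ID('stg.SFDC_Account') IS NOT NULL
        DROP TABLE stg.SFDC_Account;

    SELECT Id, Name, Industry, LastModifiedDate
    INTO stg.SFDC_Account
    FROM OPENQUERY(SALESFORCE,
        'SELECT Id, Name, Industry, LastModifiedDate FROM Account');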

Environment: SQL Server 2017, Tableau, Power BI, SFDC, SQL, T-SQL, Hive, Microsoft Business Intelligence Studio, Microsoft Visual Studio Data Tools, Hue, SQL Profiler, Database Engine Tuning Advisor, Jira, GitHub, SharePoint, Windows 10, Tidal

Confidential

Data Engineer

Responsibilities:

  • Responsible for maintaining Quality data in Cosmos streams by performing operations such as Cleaning, Transformation and ensuring Integrity in a relational environment.
  • Responsible for creating data streams in Cosmos and Data Sets in SQL Server to support Data Science research.
  • Responsible for creating Activities in WorkFlowDefinition files to automate Scope script and XML files.
  • Responsible for SQLizing the Cosmos stream data through XML using XFLOW jobs on a daily and monthly basis.
  • Initiated several fine-tuning mechanisms to optimize the database and queries so that a given set of jobs or tasks completes in optimal time (an illustrative sketch follows this list).
  • Designed and prepared interactive, intuitive year-end dashboards and reports using Power BI to show the sales performed by traders during the year.
  • Performed Data Validation of the DW using Power BI.
  • Extensively used GitHub for source code check-ins and version control.
  • Responsible for source code reviews in the Git repository.
  • Designed and created Data Marts, Databases, Indexes, Views, Aggregations, Stored Procedures, Partitions and Data Integrity.
  • Involved in the development of custom stored procedures, functions, triggers using SQL, T-SQL.
  • Installed and administered Microsoft SQL Server 2014, Visual Studio 2015, SQL Server Data Tools 2015.
  • Performed performance tuning of stored procedures using SQL Server Profiler.
  • Used Query Analyzer, Profiler, Index Wizard, and Performance Monitor for performance tuning on SQL Server.
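
One hypothetical example of the fine-tuning mentioned above is adding a covering nonclustered index for a frequently run trader-sales aggregation; the table, columns, and index name below are placeholders.

    -- Covering index so the year-to-date aggregation seeks instead of scanning the fact table.
    CREATE NONCLUSTERED INDEX IX_FactTrades_TraderId_TradeDate
        ON dbo.FactTrades (TraderId, TradeDate)
        INCLUDE (NotionalAmount);

    SELECT TraderId,
           SUM(NotionalAmount) AS YearToDateNotional
    FROM dbo.FactTrades
    WHERE TradeDate >= DATEFROMPARTS(YEAR(GETDATE()), 1, 1)
    GROUP BY TraderId;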

Environment: Cosmos Big Data, Scope Studio, Power BI, SQL Server 2016, T-SQL, Microsoft Business Intelligence Studio, SQL Profiler, Database Engine Tuning Advisor, Microsoft Team Foundation Server, GitHub, SharePoint, Windows 10, XFLOW, Teams

Confidential

Senior BI Developer

Responsibilities:

  • As an MSBI Developer, responsible for coordinating planning, gathering and analyzing business requirements as part of a team, and ensuring work is completed as scheduled.
  • Performed research to identify source and nature of data required for ETL solutions.
  • Developed an enterprise data warehouse using Kimball’s methodology, extracting data from ODBC sources and loading it into MS SQL Server 2014 (MDW).
  • Performed testing of ETL data flows.
  • Performance tuning of SQL queries, SSIS packages, and Tableau dashboards.
  • Worked with data warehouse architect to identify data quality issues and design processes and procedures to improve/maintain quality.
  • Designed logical and physical ER data Models by using Erwin tool.
  • Extracted data from Oracle, Amazon Redshift, S3, Salesforce Cloud and migrated into SQL server using SSIS.
  • Developed query-based lookups to bring the latest data into the fact tables per the effective date (an illustrative sketch follows this list).
  • Tested, cleaned, and standardized data to meet business standards using exact lookups in SSIS tasks.
  • Extensively used the SSIS server & SQL Server Agent for deploying SSIS packages and for scheduling and monitoring jobs on a daily/weekly/monthly basis.
  • Extensively used the SSIS Import/Export Wizard for performing ETL operations.
  • Implemented a package-level rollback/restart feature so that packages that fail due to technical issues on the production server can be restarted.
  • Created OLAP cubes in SQL Server Analysis Services 2014 for company-wide use; the data being compared in the cubes had never before been available together in the same application.
  • Experienced in building cubes with different architectures and data sources for Business Intelligence using SQL Server 2014.
  • Designed dimensional models using SSAS packages for end users and created hierarchies within them.
  • Developed aggregations, partitions, and calculated members for cubes as per business requirements.
  • Defined appropriate measure groups and KPIs, and deployed cubes.
  • Optimized SSAS performance by using different storage modes such as ROLAP, MOLAP, and HOLAP.
  • Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
  • Performed Data Validation of the DW.
  • Created Data Dictionary and data definitions of DW tables using Alteryx Connect.
  • Designed and created Data Marts, Databases, Indexes, Views, Aggregations, Stored Procedures, Partitions and Data Integrity.
  • Involved in the development of custom stored procedures, functions, triggers, SQL, T-SQL.
  • Installed and administered Microsoft SQL Server 2014, Visual Studio 2015, SQL Server Data Tools 2015, Attunity.
  • Performed performance tuning of stored procedures using SQL Server Profiler.
  • Used Query Analyzer, Profiler, Index Wizard, and Performance Monitor for performance tuning on SQL Server.
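
A minimal sketch of the effective-date lookup referenced above, assuming a staging table with one row per key per effective date; all names are placeholders.

    -- Pick the latest version of each key that is effective on or before the load date.
    DECLARE @LoadDate date = CAST(GETDATE() AS date);

    WITH ranked AS (
        SELECT s.CustomerKey,
               s.PriceTier,
               s.EffectiveDate,
               ROW_NUMBER() OVER (PARTITION BY s.CustomerKey
                                  ORDER BY s.EffectiveDate DESC) AS rn
        FROM stg.CustomerPricing AS s
        WHERE s.EffectiveDate <= @LoadDate
    )
    SELECT CustomerKey, PriceTier, EffectiveDate
    FROM ranked
    WHERE rn = 1;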

Environment: SQL Server 2014 Enterprise Edition, SSIS, T-SQL, Microsoft Business Intelligence Studio, SQL Profiler, Database Engine Tuning Advisor, AQT, TOAD, Alteryx, Microsoft Team Foundation Server, Microsoft Visual Studio Data Tools, Attunity, Tableau, Alteryx Connect, Alteryx Designer, Amazon S3, Amazon Redshift, Windows 7/10 Enterprise Edition

Confidential

MSBI Developer

Responsibilities:

  • Actively involved as part of a team for gathering and analyzing the needs of End User Requirement and System Specification.
  • Reused packages and performed timely executions for loading the data into the database as required.
  • Created packages using Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, Script Task, and Script Component using C#/VB.NET.
  • Used temp tables to reduce the number of rows for joins and to aggregate data from different sources for different reports (an illustrative sketch follows this list).
  • Designed logical and physical ER data Models by using Erwin tool.
  • Performed research to identify source and nature of data required for ETL solutions.
  • Performed testing of ETL data flows
  • Worked with data warehouse architect to identify data quality issues and design processes and procedures to improve/maintain quality.
  • Created ETL/SSIS packages to extract data from various sources like SQL Server database, Excel, Flat files to OLAP systems using different SSIS components and used dynamic configuration files and variables for production deployment.
  • Performed data cleansing by creating tables to eliminate the dirty data using MS SQL Server Integration Services 2008 (SSIS).
  • Worked with the software development team to design, develop and deploy reports that improve access to financial and operational data using SSRS.
  • Generated sub-reports, drill-down reports, drill-through reports, and parameterized reports using MS SQL Server Reporting Services 2008 (SSRS).
  • Worked with advanced data regions like Tablix, Matrix, Gauges, Charts, Rectangle, and Textboxes in MS SQL Server Reporting Services (SSRS).
  • Developed Dashboards using Tableau and integrated Scorecards and Reports to the dashboards.
  • Involved in testing, bug fixing and documentation for the project. Used Query Analyzer, Profiler, Index Wizard and Performance Monitor for performance tuning on SQL Server
  • Created database objects such as tables, views, schemas, indexes, and stored procedures.
  • Involved in the development of custom stored procedures, functions, triggers, SQL, T-SQL
  • Optimized query performance by modifying T-SQL queries, establishing joins, and creating indexes.
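
A small sketch of the temp-table technique referenced above, with placeholder names: pre-aggregating a large fact table into a temp table before joining it to a dimension keeps the downstream report joins working on far fewer rows.

    -- Pre-aggregate last month's sales into a temp table, index it, then join.
    SELECT CustomerID,
           SUM(SalesAmount) AS TotalSales
    INTO #CustomerSales
    FROM dbo.FactSales
    WHERE OrderDate >= DATEADD(MONTH, -1, CAST(GETDATE() AS date))
    GROUP BY CustomerID;

    CREATE CLUSTERED INDEX IX_CustomerSales ON #CustomerSales (CustomerID);

    SELECT c.CustomerName,
           cs.TotalSales
    FROM dbo.DimCustomer AS c
    JOIN #CustomerSales  AS cs
        ON cs.CustomerID = c.CustomerID;

    DROP TABLE #CustomerSales;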

Environment: MS BI (SSIS, SSRS, SSAS, SQL Server) 2008R2/2012, T-SQL, SQL Server Management Studio, SQL Profiler, Database Engine Tuning Advisor, Microsoft Team Foundation Server, Microsoft Visual Studio Data Tools, Windows 2000/XP/7 Enterprise

Confidential

MSBI Developer

Responsibilities:

  • Clearly focused on the business requirements and legacy requirements of the project.
  • Involved in the design, development and testing phases.
  • Development of mapping specifications using SSIS.
  • Designed and developed complex canned reports using SQL Reporting Services.
  • Implemented drill-through reports by passing multiple parameters from one report to another.
  • Designed various prompts and cascading options for reports.
  • Involved in deploying the reports via web link and setting security levels for various users to access the reports.
  • Designed and developed filters, look up transformation rules (business rules) to generate data using SSIS.
  • Designed and Developed processes to fetch data from different databases placed across different servers using variables.
  • Designed and developed jobs using SQL Server Agent for Scheduling of packages and Creation of jobs sequences for automation.
  • Imported Data from various sources like DB2, Oracle, Excel, XML, SQL Server, Flat files.
  • Created detailed documentation of all existing and new processes; created and updated data dictionaries.
  • Sent mail notifications after successful completion of the whole process (a minimal sketch follows this list).
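
The bullets above do not say whether the completion mail was sent from an SSIS Send Mail Task or from Database Mail; as one possible sketch, a Database Mail call with placeholder profile and recipient values is shown below.

    -- Hypothetical completion notification via Database Mail (placeholder values).
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'ETL_Mail_Profile',
        @recipients   = N'bi-team@example.com',
        @subject      = N'Nightly load completed',
        @body         = N'All SSIS packages in the nightly load finished successfully.';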

Environment: MS BI (SSIS, SSRS, SSAS, SQL Server) 2008R2, T-SQL, SQL Server Management Studio, SQL Profiler, Database Engine Tuning Advisor, Microsoft Team Foundation Server, Microsoft Visual Studio Data Tools, Windows 2000/2003/XP
