Sr Azure Data Engineer Resume
Irving, TX
SUMMARY
- Around 8 years of IT experience specializing in Database Design, Retrieval, Manipulation, and Support, as well as Requirements Analysis, Application Development, Testing, Implementation, and Deployment using MS SQL Server 2019/2016, Oracle PL/SQL 19c/11g, and MSBI.
- Around 3 years of experience in datacenter migration and Azure Data Factory (ADF) V2; managed databases and Azure Data Platform services (Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), Data Lake Analytics, Stream Analytics, Azure SQL DW, HDInsight/Databricks, NoSQL databases), SQL Server, Oracle, data warehouses, etc.
- Experience in the SDLC (Software Development Life Cycle), covering system analysis, design, development, and implementation using both Waterfall and Agile methodologies.
- Involved in developing roadmaps and deliverables to advance the migration of existing on-premises systems/applications to the Azure cloud.
- Experience with Azure transformation projects, implementing ETL and data movement solutions using Azure Data Factory (ADF) and SSIS.
- Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environment.
- Implemented ad-hoc analysis solutions using Azure Data Lake Analytics/Store and HDInsight.
- Designed and implemented migration strategies for traditional systems on Azure (lift-and-shift via Azure Migrate and other third-party tools); worked across the Azure suite: Azure SQL Database, Azure Data Lake Storage (ADLS), Azure Data Factory (ADF) V2, Azure SQL Data Warehouse, Azure Service Bus, Azure Key Vault, Azure Analysis Services (AAS), Azure Blob Storage, Azure Search, Azure App Service, and Azure Data Platform Services.
- Hands-on experience with AWS tools such as EMR and EC2.
- Extensive knowledge of and experience with Relational Database Management Systems: normalization, stored procedures, constraints, joins, indexes, data import/export, and triggers.
- Proficient in working with large datasets and pipelines; well versed in the CI/CD process.
- Knowledge and experience of AWS services such as S3, Athena, Glue, EMR/Spark, RDS, Lambda, Step Functions, IAM etc.
- Expert at data transformations such as Lookup, Derived Column, Conditional Split, Sort, Data Conversion, Multicast, Union All, Merge Join, Merge, Fuzzy Lookup, Fuzzy Grouping, Pivot, Unpivot, and SCD to load data into SQL Server destinations.
- Experience using joins and sub-queries to simplify complex queries involving multiple tables, and optimizing procedures and triggers for use in production.
- Proficient in query/application tuning using optimizer hints, explain plans, SQL Trace, the Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor.
- Experience with monitoring tools such as SQL Profiler, Perfmon, and Extended Events, and with third-party monitoring tools such as Spotlight, Performance Analysis, Idera SQL Diagnostic Manager, and SQL Sentry One.
- Hands-on experience with SSIS package deployment and scheduling.
- Experience in creating various SSRS Reports like Charts, Filters, Sub-Reports, Scorecards, Drilldown and Drill-Through, Cascade, Parameterized reports that involved conditional formatting.
- Experience in report writing using SQL Server Reporting Services (SSRS) and creating several types of reports like Dynamic, Linked, Parameterized, Cascading, Conditional, Table, Matrix, Chart, Document Map and Sub-Reports.
- Designed enterprise reports using SQL Server Reporting Services and Excel pivot tables based on OLAP cubes, making use of multi-value parameter pick lists, cascading prompts, dynamic matrix reports, and other Reporting Services features.
- Defect and story tracking using JIRA.
- Hands-on experience with Team Foundation Server (TFS), CVS, VSTS, Bitbucket, SourceTree, and other version control systems.
- Good team player, self-motivated, with strong critical thinking and analysis skills and a strong drive to finish work ahead of project deadlines.
- Fast learner, good at teamwork, and quick to adapt to new technologies.
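The SSIS-style transformations listed in the summary (Derived Column, Conditional Split) can be sketched in plain Python. This is an illustrative stand-in only; the field names and the split threshold are invented, not taken from any actual project.

```python
# Python stand-in for two SSIS transformations: Derived Column (compute a new
# field per row) and Conditional Split (route rows to different outputs).
# Field names and the threshold are illustrative only.

def derive_and_split(rows, threshold=100):
    """Add a derived full_name column, then split rows on an amount threshold."""
    high, low = [], []
    for row in rows:
        row = dict(row, full_name=f"{row['first']} {row['last']}")  # Derived Column
        (high if row["amount"] >= threshold else low).append(row)   # Conditional Split
    return high, low
```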
TECHNICAL SKILLS
Azure Cloud: Azure SQL Database, Azure Data Lake, Azure Data Factory (ADF), Azure SQL Data Warehouse, Azure Analysis Services (AAS), Data Migration Service (DMS), Azure SQL Data Sync, Elastic Pools, Geo-Replication, Geo-Restore, Azure SQL Analytics, Snowflake Cloud
ETL Tools: Azure Data Factory (ADF), Azure Database Migration Service (DMS), SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), BCP.
Storage: Azure Storage, Azure Blob Storage, Azure Backup, Azure Premium Storage Disks, Azure Files.
Performance Monitoring Tools: Azure SQL Analytics, Spotlight, Idera, Redgate, SQL Sentry One, SQL Server Profiler, System Monitor.
Operating Systems: Azure Virtual Machine (VM), Windows 10, Windows XP, Unix, Linux, AWS EC2.
Scripting: Windows PowerShell, shell scripting, Azure CLI, Transact-SQL, Unix shell, Python.
Databases: Azure SQL Database, Azure SQL Data Warehouse, Azure SQL Data Sync, Elastic Pools, SQL Server 2019/2017/2016, Microsoft Azure VM, Business Intelligence (BI), Oracle 19c/11g, MySQL, MS Access, OLAP, OLTP.
PROFESSIONAL EXPERIENCE
Confidential, Irving TX
Sr Azure Data Engineer
Responsibilities:
- Healthcare data migration from multiple sources to the Azure data warehouse using Azure Data Factory and Databricks.
- Migrated health data from client sources to the Azure data warehouse using Azure Databricks with Python and Spark scripting.
- Developed Spark applications using Spark SQL in Databricks to extract, transform, and aggregate data from multiple file formats, uncovering insights into customer usage patterns.
- Developed Azure Function Apps as API services to communicate with third-party databases and fetch related patient data.
- Design & implement migration strategies for traditional systems on Azure.
- Worked on Azure SQL Database, Azure Data Lake (ADLS), Azure Data Factory (ADF), Azure SQL DW, Azure Service Bus, Azure Key Vault, Blob storage, Azure App service.
- Interacted with business analysts, users, and SMEs on requirements.
- Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, Databricks, SQL Database, and SQL Data Warehouse environment.
- Experience in DWH/BI project implementation using Azure DF and Databricks
- Design and implement streaming solutions using Kafka or Azure Stream Analytics
- Experience managing Data Lake Analytics and Delta Lake and integrating them with other Azure services.
- Knowledge of U-SQL and Spark and how they can be used for data transformation as part of a cloud data integration strategy.
- Identified potential problems and recommended alternative technical solutions.
- Created and scheduled jobs for continuous streaming of data, as well as hourly, daily, and weekly loads, per requirements.
- Work within and across Agile teams to design, implement, test and support technical solutions.
- Pulled data into Power BI from various sources such as Event Hubs, Excel, Oracle, Azure SQL, PySpark, and Python.
- Coordinated with application architects on moving Infrastructure-as-a-Service applications to Platform-as-a-Service.
- Created Logic Apps with different triggers to integrate data from sources to destinations.
- Experience migrating data to Snowflake.
- Developed SQL queries using SnowSQL.
- Implemented a one-time data migration from Azure DW to Snowflake using Python and SnowSQL.
- Hands-on experience working with CI/CD.
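A one-time Azure DW to Snowflake migration of the kind described above is commonly done by extracting rows in chunks, staging each chunk as a file, and then issuing a COPY INTO via SnowSQL. The sketch below shows only the chunking and staging half in plain Python; the table, stage, and column layout are invented for illustration.

```python
import csv
import io

def chunk_rows(rows, size):
    """Yield fixed-size chunks so each staged file stays a manageable size."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

def stage_chunks(rows, size=10000):
    """Serialize each chunk to CSV text, as would be uploaded to a Snowflake stage."""
    staged = []
    for chunk in chunk_rows(rows, size):
        buf = io.StringIO()
        csv.writer(buf).writerows(chunk)
        staged.append(buf.getvalue())
    return staged

# The COPY INTO statement SnowSQL would then run against the staged files
# (the stage and table names here are placeholders, not from the resume).
COPY_SQL = "COPY INTO analytics.migrated_fact FROM @migration_stage FILE_FORMAT = (TYPE = CSV)"
```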
Confidential, Bellevue WA
Azure Data Engineer
Responsibilities:
- Data migration from SAP Digital to the Azure data warehouse using Azure Event Hubs.
- Migrated data from SAP to the Azure data warehouse, used for reporting, via Event Hubs, Azure Databricks, and Python scripting.
- Developed Spark applications using Spark SQL in Databricks to extract, transform, and aggregate data from multiple file formats, uncovering insights into customer usage patterns.
- Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of ADF, T-SQL, Spark SQL, U-SQL, and Azure Data Lake Analytics.
- Ingested data into Azure services such as Azure Data Lake, Azure Storage, Azure SQL, and Azure DW, and processed the data in Azure Databricks.
- Created application logic and functionality in the Azure Databricks, SQL Database, and SQL Data Warehouse environment.
- Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, Databricks, SQL Database, and SQL Data Warehouse environment; experience in DWH/BI project implementation using Azure ADF and Databricks.
- Design and implement streaming solutions using Kafka or Azure Stream Analytics
- Experience managing Azure Data Lake Storage (ADLS), Data Lake Analytics, and Delta Lake, and an understanding of how to integrate them with other Azure services.
- Knowledge of U-SQL and PySpark and how they can be used for data transformation as part of a cloud data integration strategy.
- Involved in designing logical and physical data models from staging to the DWH to Power BI visualizations.
- Created and scheduled jobs for continuous streaming of data, as well as hourly, daily, and weekly loads, per requirements.
- Built Power BI visualizations and dashboards per requirements.
- Pulled data into Power BI from various sources such as Event Hubs, Excel, Oracle, Azure SQL, PySpark, and Python.
- Coordinated with business users and other responsible teams to gather requirements and to design, schedule, and automate pipelines.
- Creating Logic Apps with different triggers for integrating the data from sources to destination.
- Participated in technical architecture documents, project design, and implementation discussions.
- Involved in loading and transforming large sets of unstructured data.
- Extracted, parsed, cleaned, and ingested data.
- Involved in creating tables and applying HiveQL queries on those tables for data validation.
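Post-load validation queries like the HiveQL checks mentioned above typically verify row counts and null rates. The sketch below uses sqlite3 as a self-contained stand-in for Hive; the table and column names are made up for illustration.

```python
import sqlite3

# Data-validation checks of the kind run with HiveQL after ingestion;
# sqlite3 stands in for Hive here, and the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_events (event_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO staging_events VALUES (?, ?)",
                 [(1, 10.0), (2, None), (3, 5.5)])

# Row-count check: did the load land the expected number of rows?
row_count = conn.execute("SELECT COUNT(*) FROM staging_events").fetchone()[0]

# Null check: flag columns that should be fully populated after the load.
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM staging_events WHERE amount IS NULL").fetchone()[0]
```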
Confidential, New Bremen, OH
Azure Data Engineer
Responsibilities:
- Design and implement end-to-end data solutions (storage, integration, processing, visualization) in Azure.
- Implemented ETL and data movement solutions using Azure Data Factory and SSIS.
- Developed dashboards and visualizations to help business users analyze data and to provide insight to upper management, focusing on Microsoft products such as SQL Server Reporting Services (SSRS) and Power BI.
- Migrate data from traditional database systems to Azure SQL databases.
- Implement ad-hoc analysis solutions using Azure Data Lake Analytics/Store, HDInsight
- Design and implement streaming solutions using Kafka or Azure Stream Analytics
- Experience managing Azure Data Lake Storage (ADLS) and Data Lake Analytics and integrating them with other Azure services; knowledge of U-SQL and how it can be used for data transformation as part of a cloud data integration strategy.
- Worked with similar Microsoft on-prem data platforms, specifically SQL Server and related technologies such as SSIS, SSRS, and SSAS.
- Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environment.
- Experience in DWH/BI project implementation using Azure ADF.
- Involved in designing logical and physical data models for the staging, DWH, and data mart layers.
- Created Power BI visualizations and dashboards per requirements.
- Used various sources to pull data into Power BI such as SQL Server, Excel, Oracle, SQL Azure etc.
- Engage with business users to gather requirements, design visualizations and provide training to use self-service BI tools.
- Participating in Technical Architecture Documents, Project Design and Implementation Discussions
- Azure Automation through runbook creation: migrated existing .ps1 scripts and handled authoring, configuring, and scheduling of runbooks.
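The ETL and data movement pipelines described in this section commonly use a high-watermark pattern for incremental loads: each run picks up only rows modified since the previous run's watermark. A minimal sketch in plain Python, with invented field names:

```python
from datetime import datetime

def incremental_load(rows, last_watermark):
    """Return rows modified after the watermark, plus the new watermark value.

    Mirrors the incremental-copy pattern used in ADF pipelines: the stored
    watermark advances only when newer rows are found. Field names are
    illustrative, not from any actual project.
    """
    delta = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in delta), default=last_watermark)
    return delta, new_watermark
```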
Confidential
Azure Data Engineer
Responsibilities:
- Followed Agile (Scrum) methodology for application development.
- Designed and implemented migration strategies for traditional systems on Azure (lift-and-shift via Azure Migrate and other third-party tools).
- Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, U-SQL, and Azure Data Lake Analytics; ingested data into Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
- Experience developing Spark applications using Spark SQL in Databricks to extract, transform, and aggregate data from multiple file formats, analyzing the data to uncover insights into customer usage patterns.
- Worked on migration of data from on-prem SQL Server to cloud databases (Azure Synapse Analytics (DW) and Azure SQL DB).
- Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environment; experience in DWH/BI project implementation using Azure ADF.
- Created pipelines in ADF using linked services and datasets to extract, transform, and load data between different sources such as Azure SQL, Blob Storage, and Azure SQL Data Warehouse, including write-back.
- Developed Spark applications using PySpark and Spark-SQL for data extraction, transformation and aggregation from multiple file formats for analyzing & transforming the data to uncover insights into the customer usage patterns.
- Responsible for estimating cluster size and for monitoring and troubleshooting the Databricks Spark cluster.
- Experienced in performance tuning Spark applications: setting the right batch interval, choosing the correct level of parallelism, and tuning memory.
- Monitored end-to-end integration using Azure Monitor.
- Created Build and Release for multiple projects (modules) in production environment using Visual Studio Team Services (VSTS).
- Created Power BI visualizations and dashboards per requirements.
- Used various sources to pull data into Power BI such as SQL Server, Excel, Oracle, SQL Azure etc.
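The "level of parallelism" tuned in the Spark work above controls how records are spread across tasks. A plain-Python sketch of hash partitioning, with the partition count as the tunable knob (the key/value shape is illustrative):

```python
def hash_partition(records, num_partitions):
    """Distribute (key, value) records across partitions by key hash.

    num_partitions is the knob tuned as the 'level of parallelism' in a
    Spark job: more partitions means more, smaller tasks. Records with the
    same key always land in the same partition.
    """
    partitions = [[] for _ in range(num_partitions)]
    for key, value in records:
        partitions[hash(key) % num_partitions].append((key, value))
    return partitions
```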
Confidential
MS SQL/ MSBI Developer
Responsibilities:
- Participated in analysis, design, coding, testing, implementation and documentation of the development or the change requests.
- Created data mapping and sourcing documents for various sources to New App.
- Captured plan information for heavy, slow-running, and poorly performing T-SQL in SQL Sentry Performance Advisor, providing a record of all plan changes, and optimized queries based on the results.
- Coordinated with the front-end design team to provide them with the necessary stored procedures and packages.
- Created Triggers to maintain the Referential Integrity.
- Responsible for developing, supporting, and maintaining ETL (Extract, Transform, Load) processes using SQL Server 2012 Integration Services.
- Created complex SSIS packages using proper control and data flow elements and deployed the packages.
- Created and scheduled SQL Agent jobs to run SSIS packages and SQL stored procedures.
- Extracted data from IMS/DB2, MS Access, and SQL Server 2012/2014 for migration, performing various transformations in SSIS ETL per business rules using different task types, with error handling.
- Configured Integration Services 2014 and Reporting Services 2014, developing a new environment for migrating NPS systems to the new application.
- Converted existing DTS packages to SSIS ETL and tuned the connections to point to the new applications.
- Created jobs, alerts, and SQL Mail Agent schedules, and scheduled SSIS ETL packages.
- Created well-formed reports and web-based reports using SSRS.
- Generated daily, weekly, and monthly reports for analysis, enabling end users to view reports on the fly using SSRS.
- Created and formatted crosstab, conditional, drill-down, OLAP, sub-report, ad-hoc, parameterized, and custom reports using SQL Server Reporting Services (SSRS).
- Designed ad-hoc reporting for different levels of end users.
- Experience administering the created reports and assigning permissions to valid users to execute them.
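Triggers maintaining referential integrity, as described above, can be sketched with SQLite standing in for SQL Server; the parent/child schema here is invented for illustration.

```python
import sqlite3

# Trigger-enforced referential integrity: reject child rows whose parent key
# does not exist. SQLite stands in for SQL Server; the table names are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE plans (plan_id INTEGER PRIMARY KEY);
CREATE TABLE enrollments (enroll_id INTEGER PRIMARY KEY, plan_id INTEGER);
CREATE TRIGGER trg_enrollments_fk BEFORE INSERT ON enrollments
WHEN NOT EXISTS (SELECT 1 FROM plans WHERE plan_id = NEW.plan_id)
BEGIN
    SELECT RAISE(ABORT, 'plan_id not found in plans');
END;
""")
conn.execute("INSERT INTO plans VALUES (1)")
conn.execute("INSERT INTO enrollments VALUES (100, 1)")  # parent exists: accepted
```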
