
Data Engineer Resume

Jacksonville, FL


  • Over 9 years of experience in Data Engineering with ETL tools, both on-premises and on Azure cloud, in Data Warehousing environments, plus 2 years of Business Analyst experience.
  • Experience in Azure Cloud, Azure Data Factory, Azure Data Lake Storage, Azure HDInsight big data technologies (Apache Hadoop and Apache Spark), and Databricks.
  • Experience in designing and implementing complex data solution workloads on Microsoft Azure.
  • Experience in migrating existing on-premises systems/applications to the Microsoft Azure cloud.
  • Worked across the Azure suite: Azure SQL Database, Azure Data Lake Storage (ADLS), Azure Data Factory (ADF) V2, Azure SQL Data Warehouse, Azure Cosmos DB, Azure Service Bus, Azure Key Vault, Azure Analysis Services (AAS), Azure Blob Storage, Azure Search, Azure App Services, and Azure data platform services.
  • Experience in implementing hybrid connectivity between Azure and on-premises environments using virtual networks, VPN, and ExpressRoute.
  • Well versed in creating pipelines in ADF V2 using activities such as Databricks, Copy, Filter, ForEach, and Move & Transform.
  • Strong experience building ADF pipelines that move data from source systems and transform it in Azure Blob Storage; created Azure Data Lake Analytics (ADLA) U-SQL jobs to transform data in Azure.
  • Good understanding of NoSQL databases and hands-on experience writing applications on NoSQL databases such as Cosmos DB.
  • Experience in designing and developing Azure Stream Analytics jobs to process real-time data using Azure Event Hubs, Azure IoT Hub, and Service Bus queues.
  • Experience reading continuous JSON data from source systems via Kafka into Databricks Delta, processing the files with Spark Structured Streaming and PySpark, and writing the output in Parquet format.
  • Strong experience writing Python applications using libraries such as Pandas, NumPy, SciPy, and Matplotlib.
  • Experience using Apache Kafka for data processing: collecting, aggregating, and moving data from various sources.
  • Extensively worked on Power BI reports and dashboards (Power BI desktop, Power Query, Power Map, Power View), Power BI cloud, Row level security, Natural language Processing (Power Q&A), Power Apps.
  • Strong work experience in Database Development, Data Modeling, Data Warehousing, Design and Technical Management.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, joins, and indexes in Microsoft SQL Server.
  • Performed data validation, data integrity, and data quality checks before delivering data to operations, business, and financial analysts, using SQL Server, Oracle, and Teradata.
  • Proficient in performance analysis, monitoring, and SQL query tuning using execution plans (EXPLAIN PLAN), collected statistics, hints, and SQL Trace in both SQL Server and Oracle.
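The pre-delivery data-quality checks described above can be sketched as a small, framework-free Python routine. The column names and rules here are purely illustrative, not taken from any actual engagement:

```python
# Illustrative data-quality gate: reject records that fail basic
# integrity rules before handing data to downstream analysts.
# Field names ("customer_id", "amount", "txn_date") are hypothetical.

REQUIRED = {"customer_id", "amount", "txn_date"}

def validate_row(row):
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    missing = REQUIRED - {k for k, v in row.items() if v is not None}
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    amount = row.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("amount must be non-negative")
    return errors

def split_clean_dirty(rows):
    """Partition records into (clean, dirty) before delivery."""
    clean, dirty = [], []
    for row in rows:
        (dirty if validate_row(row) else clean).append(row)
    return clean, dirty
```

In production the same checks would typically run as SQL constraints or ETL-stage validations; the sketch only shows the shape of the rules.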


Azure Stack: Azure Data Factory (ADF), Azure Databricks, U-SQL, Azure Analysis Services (AAS), Azure Data Lake Analytics, Azure Data Lake Store (ADLS), Azure Stream Analytics.

Business Intelligence: Power BI, SSRS, SSIS, Microsoft Flow, Azure Logic Apps, Log Analytics, Spark SQL, Python, Scala, MDX, DAX, XMLA, Power Query (M Language), Pentaho, Apache Superset.

Database Technologies: SQL Server 2008 R2/2012/2014/2017, Oracle, PolyBase, Azure SQL Database, Azure SQL Data Warehouse, Teradata

IDEs: SSMS, SSDT, SQL Server Business Intelligence Development Studio (BIDS), Visual Studio 2005/2008/2010 Team Editions and 2012/2015 Professional Editions

Web/Application Servers: IIS, Tomcat, Apache, WebLogic, WebSphere, JBoss


Confidential | Jacksonville, FL

Data Engineer


  • Analyzed, designed, and built modern data solutions using Azure PaaS services to support data visualization. Assessed the current production state of the application and determined the impact of new implementations on existing business processes.
  • Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics). Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.
  • Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data between sources such as Azure SQL, Blob Storage, and Azure SQL Data Warehouse, including write-back scenarios.
  • Good knowledge of creating CI/CD Azure DevOps pipelines and deployment automation for .NET, Java, and UI-based web applications.
  • Implemented various resources in Azure using the Azure Portal and PowerShell with Azure Resource Manager (ARM) deployment models. Deployed infrastructure-as-code applications using ARM templates (JSON).
  • Migrated SQL Server databases to Azure SQL Database using the SQL Azure Migration Wizard and used the Python API to upload agent logs into Azure Blob Storage.
  • Developed Spark applications using PySpark and Spark SQL for data extraction, transformation, and aggregation across multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
  • Responsible for estimating cluster size and for monitoring and troubleshooting the Spark Databricks cluster.
  • Experienced in performance tuning of Spark applications: setting the right batch interval, the correct level of parallelism, and memory tuning.
  • Wrote UDFs in Scala and PySpark to meet specific business requirements.
  • Developed JSON scripts for deploying pipelines in Azure Data Factory (ADF) that process data using the SQL activity.
  • Developed SQL scripts for automation.
  • Configured and deployed Azure Automation Scripts for a multitude of applications utilizing the Azure stack for Compute, Web and Mobile, Blobs, Resource Groups, Azure Data Lake, HDInsight Clusters, Azure Data Factory, Azure SQL, Cloud Services, and ARM Services and utilities focusing on Automation.
  • Managed Azure Infrastructure Azure Web Roles, Worker Roles, SQL Azure, Azure Storage, Azure AD Licenses. Virtual Machine Backup and Recover from a Recovery Services Vault using Azure PowerShell and Portal.
  • Designed SSIS packages to transfer data from flat files and Excel to SQL Server using Business Intelligence Development Studio.
  • Extensively used SSIS transformations and tasks such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, Execute SQL Task, Script Task, and Send Mail Task.
  • Performed data cleansing, enrichment, mapping tasks and automated data validation processes to ensure meaningful and accurate data was reported efficiently.
  • Created Actuals-vs-Goals dashboards for fundraising and an Organization 360 view with Power BI.
  • Developed complex calculated measures using Data Analysis Expressions (DAX).
  • Communicated with management to obtain release approvals, ensured all tickets were approved on time and ready for implementation, and provided application support for multiple enterprise applications.
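The JSON pipeline definitions for ADF mentioned above follow the general shape below. This is a simplified sketch built in Python, abbreviated relative to the full ADF/ARM schema; the pipeline, procedure, and linked-service names are hypothetical:

```python
import json

def build_adf_pipeline(pipeline_name, proc_name):
    """Build a simplified ADF pipeline definition with one
    stored-procedure (SQL) activity. Property names follow the
    general shape of ADF JSON but are abbreviated for illustration."""
    return {
        "name": pipeline_name,
        "properties": {
            "activities": [
                {
                    "name": "RunSqlTransform",
                    "type": "SqlServerStoredProcedure",
                    "typeProperties": {"storedProcedureName": proc_name},
                    "linkedServiceName": {
                        "referenceName": "AzureSqlLinkedService",  # hypothetical linked service
                        "type": "LinkedServiceReference",
                    },
                }
            ]
        },
    }

# Serialize for deployment (e.g., via ARM template or the ADF REST API).
definition = json.dumps(build_adf_pipeline("DailyLoad", "dbo.usp_TransformSales"), indent=2)
```

Generating the JSON programmatically keeps pipeline definitions consistent across environments and easy to check into source control.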

Environment: Azure Web Roles, Worker Roles, SQL Azure, Azure Storage, Azure AD, Resource Groups, GIT-2.1x/2.x, Python 3.6, Microsoft Azure, Microsoft SQL Server.

Confidential | Minneapolis, MN

Data Engineer


  • Analyzed the Business Requirement Document (BRD) to design a conceptual model and participated in all phases of the SDLC to implement enterprise data solutions.
  • Supported the Data & Insights Practice by developing a roadmap for the Data Platform & Data Engineering.
  • Worked within an Agile development methodology as an active member of Scrum meetings.
  • Designed efficient and robust ETL/ELT workflows, schedulers, and event-based triggers.
  • Implemented various Azure platforms such as Azure SQL Database, Azure SQL Data Warehouse, Azure Analysis Services, HDInsight, Azure Data Lake and Data Factory.
  • Developed Spark code using Scala and Spark-SQL/Streaming for faster testing and processing of data.
  • Extracted and loaded data into the Data Lake environment (MS Azure) using Sqoop; the data was accessed by business users.
  • Imported and exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Designed and developed high-volume REST APIs.
  • Designed, developed, and maintained SSIS packages to consume web services data for analysis and reporting in the APS decision-making system.
  • Developed data pulls from Cosmos using SCOPE scripts.
  • Worked with Microsoft Cosmos DB distributed database technology across all datasets.
  • Created SQL scripts to migrate data from SQL Server to Cosmos DB.
  • Extensively involved in writing T-SQL stored procedures and functions to implement complex business logic in the backend.
  • Fixed performance issues by tuning slow-running and runaway queries using various methods: evaluating statistics, joins, indexes, and code changes.
  • Generated periodic reports based on statistical analysis of data across various time frames and divisions using SQL Server Reporting Services (SSRS).
  • Developed compelling, interactive analysis reports and dashboards for credit card marketing analytics using advanced DAX.
  • Built a tabular model cube from scratch.
  • Built measures in the tabular model using DAX queries and created partitions in the tabular model.
  • Built KPIs in the tabular model based on end-user requirements and worked on the performance of SSAS tabular model cubes.
  • Designed, developed, tested, published, and maintained Power BI functional reports and dashboards for management teams to support sound decision-making.
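The SQL-Server-to-Cosmos-DB migration scripts mentioned above amount to reshaping relational rows into JSON documents. A simplified Python sketch of that reshaping step, with hypothetical table and column names (Cosmos DB requires a string `id` on every document, which is the one real constraint reflected here):

```python
def rows_to_documents(rows, columns, id_column):
    """Convert relational rows (tuples) into Cosmos DB-style JSON
    documents. The chosen key column is stringified because a Cosmos DB
    document's 'id' must be a string. All names here are illustrative."""
    docs = []
    for row in rows:
        doc = dict(zip(columns, row))
        doc["id"] = str(doc[id_column])  # Cosmos DB: 'id' must be a string
        docs.append(doc)
    return docs
```

In a real migration the rows would come from a SQL Server cursor and the documents would be written with the Cosmos DB SDK; this sketch isolates only the row-to-document mapping.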

Environment: Azure Databases, Power BI, SQL Azure, Azure Storage, Azure AD


Data Analyst


  • Created numerous complex stored procedures, triggers, functions, indexes, and views with T-SQL statements in SQL Server Management Studio (SSMS).
  • Involved in designing and managing schema objects such as tables, views, indexes, stored procedures, and triggers, and maintained referential integrity using SQL Server Management Studio.
  • Built ETL processes to transfer data from remote data centers to local data centers using SSIS; data cleansing and massaging were performed on the local database.
  • Designed an ETL strategy to transfer data from source to landing, staging, and destination in the data warehouse using SSIS and DTS (Data Transformation Services).
  • Developed SSIS packages to export data from Excel/Access to SQL Server, automated all the SSIS packages and monitored errors using SQL Server Agent Job.
  • Used SQL Server Reporting Services (SSRS) to create several reports based on the cube and the local data warehouse.
  • Prepared SSRS reports highlighting discrepancies between user expectations and service efforts, and was involved in scheduling subscription reports via the subscription report wizard.
  • Interacted with subject matter experts to understand the business logic and implemented complex business requirements in the backend using efficient stored procedures and flexible functions.
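The source-to-landing-to-staging-to-destination ETL strategy described above can be sketched in plain Python. The SSIS packages implemented this with data-flow tasks; the stage names and the cleansing rules below are illustrative stand-ins:

```python
def extract(source_rows):
    """Landing zone: a raw, untouched copy of the source data."""
    return list(source_rows)

def stage(landing_rows):
    """Staging: cleanse and standardize (trim text, drop empty rows)."""
    staged = []
    for row in landing_rows:
        cleaned = {k: v.strip() if isinstance(v, str) else v
                   for k, v in row.items()}
        if any(v not in (None, "") for v in cleaned.values()):
            staged.append(cleaned)
    return staged

def load(staged_rows, warehouse):
    """Destination: append staged rows to the warehouse table
    (modeled here as a plain list)."""
    warehouse.extend(staged_rows)
    return warehouse

def run_etl(source_rows, warehouse):
    """Run the full source -> landing -> staging -> destination flow."""
    return load(stage(extract(source_rows)), warehouse)
```

Separating the stages this way mirrors the SSIS design: each stage can be rerun or audited independently if a downstream step fails.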

Environment: MS SQL Server Management Studio 2008 R2/2012 (SSMS), SQL Server Integration Service (SSIS), SQL Server Reporting Services (SSRS), MS Visual Studio 2012, SQL Server Profiler.
