
Azure (ADF, Databricks, SQL) Developer Resume


Cincinnati, Ohio

SUMMARY

  • Around 9 years of IT experience in data engineering, system analysis, design, development, implementation, testing, and support of databases, data warehouse applications, and data visualization using Azure Synapse Analytics, Data Factory, Data Lake, Blob Storage, SQL DW, SQL, Power BI, SSRS, SSAS, and SSIS across the Life Sciences, Supply Chain, Financial, Insurance, and Oil & Energy domains.
  • Experience working on predictive and advanced analytics projects, developing large datasets in the cloud using Azure Databricks, Data Factory, Blob Storage, Data Lake, Apps, and SQL DW.
  • Extensive experience visualizing data based on user stories using Power BI.
  • Experienced with Confidential SQL Server BI/data tools, developing databases, data warehouses, ETL packages, web-based reports, and OLAP cubes for data slicing and dicing.
  • Experience designing dimensional data models.
  • Skilled in Confidential Azure, software development life cycles, and Agile/Waterfall methodologies; research and adopt the latest technologies to drive organizational growth.
  • Experience creating Power BI dashboards and reports and building data models and dataflows.
  • Experience creating Power BI paginated reports and exporting them to a shared location using Power Automate flows.
  • Experience in Azure Databricks, writing SQL and Python.
  • Hands-on experience with Synapse Analytics workspaces.
  • Experience using Azure Data Lake Gen1/Gen2 and Blob Storage.
  • Experience creating Power BI dashboards and DAX queries.
  • Worked on Confidential Power Apps, Logic Apps, Key Vault, and Function Apps.
  • Hands-on experience with Azure Stream Analytics, Event Hubs, Azure SQL Database, and Azure Synapse.
  • Experience with data warehousing and BI concepts, including data marts and data granularity.
  • Experience building pipelines in Azure Data Factory, including creating linked services and datasets and writing dynamic code.
  • Understanding of database query performance tuning and optimization.
  • Experience with SQL Server Migration Assistant (SSMA), migrating Oracle databases to SQL Server/Azure SQL DB.
  • Experience troubleshooting and resolving database problems.
  • Worked on Integration Services (SSIS) package development and deployment.
  • Proven proficiency with data transformations such as Derived Column, Conditional Split, Aggregate, Lookup, and Union All, plus SharePoint List adapters and Execute SQL Tasks, to load data into the data warehouse.
  • Experience designing and deploying reports and dashboards such as Tabular, Matrix, and Chart reports using SQL Server Reporting Services (SSRS).
  • Experience working in Agile methodology, planning several sprints and driving daily scrum meetings.
  • Knowledge of designing SSAS cubes and hands-on experience writing and debugging MDX queries.
  • Solid conceptual knowledge of data modeling concepts such as star, snowflake, and galaxy schemas, dimensions, and facts.
  • Migrated DTS and SSIS packages from SQL Server 2000/2005/2008/2008 R2 to 2012/2014/2016.
  • Proficient in sanity testing, system testing, and regression testing.
  • Learning Python, PowerShell, and Teradata scripting to automate deployments.
  • Experience migrating and deploying applications with upgraded application and hardware versions using MSBuild and batch scripts.

TECHNICAL SKILLS

Operating Systems: Windows, Linux

Cloud Technologies: Confidential Azure

Version Control Tools: GitHub, TFS, Azure DevOps Git

CI/CD Tools: Azure DevOps, ARM Templates

Databases: MS SQL Server, Azure SQL, Azure Synapse

ETL: Azure Data Factory (ADF), Azure Databricks, SSIS

Reporting Tools: Confidential Power BI

Analysis Services: SQL Server Analysis Services (Multidimensional, Tabular)

Scripting: C#.Net, PowerShell, Terraform, Python, DAX, MDX

Issue Tracking Tools: JIRA, Confluence

IDE: Visual Studio, Eclipse

PROFESSIONAL EXPERIENCE

Confidential, Cincinnati, Ohio

Azure (ADF, Databricks, SQL) Developer

Responsibilities:

  • Implemented a 100% reusable (plug-and-play) Python pattern (Synapse integration, aggregations, change data capture, deduplication) and a high-watermark implementation, accelerating development and standardization across teams on the Confidential project.
  • Wrote Terraform code to create Azure resources automatically.
  • Implemented Synapse integration with Azure Databricks notebooks, reducing development work by about 80%, and achieved a 90% performance improvement on Synapse loading by implementing a dynamic partition switch, accelerating development and standardization within the team.
  • Migrated all existing Excel-based (macro/VB) reports to Power BI and fully automated the process; around 90% of the reports were migrated and automated successfully.
  • Built a dynamic process to generate vendor-specific reports, enabling the business to generate 1000+ vendor-based reports and send them to the respective users at daily, weekly, monthly, and period-level frequencies.
  • Wrote SQL stored procedures to improve the performance of ADF pipeline runs.
  • Attended daily meetings with the client to understand the business logic, wrote the functional specifications, and created the corresponding packages.
  • Facilitated daily scrum, sprint planning, sprint demo, and sprint retrospective meetings.
  • Facilitated and planned go-live activities.
  • Tracked work progress and updated management on status daily.
  • Held grooming sessions with the product owner to discuss roadblocks and the product backlog.
  • Followed Agile methodology and attended daily scrum meetings.

Environment: Azure Data Factory, Azure Databricks, Power BI, SSAS, SSIS, SSRS, SQL, DAX, Python, Azure Synapse, Azure DevOps, PowerShell Script, MS SQL Server.
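The high-watermark pattern mentioned in the first bullet can be sketched roughly as follows. This is a minimal illustration only: the table and column names (src_orders, stg_orders, etl_watermark) are hypothetical, and SQLite stands in for the Synapse/SQL sources an ADF or Databricks pipeline would actually read.

```python
import sqlite3

def incremental_load(conn):
    """Load only rows changed since the last recorded watermark (sketch)."""
    cur = conn.cursor()
    # 1. Read the last watermark recorded for this source table.
    cur.execute("SELECT last_modified FROM etl_watermark WHERE table_name = 'src_orders'")
    watermark = cur.fetchone()[0]
    # 2. Extract only rows modified after the watermark.
    cur.execute("SELECT id, amount, modified_at FROM src_orders WHERE modified_at > ?",
                (watermark,))
    rows = cur.fetchall()
    if not rows:
        return 0
    # 3. Load the delta into staging.
    cur.executemany("INSERT INTO stg_orders (id, amount, modified_at) VALUES (?, ?, ?)", rows)
    # 4. Advance the watermark to the newest timestamp just loaded.
    new_mark = max(r[2] for r in rows)
    cur.execute("UPDATE etl_watermark SET last_modified = ? WHERE table_name = 'src_orders'",
                (new_mark,))
    conn.commit()
    return len(rows)

# Demo data: one row older than the watermark, one newer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL, modified_at TEXT);
CREATE TABLE stg_orders (id INTEGER, amount REAL, modified_at TEXT);
CREATE TABLE etl_watermark (table_name TEXT, last_modified TEXT);
INSERT INTO etl_watermark VALUES ('src_orders', '2024-01-01T00:00:00');
INSERT INTO src_orders VALUES (1, 10.0, '2023-12-31T09:00:00');
INSERT INTO src_orders VALUES (2, 20.0, '2024-01-02T09:00:00');
""")
print(incremental_load(conn))  # prints 1: only the newer row is loaded
```

Only the delta crosses the pipeline on each run, which is what makes the pattern reusable across tables: the watermark table carries the per-table state, so the same code can be parameterized for any source.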

Confidential

Azure (ADF, Databricks, SQL) Developer

Responsibilities:

  • Followed Agile methodology and attended daily scrum meetings.
  • Migrated the existing ETL system to Azure Data Factory.
  • Created linked services, input and output datasets, and pipelines using Copy, Stored Procedure, and U-SQL activities in Azure Data Factory.
  • Wrote U-SQL scripts and stored procedures to load data from Azure Data Lake into the analytics database.
  • Monitored data pipeline health.
  • Created an Azure tabular cube from Azure Data Lake Store files.
  • Created Power BI reports from the tabular cube using Power BI Desktop.
  • Wrote DAX queries to create measures and calculated columns in Power BI.
  • Involved in Power BI dashboard creation and scheduled data refreshes with the on-premises gateway.

Environment: Azure SQL Server, Power BI, ADF V1, SSAS, PowerShell Script, Windows.

Confidential

ETL Developer

Responsibilities:

  • Followed Agile methodology and attended daily scrum meetings.
  • Worked with business analysts to transform business requirements into technical specifications and provided up-front mock-ups for user requirements.
  • Analyzed database schema changes and implemented required code changes on database objects such as procedures, functions, and views.
  • Onboarded various systems to the warehouse and provided users a facility to query and generate ad-hoc reports.
  • Involved in developing several Power BI reports on the OLAP system.
  • Interacted with business users to gather requirements and implemented them.
  • Migrated existing source system data to Azure SQL.
  • Created SSIS packages and deployed them to SQL Server for every migration run.
  • Wrote SQL stored procedures to improve the performance of SSIS package runs.
  • Involved in SSIS performance optimization.
  • Attended daily meetings with the client to understand the business logic, wrote the functional specifications, and created the corresponding packages.
  • Experienced with VSTS for maintaining version control history, with daily check-ins and check-outs.

Environment: Azure SQL Server, Windows, PowerShell, T-SQL, SQL Server Integration Services (SSIS).
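A migration run like the ones described above typically lands data in a staging table and then merges it into the target. The sketch below shows that load-then-upsert step; the table names (stg_customers, dim_customers) are illustrative, and SQLite's `ON CONFLICT` clause stands in for the `MERGE` statement a SQL Server or Azure SQL procedure would use.

```python
import sqlite3

# Stand-in source and target; a real run would stage data pulled by SSIS.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_customers (id INTEGER PRIMARY KEY, name TEXT);
INSERT INTO dim_customers VALUES (1, 'Acme');
INSERT INTO stg_customers VALUES (1, 'Acme Corp'), (2, 'Globex');
""")

# Upsert staged rows into the target: existing ids are updated,
# new ids are inserted (SQL Server would express this with MERGE).
conn.execute("""
INSERT INTO dim_customers (id, name)
SELECT id, name FROM stg_customers WHERE true
ON CONFLICT(id) DO UPDATE SET name = excluded.name
""")
conn.commit()
print(conn.execute("SELECT id, name FROM dim_customers ORDER BY id").fetchall())
# prints [(1, 'Acme Corp'), (2, 'Globex')]
```

Staging first keeps the migration idempotent: a failed run can be retried by truncating staging and re-extracting, without corrupting the target.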

Confidential

SQL, ETL Developer

Responsibilities:

  • Handled the CR life cycle (analyze, design, develop, and deploy) across T-SQL and SQL Jobs.
  • Helped write and optimize in-application SQL statements.
  • Ensured performance, security, and availability of databases.
  • Collaborated with other team members and stakeholders and prepared documentation and specifications.
  • Built SQL stored procedures based on requirements, achieving around 400% runtime efficiency gains.
  • Strong proficiency with SQL and its variations across popular databases.
  • Experience with several modern relational databases.
  • Skilled at optimizing large, complicated SQL statements.
  • Knowledge of best practices when dealing with relational databases.
  • Capable of configuring popular database engines and orchestrating clusters as necessary.
  • Able to plan resource requirements from high-level specifications.
  • Capable of troubleshooting common database issues.
  • Familiar with tools for profiling server resource usage and optimizing it.
  • Built Informatica flows to load data from upstream systems into our system.
  • Experienced with VSTS for maintaining version control history, with daily check-ins and check-outs.
  • Followed Agile methodology and attended daily scrum meetings.
  • Prepared Octopus builds and supported code deployments into various environments.
  • Troubleshot SSIS packages and fixed issues at the stored-procedure or transformation level.
  • Understood client technical specification documents.
  • Involved in creating and modifying SSIS packages to migrate data from the legacy database to CRM using data mapping.
  • Actively involved in designing complex packages using various tasks and transformations according to the specified requirements.
  • Handled mapping data loads along with direct SSIS packages.
  • Prepared unit test specification requirements.
  • Maintained the entire code base on Team Foundation Server, performing check-ins and check-outs frequently.
  • Involved in resolving data load discrepancies and implementing high-performance techniques in SSIS packages.
  • Involved in developing ETL for loading data into staging and the DWH; created SSIS packages using data transformations such as Derived Column, Conditional Split, Multicast, Script Component, Execute Process Task, OLE DB Command, and Execute SQL Task to process the rule engine.
  • Implemented parallel data processing in SSIS packages.
  • GAC'd the Confidential DLLs into our solution and loaded data to CRM via the web service URL.
  • Generated reports in Tabular, Matrix, and Chart formats.
  • Developed reports with drill-down and drill-through features.
  • Analyzed and understood user needs and prioritized appropriate IN creation.
  • Created subscriptions based on business needs.
  • Maintained jobs and job architectures and troubleshot job/service failures.
  • Worked on build and deployment of new SQL code drops through Octopus builds.
  • Fixed issues raised while running different kinds of jobs.

Environment: T-SQL, Confidential SQL Server, SQL Server Integration Services (SSIS), C#.Net, SQL Profiler, Octopus build.
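The parallel data processing mentioned above is, in SSIS, a matter of letting independent tasks run concurrently rather than chaining them. The Python analogue below illustrates the same idea with a thread pool; the batches and the `load_batch` stand-in are dummies, not the actual package logic.

```python
from concurrent.futures import ThreadPoolExecutor

def load_batch(batch):
    # Stand-in for one independent load task (e.g. one table or partition);
    # here it just sums the batch so the example is runnable.
    return sum(batch)

# Three independent "batches" processed concurrently instead of serially.
batches = [[1, 2, 3], [4, 5], [6]]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(load_batch, batches))
print(results)  # prints [6, 9, 6]
```

The payoff is the same in either tool: wall-clock time approaches that of the slowest batch rather than the sum of all batches, provided the batches share no ordering dependency.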
