Azure Data Engineer/Power BI Developer Resume
Bellevue, WA
SUMMARY
- 11+ years of experience implementing data warehousing and business intelligence solutions using the Confidential Azure stack (Data Factory, Databricks, Data Lake, Azure SQL, Azure Analysis Services), Power BI, SQL Server, SSIS, SSAS (Tabular Model), etc.
TECHNICAL SKILLS
Programming: T-SQL, Spark SQL, PySpark (Beginner), SQL, DAX
Databases: SQL Server, Oracle
BI Tools: ADF, ADLS, ADB, SSIS, SSAS, Power BI, SSDT, SSMS
Cloud: Confidential Azure
Source Control: GitHub, Team Explorer, Perforce
Other Tools: Erwin, MS Visio, Azure DevOps
PROFESSIONAL EXPERIENCE
Confidential - Bellevue, WA
Azure Data Engineer/Power BI Developer
Responsibilities:
- Developed physical data models and data warehouse models, and created DDL scripts to build the database schema and database objects.
- Experienced with Azure transformation projects and Azure architecture decision making; architect and implement ETL and data movement solutions using Azure Data Factory (ADF)
- Attend daily sync-up calls between the onsite and offshore teams to discuss ongoing features/work items, issues, and blockers, and to share ideas for improving the performance, readability, and experience of the data presented to end users
- Follow Agile methodology and deliver feature work items periodically within the agreed timelines
- Create new pipelines or modify existing ones to orchestrate data movement from on-premises sources to Azure Data Lake.
- Create/modify linked services and triggers, and schedule pipeline runs
- Monitor and debug pipeline failures, find the root cause, provide fixes, and re-enable the pipelines (a small orchestration sketch follows this list)
- Create notebooks in ADB to implement complex business logic and transform data per requirements using Spark SQL
- Write PySpark code to read Parquet files from Azure Data Lake into Azure Databricks, and load the transformed data from Azure Databricks into Azure SQL Database for onboarding into Azure Analysis Services (see the PySpark sketch after the Environment line)
- Onboard new tables or modify existing ones in the AAS Tabular cube; create relationships, calculated columns, measures, and perspectives using DAX queries
- Connect to AAS Tabular cubes and design Power BI dashboards that present data to end users
- Analyze and debug issues raised by users (e.g., discrepancies in report data), fix them, and promote the fixes to Dev, UAT, and Prod
- Maintain the UAT and Prod repositories by checking in and checking out all notebook changes, pipeline metadata, and cube solutions in Team Explorer.
- Involved in successfully delivering a complex data warehouse overhaul project, from resource planning through analysis and design, leading a team of 10 to 15 members across the various phases.
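ADF pipelines in this role are authored and monitored in the Data Factory service itself; purely as an illustrative sketch of the same trigger-and-monitor flow (not necessarily how this work was done), a pipeline run can also be started and checked with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and pipeline names below are hypothetical placeholders.

```python
# Illustrative only: trigger an ADF pipeline run and poll its status.
# All resource names and IDs below are hypothetical placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-dataplatform"       # hypothetical
FACTORY_NAME = "adf-dataplatform"        # hypothetical
PIPELINE_NAME = "pl_onprem_to_datalake"  # hypothetical

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run (parameters would mirror the pipeline's own definition).
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
)

# Poll until the run finishes, then inspect the outcome (Succeeded/Failed/...).
status = "InProgress"
while status in ("Queued", "InProgress"):
    time.sleep(30)
    pipeline_run = adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    status = pipeline_run.status

print(f"Run {run.run_id} finished with status {status}: {pipeline_run.message}")
```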
Environment: Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Blob Storage, PySpark, Spark SQL, Azure SQL Server, Azure Analysis Services, Power BI, SQL, T-SQL, DAX, SSMS, SSDT, Team Explorer, Azure DevOps
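As a minimal sketch of the Parquet-to-Azure-SQL flow described in the notebook bullets above: the paths, secret scope, column, and table names are hypothetical, and `spark`/`dbutils` are assumed to be the objects a Databricks notebook provides automatically.

```python
# Minimal Databricks-notebook-style sketch: read Parquet from ADLS, apply a
# transformation, and write the result to Azure SQL Database over JDBC.
# All paths, credentials, columns, and table names are hypothetical placeholders;
# `spark` and `dbutils` are provided inside a Databricks notebook.
from pyspark.sql import functions as F

raw = spark.read.parquet("abfss://raw@datalakeacct.dfs.core.windows.net/sales/orders/")

# Example business transformation: keep active rows and stamp a load date.
curated = (
    raw.filter(F.col("is_active") == 1)
       .withColumn("load_date", F.current_date())
)

# Write to Azure SQL Database; credentials come from a Key Vault-backed secret scope.
jdbc_url = "jdbc:sqlserver://sqlsrv.database.windows.net:1433;database=salesdw"
(
    curated.write.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.FactOrders")
    .option("user", dbutils.secrets.get(scope="kv-scope", key="sql-user"))
    .option("password", dbutils.secrets.get(scope="kv-scope", key="sql-password"))
    .mode("overwrite")
    .save()
)
```

The table written here would then be onboarded as a source for the AAS Tabular model, as described in the bullets above.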
Confidential
Senior SQL/BI Developer
Responsibilities:
- Understand the database schema and the relationships between fact and dimension tables to gain better knowledge of the project and its workflow.
- Analyze requirements and design an implementation strategy based on the user stories received from BSA/stakeholders
- Participate in daily stand-up meetings to update status and discuss any bottlenecks in the project flow
- Create various database objects such as tables, views, and stored procedures based on the requirements
- Create ETL packages for loading data from Excel files, RDBMS tables, and other sources into staging tables by applying different transformations.
- Migrate database objects from Dev to Stage and Stage to Prod, and vice versa
- Provide support to various other projects, such as troubleshooting stored procedures and checking data discrepancies
- Work with Jira to maintain/update the status of all sub-tasks for the user stories, along with defect tracking
- Conduct thorough unit testing and integration testing to check for data consistency before deploying to Stage/Prod
- Use Perforce for code check-in and check-out
Environment: SQL Server 2012 Enterprise Edition, SQL Server Data Tools (SSIS, SSAS), Jira, Perforce, MS-Office, DAX, SSMS, SSDT
Confidential
SQL BI Developer
Responsibilities:
- Analyze business requirements and logical data models that describe the fact and dimension tables and the relationships between these tables.
- Participate in team meetings and update the status of each work item daily.
- Worked on extracting, transforming, and loading (ETL) data from flat files, Excel spreadsheets, MS SQL, and Oracle into the MS SQL Server database using SSIS and the BCP utility.
- Used SSIS and T-SQL stored procedures to transfer data from different source systems to the staging area and finally into the data warehouse.
- Used different transformation components such as Lookup, Conditional Split, Data Conversion, Unpivot, Aggregate, Merge, Merge Join, Union All, Multicast, Sort, and OLE DB Command.
- Used different tasks (Data Flow, Execute SQL, Script, FTP, Send Mail, etc.) and transformations (Data Conversion, Merge, Row Count, Multicast, Sort, Lookup, and Slowly Changing Dimension (SCD)) in developing the SSIS packages.
- Developed XML configuration files for the SSIS packages and created SQL Agent jobs to load them on a weekly basis.
- Extensively worked on creating tables, views, stored procedures, indexes, triggers, and other database objects to enable faster data retrieval
- Implemented breakpoints, set precedence constraints, and used checkpoints for re-running failed packages.
- Developed stored procedures, functions, tables, views, triggers, cursors, user-defined functions, other T-SQL code, and SQL joins for applications handling large amounts of data manipulation.
Environment: SQL Server Enterprise Edition, SQL Server Business Intelligence Development Studio, Confidential Office, TFS, Jira