
Azure/Power BI Data Engineer Resume

Detroit, MI

SUMMARY:

  • Nine years of experience administering and designing complex applications, with emphasis on the design and development of data solutions, business intelligence reporting, ETL development, testing, and documentation.
  • Experience working with SQL Server SSIS packages, including designing and implementing end-to-end ETL packages involving data extraction from various sources such as SQL Server, Excel, and Oracle.
  • Monitor and create alerts for critical KPIs, metrics, and data visualizations for business processes.
  • Experience with SSIS, Power BI Desktop, Power BI Service, the M language, interactions, and DAX.
  • Develop dimensions, hierarchies, measures, and aggregations, and add multiple data sources.
  • Experience includes performance tuning of SQL by implementing views, materialized views, clustered columnstore indexes, partitions, and aggregate tables (see the sketch after this list). Develop dimensional models using the Kimball and Inmon approaches and work on source-to-target data mapping.
  • Experience working with Azure Blob Storage, Azure Data Lake, Azure Data Factory, Azure SQL, Azure SQL Data Warehouse, Azure Analytics, PolyBase, Azure HDInsight, and Azure Databricks.
  • In-depth knowledge of SSIS, Power BI, Informatica, T-SQL, and reporting and analytics.
  • Extensive work developing ETL programs supporting data extraction, transformation, and loading using SSIS and Informatica PowerCenter.
  • Experience creating variables and expressions in SSIS to deploy packages from staging to production environments.
  • Experience solving real-time issues with index fragmentation, DBCC checks, query tuning, and error and event handling.
  • Experience analyzing data from multiple sources and creating reports with interactive dashboards using Power BI.
  • Extensive knowledge of designing reports, scorecards, and dashboards using Power BI.
  • Development-level experience in Microsoft Azure, providing data movement and scheduling functionality for cloud-based technologies such as Azure Blob Storage and Azure SQL Database.
  • Involved in creating pipelines that move, transform, and analyze data from a wide variety of sources using multiple methods, including the Azure PowerShell utility.
  • Worked on backing up and restoring the Azure Data Factory V2.
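
A minimal T-SQL sketch of the columnstore and partitioning approach described in the performance-tuning bullet above; the object names (pf_SalesDate, ps_SalesDate, dbo.SalesFact) are hypothetical placeholders, not objects from any project listed below.

    -- Hypothetical sketch: partition the fact table by month, then add a
    -- clustered columnstore index so large BI aggregations scan compressed,
    -- partition-eliminated data.
    CREATE PARTITION FUNCTION pf_SalesDate (date)
        AS RANGE RIGHT FOR VALUES ('2019-01-01', '2019-02-01', '2019-03-01');

    CREATE PARTITION SCHEME ps_SalesDate
        AS PARTITION pf_SalesDate ALL TO ([PRIMARY]);

    CREATE TABLE dbo.SalesFact
    (
        SaleDate    date  NOT NULL,
        CustomerKey int   NOT NULL,
        ProductKey  int   NOT NULL,
        SalesAmount money NOT NULL
    ) ON ps_SalesDate (SaleDate);

    -- The columnstore index inherits the table's partitioning.
    CREATE CLUSTERED COLUMNSTORE INDEX cci_SalesFact ON dbo.SalesFact;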

WORK EXPERIENCE:

Confidential, Detroit, MI

Azure/Power BI Data Engineer

Responsibilities:

  • Create source-to-target mappings for SSIS package development. Design ETL/SSIS packages to process data from various sources into target databases. Create SQL Server configurations and perform performance tuning of stored procedures and SSIS packages.
  • Develop Azure POC for Marketing Data Research
  • Developed pipelines to move data from Azure Blob Storage and file shares to Azure SQL Data Warehouse and Blob Storage.
  • Part of the Agile team; work in sprints, with daily sprint status meetings, sprint demo preparation, and stakeholder demos and sign-off.
  • Work on SQL scripts, T-SQL stored procedures, triggers, queries, and packages to load data into SQL Server and SQL Data Warehouse.
  • Work on package configuration to set up automated ETL load processing for one-time and incremental data loads.
  • Set up metadata management in a SQL Server database to manage full and incremental loads (see the sketch after this list).
  • Created DAX queries to generate calculated columns in Power BI.
  • Used Power BI and Power Pivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
  • Developed SSIS packages bringing data from diverse sources such as Excel, SQL Server, flat files, and Oracle DB for the daily load to create and maintain a centralized data warehouse.
  • Designed and configured SSIS packages to migrate data from Oracle and legacy systems using various transformations.
  • Published Power BI reports to the required organizations and made Power BI dashboards available in web clients and mobile apps.
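
A minimal T-SQL sketch of the metadata-driven full/incremental load control referenced in the list above; the etl.LoadControl, stg.Orders, and src.Orders names are hypothetical placeholders rather than the actual project objects.

    -- Hypothetical sketch: a control table keeps the last successful watermark
    -- per source; the incremental extract pulls only rows changed since then.
    CREATE TABLE etl.LoadControl
    (
        SourceName    sysname      NOT NULL PRIMARY KEY,
        LoadType      varchar(12)  NOT NULL,   -- 'FULL' or 'INCREMENTAL'
        LastWatermark datetime2(3) NOT NULL
    );

    DECLARE @watermark datetime2(3) =
        (SELECT LastWatermark FROM etl.LoadControl WHERE SourceName = N'Orders');

    -- Stage only the new or changed rows.
    INSERT INTO stg.Orders (OrderID, CustomerID, OrderDate, ModifiedDate)
    SELECT OrderID, CustomerID, OrderDate, ModifiedDate
    FROM   src.Orders
    WHERE  ModifiedDate > @watermark;

    -- Advance the watermark only when something was actually staged.
    UPDATE etl.LoadControl
    SET    LastWatermark = (SELECT MAX(ModifiedDate) FROM stg.Orders)
    WHERE  SourceName = N'Orders'
      AND  EXISTS (SELECT 1 FROM stg.Orders);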

Confidential, Dallas, TX

Azure/BI Developer

Responsibilities:

  • Create source-to-target mappings for SSIS package development. Design ETL/SSIS packages to process data from various sources into target databases. Create SQL Server configurations and perform performance tuning of stored procedures, SSIS packages, and SSRS reports.
  • Work on SQL scripts, T-SQL stored procedures, triggers, queries, and packages to load data into SQL Server, Oracle, and SQL Data Warehouse. Work on package configuration to set up automated ETL load processing for one-time and incremental data loads.
  • Work with the .NET team to establish trigger setup for insert, update, and delete logic based on business requirements (see the sketch after this list). Maintain code in TFS for migration and change control.
  • Work on Python, Bash, and SQL scripts to load data from the data lake to the data warehouse.
  • Create SSIS, Informatica, and Talend mappings/jobs to build one-time, full, and incremental loads. Apply data fixes as agreed with the business.
  • Set up metadata management in a SQL Server database to manage full and incremental loads.
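
A minimal T-SQL sketch of the insert/update/delete trigger logic mentioned above; dbo.Customer and dbo.CustomerAudit are hypothetical tables used only for illustration.

    -- Hypothetical sketch: one AFTER trigger records every insert, update,
    -- and delete against the base table into an audit table.
    CREATE TABLE dbo.CustomerAudit
    (
        AuditID      int IDENTITY(1,1) PRIMARY KEY,
        CustomerID   int          NOT NULL,
        ChangeType   char(1)      NOT NULL,   -- I / U / D
        ChangedAtUtc datetime2(3) NOT NULL DEFAULT SYSUTCDATETIME()
    );
    GO
    CREATE TRIGGER trg_Customer_Audit
    ON dbo.Customer
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Rows only in "inserted" are inserts, only in "deleted" are deletes,
        -- and rows present in both are updates.
        INSERT INTO dbo.CustomerAudit (CustomerID, ChangeType)
        SELECT COALESCE(i.CustomerID, d.CustomerID),
               CASE WHEN i.CustomerID IS NOT NULL AND d.CustomerID IS NOT NULL THEN 'U'
                    WHEN i.CustomerID IS NOT NULL THEN 'I'
                    ELSE 'D'
               END
        FROM inserted i
        FULL OUTER JOIN deleted d ON d.CustomerID = i.CustomerID;
    END;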

Confidential, Houston, TX

SSIS Developer

Responsibilities:

  • Coordinate the execution of UAT, Regression testing and resolution for issues identified.
  • Work on SQL scripts and C# scripts in SSIS to email reports as password-protected Excel files.
  • Create and modify T-SQL stored procedures, queries, and packages to load data into SQL Server, Oracle, and SQL Data Warehouse. Work on package configuration to set up automated ETL load processing for one-time and incremental data loads.
  • Maintain code in TFS for migration and change control. Part of the Agile team in a fast-paced delivery model, consolidating data from various business groups using data federation.
  • Applied transformation rules for data conversion, data cleansing, data aggregation, and data merging, and created new measures and calculated columns for report and dashboard analysis.
  • Create executive, analytical, strategic, and operational dashboards with various KPIs and goals for business users. Use drill-down and detailed reports to show the line-level items behind each trend (see the sketch after this list).
  • Create and schedule SSIS packages to pull data into SQL Server and Oracle from various data sources such as SQL Server, web services, flat files, subsets, and Excel spreadsheets.
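
A minimal T-SQL sketch of the drill-down KPI queries behind the dashboards described above; dbo.SalesFact and its columns are hypothetical placeholders.

    -- Hypothetical sketch: ROLLUP returns region-level, region-plus-month, and
    -- grand-total rows in one pass, which maps naturally to a drill-down KPI.
    SELECT  Region,
            DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1) AS OrderMonth,
            SUM(SalesAmount)        AS TotalSales,
            COUNT(DISTINCT OrderID) AS OrderCount
    FROM    dbo.SalesFact
    GROUP BY ROLLUP
            (Region, DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1));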

Confidential, St Louis, MO

ETL Developer

Responsibilities:

  • Create ETL mappings for the operational dashboard covering various KPIs and business metrics, allowing powerful drill-down into detail reports to understand the data at a granular level.
  • Involved in the complete SDLC, including requirement specifications, analysis, design, development, and testing of BI and data warehouse applications.
  • Involved in preparing Functional Specifications, Technical Specifications, Testing Plans and other documentation as required by SDLC.
  • Developed complex queries and designed SSIS packages to load the data into the warehouse.
  • Create jobs and schedule packages using SQL Server Management Studio for the daily load (see the sketch after this list).
  • Applied conditional formatting in SSRS to highlight key areas in the report data
  • Attend functional testing and user acceptance sessions and work on the feedback provided.
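
A minimal T-SQL sketch of the daily-load job scheduling described above, using the msdb procedures that SQL Server Management Studio drives under the covers; the job, step, database, and procedure names are hypothetical.

    -- Hypothetical sketch: create an Agent job, give it one T-SQL step,
    -- schedule it daily at 02:00, and attach it to the local server.
    EXEC msdb.dbo.sp_add_job
         @job_name = N'DW Daily Load';

    EXEC msdb.dbo.sp_add_jobstep
         @job_name      = N'DW Daily Load',
         @step_name     = N'Run daily load procedure',
         @subsystem     = N'TSQL',
         @database_name = N'DataWarehouse',             -- illustrative DB name
         @command       = N'EXEC etl.usp_RunDailyLoad;';

    EXEC msdb.dbo.sp_add_jobschedule
         @job_name          = N'DW Daily Load',
         @name              = N'Nightly',
         @freq_type         = 4,        -- daily
         @freq_interval     = 1,        -- every day
         @active_start_time = 020000;   -- 02:00:00

    EXEC msdb.dbo.sp_add_jobserver
         @job_name    = N'DW Daily Load',
         @server_name = N'(LOCAL)';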

Confidential, Los Angeles, CA

ETL Developer

Responsibilities:

  • Experienced in creating complex SSIS packages using proper control flow and data flow elements.
  • Create and schedule SSIS packages to pull data from SQL Server and export it to various destinations such as Excel spreadsheets and flat files, and vice versa.
  • Involved in the operational data mart and reporting for sales and service analytics.
  • Develop DML scripts to insert, update, and delete data in MS SQL database tables with SQL Server Management Studio.
  • Implement Data Extraction and Loading from OLTP onto staging with SSIS Packages.
  • Implement transactions in SSIS packages using sequence containers and Execute SQL tasks (see the sketch after this list).
  • Design Charts with lines, Bar Graphs and Columns to generate reports with cumulative totals.
  • Deliver numerous ETL performance enhancements to reduce the ETL run time.
  • Involve in the design, development and testing of the ETL processes using Informatica.
  • Conduct User Session to educate users on the reports and functionality.
  • Design mappings and workflows; write complex SQL and PL/SQL queries for the business requirements.
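
A minimal T-SQL sketch of the transactional DML pattern behind the staging loads above, the same effect the sequence container and Execute SQL task setup achieves in SSIS; dbo.Orders and stg.Orders are hypothetical tables.

    -- Hypothetical sketch: the update, insert, and delete against the target
    -- either all commit together or all roll back.
    BEGIN TRY
        BEGIN TRANSACTION;

        UPDATE t
        SET    t.Status = s.Status
        FROM   dbo.Orders AS t
        JOIN   stg.Orders AS s ON s.OrderID = t.OrderID;

        INSERT INTO dbo.Orders (OrderID, CustomerID, Status)
        SELECT s.OrderID, s.CustomerID, s.Status
        FROM   stg.Orders AS s
        WHERE  NOT EXISTS (SELECT 1 FROM dbo.Orders t WHERE t.OrderID = s.OrderID);

        DELETE t
        FROM   dbo.Orders AS t
        WHERE  NOT EXISTS (SELECT 1 FROM stg.Orders s WHERE s.OrderID = t.OrderID);

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;   -- let the calling package or job see the failure
    END CATCH;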

Confidential, Milwaukee, WI

ETL Consultant

Responsibilities:

  • Interact with business users to gather and analyze source data for enterprise data warehouse schemas.
  • Deliver Data / Reporting solutions in Finance, Marketing, Sales and Loyalty Functional areas.
  • Develop SSIS packages to migrate the data from flat files, Oracle, SQL Server to Target Tables.
  • Involved in gathering ETL/BI requirements, performing gap analysis, documenting the reports and client reporting needs, and creating functional/technical documents.
  • Work on different sources in SSIS (XML, Flat file, Excel, OLEDB Source).
  • Develop packages to copy tables, schemas, and views, and to extract data from Excel and other legacy systems.
  • Implement both object-level and row-level security based on the roles and responsibilities of end users (see the sketch after this list).
  • Create Drill down reports, parameterized reports, cascading parameterized reports and drill through reports.
  • Design ETL/SSIS packages to process data from various sources to target databases.
  • Create MS SQL Server configurations and perform performance tuning of stored procedures and SSIS packages.
  • Perform unit and system testing, troubleshooting and bug fixing in development and QA environments.
  • Create ad hoc reports based on user requirements and work on different on-demand reports.
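
A minimal T-SQL sketch of the row-level security mentioned above, using SQL Server's security-policy feature; sec.fn_RegionPredicate, dbo.UserRegion, and dbo.SalesFact are hypothetical names, and the actual implementation may have relied on report-level roles instead.

    -- Hypothetical sketch: a schema-bound predicate function decides whether
    -- the caller may see a row, and a security policy applies it as a filter.
    CREATE FUNCTION sec.fn_RegionPredicate (@Region nvarchar(50))
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN
        SELECT 1 AS AccessGranted
        FROM   dbo.UserRegion AS ur
        WHERE  ur.Region   = @Region
          AND  ur.UserName = USER_NAME();
    GO
    CREATE SECURITY POLICY sec.RegionFilter
        ADD FILTER PREDICATE sec.fn_RegionPredicate(Region) ON dbo.SalesFact
        WITH (STATE = ON);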

Confidential, Pittsburgh, PA

Informatica Developer

Responsibilities:

  • Design the physical solution, test strategy, and migration schedule from QA to PROD. Involved in all phases of testing, including UAT and regression.
  • Integrated all jobs using complex mappings and workflows built with Informatica PowerCenter Designer and Workflow Manager.
  • Tune Informatica session performance for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
  • Created complex mappings using Unconnected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
  • Created mapplets and used them in different mappings. Used the Sorter transformation and dynamic lookups. Created events and tasks in the workflows using Workflow Manager.
  • Developed Informatica mappings and tuned them for better performance, using PL/SQL procedures and functions to build business rules for loading data.
  • Created schema objects such as indexes, views, and sequences (see the sketch after this list).
  • Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
  • Developed UNIX shell scripts for running and scheduling batch jobs.
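
A minimal T-SQL sketch of the kinds of schema objects listed above (the Oracle DDL used on the project is close in shape); the sequence, view, and index names and the dbo.Orders table are hypothetical.

    -- Hypothetical sketch: a surrogate-key sequence, a filtered view for
    -- reporting, and a covering index for the most common lookup.
    CREATE SEQUENCE dbo.seq_OrderKey
        START WITH 1
        INCREMENT BY 1;
    GO
    CREATE VIEW dbo.vw_OpenOrders
    AS
    SELECT OrderID, CustomerID, OrderDate
    FROM   dbo.Orders
    WHERE  Status = 'OPEN';
    GO
    CREATE NONCLUSTERED INDEX ix_Orders_CustomerID
        ON dbo.Orders (CustomerID)
        INCLUDE (OrderDate, Status);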

Confidential, Las Vegas, NV

SSIS/OBIEE Developer

Responsibilities:

  • Configure the OBIEE repository (RPD), interactive dashboards, security implementation, Analytics metadata objects, and Web Catalog objects.
  • Create metadata repository (.rpd) as per business requirements at three layers in Oracle BI Administration Tool.
  • Interacted regularly with Business Users and gathered Report Specifications
  • Create Logical Tables, Dimensions, Facts and additional Metadata in the Business Layer.
  • Create Users, Groups by establishing a Connection to the existing LDAP Server.
  • Create Reports and Dashboards using Oracle BI Answers as per the Client Requirements
  • Create complex mappings/jobs/packages using Lookup, Aggregate, Join/Merge, and Filter transformations to populate target tables in Informatica and SSIS (see the sketch after this list).
  • Created mapplets and used them in different mappings. Used the Sorter transformation and dynamic lookups. Created events and tasks in the workflows using Workflow Manager.
  • Performed Informatica migrations; tested, debugged, documented, maintained, and monitored programs according to client standards.
  • Designed physical solution, test strategy and migration schedule from QA to PROD.
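
A minimal T-SQL sketch of the lookup, join/merge, filter, and aggregate logic the Informatica/SSIS mappings above implement, expressed as a single query; the staging and dimension table names are hypothetical.

    -- Hypothetical sketch: dimension joins play the role of lookups, the WHERE
    -- clause is the filter, and GROUP BY performs the aggregation.
    SELECT  c.Region,
            p.Category,
            SUM(f.SalesAmount) AS TotalSales
    FROM    stg.SalesFact AS f
    JOIN    dim.Customer  AS c ON c.CustomerKey = f.CustomerKey
    JOIN    dim.Product   AS p ON p.ProductKey  = f.ProductKey
    WHERE   f.OrderDate >= '2015-01-01'
    GROUP BY c.Region, p.Category;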
