
Data Engineer / BI Application Developer Resume


SUMMARY

  • 10+ years of experience in Data Warehousing & Business Intelligence development, including data marts, dimensional modeling, ETL, and report development
  • Master's Degree in Management Information Technology
  • 9 years of experience in financial services firms, with sound analytical, decision-making, and logical-thinking skills
  • 8+ years of experience in SQL programming, including development of triggers, stored procedures, and functions
  • 5+ years of experience with cloud computing platforms such as GCP, AWS, and Azure
  • 4+ years of experience with big data analytics tools such as Hadoop Hive, Spark, Python, and R, including supervised and unsupervised learning
  • Organized with time management skills; able to multitask and prioritize tasks effectively
  • Capable of working independently or as a team member with good team leadership skills
  • Experience in Agile software development methodology
  • Passionate about learning; committed to ongoing professional development

AREAS OF EXPERTISE

  • Business Intelligence
  • Big Data Analytics
  • Consulting
  • Data Analysis
  • Data Modeling
  • Business Analysis
  • Data Warehouse Development
  • Data Integration
  • Project Management
  • Data Management
  • Business Requirements
  • Database Management
  • Data Migration
  • Consumer Insights
  • Payment Solutions

PROFESSIONAL EXPERIENCE

Confidential

Data Engineer / BI Application Developer

Responsibilities:

  • Gathered, analyzed, and translated complex business rules into conceptual, logical, and physical models
  • Designed and implemented both on-premises and cloud-based data analytics solutions on Azure, GCP, and AWS
  • Designed and implemented Azure Data Warehouse, Azure Synapse, Azure Data Lake, and Databricks solutions
  • Designed and implemented a Snowflake data warehouse, including data pipelines, Snowpipe, user access, etc.
  • Automated the provisioning of infrastructure on Azure using infrastructure as code (ARM templates)
  • Designed and implemented Azure Data Factory pipelines to load data from on-premises and cloud-based databases into Azure Data Lake Gen 2 and Azure Synapse / data warehouse
  • Used Azure Migration Assistant to migrate on-premises databases to the Azure platform
  • Extracted, transformed, and created data marts for the BI solution using Azure Data Factory, SSIS, Talend, Matillion, AWS Glue, and Informatica Cloud as ETL tools for Redshift, Snowflake, and Synapse
  • Provisioned and managed Databricks clusters, including Azure Blob Storage mounting, Delta Lake, streaming, Python/PySpark-based ETL, and scheduled job processing
  • Designed the infrastructure requirements for big data solutions such as HDInsight
  • Automated code migration from Dev to UAT and Prod using CI/CD pipelines
  • Managed Tableau Server, including user profiling and security management
  • Created Qlik Sense data extractors (ETL) that convert relational and NoSQL databases into QVDs
  • Designed, developed, and managed Qlik Sense and Tableau apps, including dashboards, reports, and storytelling
  • Installed and managed Qlik Sense Enterprise Server
  • Developed mashups, widgets, and extensions that utilize JavaScript libraries such as jQuery, Angular, and D3
  • Designed and implemented dimensional data models utilizing star, snowflake, and galaxy schemas, including Azure data modeling
  • Used Power Query and DAX expressions in developing Power BI reports
  • Developed complex SQL queries and stored procedures
  • Performed advanced analytics by integrating Qlik Sense with R and Python
  • Migrated entire Oracle databases into the Qlik repository (QVD files) and automated the data flow
  • Designed the security model and implemented section access with data reduction on a hierarchical basis
  • Used Jira and other tools for Agile software development
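As an illustration of the star-schema dimensional modeling described above, here is a minimal sketch using SQLite as a stand-in warehouse; all table and column names are hypothetical, not taken from any actual project:

```python
import sqlite3

# In-memory database standing in for the warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# In a star schema, dimension tables surround a central fact table.
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    amount REAL
);
""")

cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# A typical BI query joins the fact table to its dimensions and aggregates.
cur.execute("""
SELECT d.year, p.product_name, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.product_name
""")
print(cur.fetchone())  # -> (2024, 'Widget', 29.97)
```

The same fact/dimension split applies whether the target is Synapse, Redshift, or Snowflake; only the DDL dialect changes.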

Confidential

Data Engineer / Business Intelligence Specialist

Responsibilities:

  • Planned, designed, implemented, and maintained Azure SQL Data Warehouse, Snowflake, and AWS Redshift
  • Gathered and translated business logic into logical designs
  • Designed complex data models utilizing dimensional and associative models
  • Extracted, transformed, and created data marts for the BI solution using Azure Data Factory, Matillion, AWS Glue, and Informatica Cloud as ETL tools for Redshift, Snowflake, and Azure Synapse
  • Provisioned an Azure data warehouse and migrated data from on-premises databases to the Azure data warehouse
  • Used Azure Data Factory to load data from Azure Data Lake into the data warehouse
  • Migrated on-premises databases to the Azure platform using Azure Migration Assistant and infrastructure as code
  • Used data streaming tools such as Event Hubs
  • Extracted and transformed relational and NoSQL databases into QVDs, including Cosmos DB, Cassandra, and MongoDB
  • Provisioned and used Databricks / HDInsight to process log files
  • Automated code migration from Dev to UAT and Prod using CI/CD pipelines
  • Created data pipelines using Python and PySpark
  • Designed, developed, and managed Power BI, Tableau, QlikView, and Qlik Sense apps, including dashboards, reports, and storytelling
  • Designed the security model utilizing BI solutions
  • Created themes, extensions, and mashups, and embedded Qlik apps into SharePoint and websites
  • Used the GCP Data Studio business intelligence tool to create reports
  • Developed and managed Power BI / Tableau apps, reports, analyses, dashboards, user profiling, etc.
  • Designed, built, and deployed Power BI solutions into sites, including SharePoint sites
  • Created OLAP cubes (SSAS)
  • Used Python, SQL, and SnowSQL to develop standard and ad hoc reports
  • Used Power Query and DAX expressions in developing Power BI reports
  • Performed advanced analytics by integrating Power BI with R and Python
  • Used Jira and other tools for Agile software development
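A stripped-down sketch of the kind of Python data pipeline described above (extract, transform, load into a staging table), using only the standard library in place of PySpark; the file contents and column names are hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in a real pipeline this would come from a source system.
raw = io.StringIO("id,amount\n1,10.5\n2,\n3,7.25\n")

# Extract: read the delimited feed into records.
rows = list(csv.DictReader(raw))

# Transform: drop records with missing amounts and cast types.
clean = [(int(r["id"]), float(r["amount"])) for r in rows if r["amount"]]

# Load: insert into a staging table in the target database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO staging_sales VALUES (?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM staging_sales").fetchone()[0]
print(total)  # -> 17.75
```

In PySpark the same extract/transform/load shape appears as `spark.read`, DataFrame transformations, and `df.write`, but the staging pattern is identical.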

Confidential

Data Engineer / Business Intelligence Consultant

Responsibilities:

  • Researched, designed, and implemented Confidential Capital Market's Qlik Sense business intelligence solution
  • Gathered, analyzed, and translated complex business rules into conceptual, logical, and physical models
  • Provisioned an Azure data warehouse and migrated data from on-premises databases to the Azure data warehouse
  • Designed and implemented cloud-based data warehouse solutions such as Azure Data Warehouse, Redshift, and BigQuery
  • Used Azure Migration Assistant and infrastructure as code to migrate on-premises databases to the Azure platform
  • Used ETL and ELT to develop data flows that feed the data warehouse and data lakes
  • Developed and managed Qlik Sense apps, reports, analyses, dashboards, etc.
  • Migrated entire relational and NoSQL databases into the Qlik repository (QVD files) and automated the data flow
  • Used Python, PySpark, SQL, and Qlik to extract, transform, and create data marts for BI solutions such as Tableau and Power BI
  • Used Azure Data Warehouse, Hive, Presto, Dremio, Snowflake, Azure Blobs, AWS Redshift, and GCP BigQuery to feed BI reporting
  • Used Power Query and DAX expressions in developing Power BI reports
  • Designed complex data models utilizing dimensional and associative models
  • Developed mashups and widgets and embedded Qlik Sense apps into websites
  • Converted legacy SSRS reports into Qlik Sense apps
  • Used Jira and other tools for Agile software development
  • Migrated code from Dev to UAT and Prod using GitHub, Nexus, and Jenkins, and automated the RLM process for CI/CD
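To illustrate the ETL-versus-ELT distinction mentioned above: in an ELT flow, raw data is loaded into the warehouse first and then transformed in place with SQL. A minimal sketch with SQLite standing in for the warehouse; the table names and records are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Load step: raw records land in the warehouse untransformed (note the duplicate).
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [("A-1", "10.00"), ("A-2", "5.50"), ("A-2", "5.50")])

# Transform step runs inside the warehouse: deduplicate and cast to numeric.
conn.executescript("""
CREATE TABLE orders AS
SELECT DISTINCT order_id, CAST(amount AS REAL) AS amount
FROM raw_orders;
""")

count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(count, total)  # -> 2 15.5
```

ETL would instead deduplicate and cast in the pipeline tool before loading; ELT pushes that work to the warehouse engine, which is the common pattern with Snowflake, Synapse, and BigQuery.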

Confidential

Data Engineer / Business Intelligence Specialist

Responsibilities:

  • Served as a Subject Matter Expert who researched, recommended, and managed the implementation of the business intelligence solution
  • Designed, implemented, and supported Dealer FX's AWS-based analytics solution
  • Designed the DFX data warehouse and business intelligence solution end to end
  • Translated business requirements into BI solutions
  • Designed Azure Data Warehouse, Azure Blobs, Redshift, and GCP BigQuery to feed BI reporting
  • Developed data extraction / pipeline jobs using Informatica Cloud, Talend, and SSIS that load data into Redshift, Azure Data Warehouse, and GCP BigQuery
  • Created data pipelines and ELT processes, including Python, that feed both Azure Data Warehouse and AWS Redshift
  • Provisioned Databricks for data scientists and used the cluster to process files
  • Designed both relational and dimensional data models
  • Extracted, transformed, loaded, and performed complex data modeling in Qlik Sense, Power BI, and Tableau
  • Developed and managed Power BI, Tableau, and Qlik Sense apps, reports, storytelling, analyses, dashboards, etc.
  • Used Power Query and DAX expressions in developing Power BI reports
  • Provided advanced data analysis such as regression, forecasting, etc.
  • Developed complex queries to extract data from Azure, SQL Server, Oracle, Redshift, and NoSQL databases
  • Used Jira and other tools for Agile software development

Environment: Qlik Sense, Power BI, SQL Server 2012, SSIS, SSRS, SSAS, C#, BigQuery, Azure, MongoDB, MySQL 5.0, AWS, Redshift, MS Office, Python, PySpark, Hadoop, JavaScript, Spark, Informatica Cloud, Oracle Data Modeler, Amazon S3, Jira, SVN, GitHub
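A minimal sketch of the regression-and-forecasting analysis mentioned above, fitting ordinary least squares in pure Python; the data points are invented for illustration:

```python
# Fit y = a + b*x by the closed-form least-squares formulas.
xs = [1, 2, 3, 4, 5]                  # e.g. month index (hypothetical data)
ys = [10.0, 12.0, 14.0, 16.0, 18.0]   # e.g. monthly sales

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope: covariance of x and y divided by variance of x.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
# Intercept: the fitted line passes through the mean point.
a = mean_y - b * mean_x

# Forecast the next period by extrapolating the fitted line.
forecast = a + b * 6
print(b, forecast)  # -> 2.0 20.0
```

In practice this would typically be done with R or a Python library (as the bullets note, via BI-tool integration), but the underlying model is the same.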

Confidential

Business Intelligence Analyst/Architect

Responsibilities:

  • Supported strategic decision making through the collection, mining, analysis, interpretation and communication of information to key decision-makers
  • Designed, developed, and managed the NIBSS Enterprise Data Warehouse
  • Gathered business requirements and translated them into technical specifications
  • Designed and implemented ETL processes that fetch data from relational and NoSQL databases and store it as QVD files
  • Designed and developed reports, dashboards, and scorecards using Power BI, SSRS, Excel, Access, Qlik Sense, and SAP Business Intelligence tools such as Universe, Web Intelligence, and Crystal Reports
  • Used Power Query and DAX expressions in developing Power BI reports
  • Designed and implemented relational, dimensional, and associative data models
  • Managed all data collection, extraction, transformation, and load (ETL) activities using Microsoft SSIS, Talend, and Informatica, including data profiling, data cleansing, data conversion, and quality control
  • Created and enforced policies for data management
  • Developed self-service business intelligence using Power BI, SAP WebI, and SSAS OLAP
  • Administered Microsoft SQL Server 2012, MySQL 5.6, PostgreSQL, and Oracle 11g
  • Developed and documented data requirements and design specifications in the form of data models, data mappings, and quality metrics
  • Used Jira and other tools for Agile Software development

Environment: Business Objects 4.0, SQL Server 2012, SSIS, SSRS, SSAS, Azure, MySQL 5.0, Oracle 11g, MongoDB, PostgreSQL, Crystal Reports 2013, MS Office, R, Python, Visio, JavaScript, Talend, SysAid, Cloudera, Hortonworks, Informatica, CentOS Linux, Unix, C#, Erwin, Oracle Data Modeler, Tableau, Power BI, Qlik Sense
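As an illustration of the data-profiling and quality-control step described above, a small standard-library sketch that computes per-column null and distinct counts, the two most basic profiling metrics; the sample records and column names are hypothetical:

```python
from collections import defaultdict

# Hypothetical records as they might arrive from a source extract.
records = [
    {"customer_id": "C1", "country": "NG"},
    {"customer_id": "C2", "country": None},
    {"customer_id": "C2", "country": "NG"},
]

# Accumulate a null count and a set of distinct values per column.
profile = defaultdict(lambda: {"nulls": 0, "distinct": set()})
for rec in records:
    for col, val in rec.items():
        if val is None:
            profile[col]["nulls"] += 1
        else:
            profile[col]["distinct"].add(val)

summary = {col: {"nulls": p["nulls"], "distinct": len(p["distinct"])}
           for col, p in profile.items()}
print(summary)
# -> {'customer_id': {'nulls': 0, 'distinct': 2}, 'country': {'nulls': 1, 'distinct': 1}}
```

The same metrics drive cleansing decisions (which columns need default values, which are candidate keys) before the conversion and load steps.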
