
Data Analyst Resume



Results-oriented Analyst with 9+ years of experience performing data analysis and developing cost-effective solutions, including 4+ years of project management experience using Agile Scrum and Kanban (Certified SAFe Agilist). Proficient in data modeling and requirements analysis. Enthusiastic about data and focused on defining KPIs, processes, and end goals. Seeking opportunities to drive digital innovation through efficient use of data and sound practices. Experience across the insurance, hospitality, and technology industries. Expertise in product management.


  • Creating stored procedures, triggers, UDFs, and SQL joins; performance tuning and query optimization using T-SQL
  • Exploratory data analysis using Excel functions (HLOOKUP, VLOOKUP) and pivot tables
  • BI and big data platform experience using SQL and Apache Hive
  • Good working knowledge of MSBI architecture and its components
  • Building CI/CD pipelines, migrating code to Azure Databricks and data to ADF (Azure Data Factory)
  • Creating KPI dashboards and scorecards using Power BI, Tableau, and Python's Matplotlib package
  • Data modeling and regression analysis using Python's Scikit-learn library
  • Text processing with NLP implemented using the NLTK package; topic modeling using LDA
  • Data cleansing and data exploration using the Pandas and NumPy packages in Python
  • Facilitating sprint planning, backlog management, bug triage, and feature prioritization using JIRA and Azure DevOps
  • Requirements gathering, stakeholder management, and resource allocation
  • Risk management through qualitative and quantitative analysis and risk mitigation
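A minimal sketch of the kind of Pandas data cleansing described above, using hypothetical customer data (the column names and values here are illustrative, not from any real engagement):

```python
import pandas as pd
import numpy as np

# Hypothetical raw data with typical cleansing targets:
# missing names, inconsistent casing, duplicates, missing numerics.
raw = pd.DataFrame({
    "customer": ["Alice", "alice", "Bob", None],
    "spend": [120.0, 120.0, np.nan, 75.0],
})

cleaned = (
    raw.dropna(subset=["customer"])                      # drop rows with no customer name
       .assign(customer=lambda d: d["customer"].str.title())  # normalize casing
       .drop_duplicates(subset=["customer"])             # keep first occurrence per customer
       .fillna({"spend": 0.0})                           # impute missing spend
       .reset_index(drop=True)
)
# cleaned now holds two rows: Alice (120.0) and Bob (0.0)
```

The chained style keeps each cleansing rule on its own line, which makes the pipeline easy to review step by step.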


Languages: Python, SQL, Apache Hive (HiveQL), R (introductory)

Frameworks: Scikit-learn, Matplotlib, Pandas, NumPy, NLTK, Shiny

Data modeling tools: SPSS, Excel Solver, RapidMiner, Apache Sqoop, Tableau, SAS Enterprise Miner, Power BI

Project management tools: JIRA, Azure DevOps, MS Project, HP Quality Center

Methodologies: Scrum, Kanban, SAFe framework, Waterfall SDLC


Data Analyst

Confidential, Bellevue


  • Created analytical insights and dashboards using Power BI, enabling a 10% increase in enterprise account conversions for a support product over one quarter
  • Forecast six weeks of a product's unprecedented COVID-driven usage growth with 85% accuracy using machine learning models
  • Forecast the product's usage with time-series methods such as BSTS, ARIMA, and the Facebook Prophet package
  • Migrated data using T-SQL, implemented the related machine learning models, and moved them to the Azure cloud, reducing server costs by 16%
  • During the migration, moved the data into ADF and converted the Python code to PySpark to ensure compatibility with ADB (Azure Databricks)
  • Organized and demonstrated business features, presented actionable insights based on internal employee feedback, and helped improve the employee satisfaction score (ESAT) by 8%
  • Established working relationships across multi-disciplinary teams and multiple partners in different time zones
  • Created Power BI dashboards visualizing the increase in a product's consumption during the COVID crisis, with charts broken down by country, category, and segment for daily, weekly, and monthly users
  • Performed exploratory analysis of the product consumption data using advanced Excel functions such as HLOOKUP and VLOOKUP to identify trends and correlations among variables
  • Analyzed the feedback data using NLTK and applied text summarization to derive issue themes from the feedback
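The actual forecasting above used BSTS, ARIMA, and Prophet; as a deliberately simple, self-contained stand-in, single exponential smoothing on hypothetical weekly usage counts illustrates the same idea of projecting the next value from a trend:

```python
def exponential_smoothing(series, alpha=0.5):
    """Single exponential smoothing: each level blends the newest
    observation (weight alpha) with the previous level (1 - alpha).
    The final level serves as a one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Hypothetical weekly usage counts during a growth spike.
usage = [100, 120, 150, 190, 240]
forecast = exponential_smoothing(usage, alpha=0.6)
```

A real ARIMA or Prophet model would additionally capture trend and seasonality terms; this sketch only shows the level component.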

Environment: Jupyter Notebook, R-Studio, MS SQL Server, Microsoft Azure, Power BI, MS Excel

Technologies Used: Python, NumPy, Pandas, Matplotlib, Scikit-learn, NLTK, DAX Expressions, SQL, Shiny

Strategic Alliance & Data Analyst



  • Created dashboards using Google Apps Script, Google Docs, and Tableau to track channel partner performance against pipeline and revenue targets
  • Determined the likelihood of an opportunity being closed/won and built an accurate pipeline to project year-end revenue
  • Created, monitored, and updated documentation on the SACM site so the SACM team and other internal groups could easily find information on current partners, bookings and revenue goals, governance, procedures, and processes

Environment: Google Scripting, Google Analytics, Tableau

Technologies Used: Google Scripts, HTML, Python, CSS, Ajax

Senior Data Analyst



  • Created a web analytics application for tracking website customer visits, increasing tracking efficiency by 15%
  • Coordinated with business analysts to understand the migration requirements and to design the new database around current business needs
  • Created a data dictionary for the 42 tables to be migrated and used a hybrid schema in ETL to design and migrate the data from these DB2 tables to SQL Server; to build the dictionary, the tables were first mapped in Excel, using VLOOKUP to identify entity relationships
  • Translated business ideas into requirements and design documents for 7+ large projects (more than 1,000 person-days) and 20+ small projects
  • Created Hive queries to analyze the EDW data and compare it with the integrated data used to market the new insurance product, helping increase overall revenue by 5%
  • Presented the Hive result sets in a dashboard created using Tableau
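The queries above were HiveQL against the EDW; as a self-contained stand-in, the same style of grouped aggregation can be sketched in Python with an in-memory SQLite table (the table and values are hypothetical):

```python
import sqlite3

# In-memory SQLite stands in for the Hive warehouse; the GROUP BY
# mirrors the kind of EDW revenue comparison described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE edw_sales (product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO edw_sales VALUES (?, ?)",
    [("auto", 120.0), ("auto", 80.0), ("home", 50.0)],
)
rows = conn.execute(
    "SELECT product, SUM(revenue) FROM edw_sales "
    "GROUP BY product ORDER BY product"
).fetchall()
# rows -> [('auto', 200.0), ('home', 50.0)]
```

The equivalent HiveQL would be the same SELECT/GROUP BY shape, run against partitioned warehouse tables instead of a local database.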

Environment: Jupyter Notebook, Hadoop, SQL Server, DB2 development studio, Tableau

Technologies Used: Python, Hive QL, DB2 SQL, SSIS

Lead Data Analyst



  • Developed and ran MySQL database queries using Python's MySQL Connector package
  • Used Apache Sqoop to import 10 million customer records from MySQL to HDFS, reducing database costs by 15%
  • Collaborated with analysts to integrate the customer data with third-party systems, and tracked and fixed bugs using HP Quality Center
  • Created MapReduce applications to identify various demographic metrics in the customer data
  • Built Kanban boards in JIRA and wrote JQL queries to automate user story filtering
  • Applied scope analysis, change control, risk control, and issue and defect control procedures across the project's applications
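The MapReduce work above ran on Hadoop; the map/reduce pattern itself can be sketched in plain Python with hypothetical customer records, where the map step emits (age-band, 1) pairs and the reduce step sums them:

```python
from collections import Counter
from itertools import chain

# Hypothetical customer records; ages only, for a demographic count.
customers = [{"age": 23}, {"age": 35}, {"age": 41}, {"age": 29}]

def mapper(record):
    """Map step: emit an (age_band, 1) pair for each record."""
    band = f"{record['age'] // 10 * 10}s"   # 23 -> "20s", 41 -> "40s"
    yield (band, 1)

# Shuffle/reduce step: sum the counts per band.
pairs = chain.from_iterable(mapper(c) for c in customers)
counts = Counter()
for band, n in pairs:
    counts[band] += n
# counts -> {'20s': 2, '30s': 1, '40s': 1}
```

On a real cluster the mapper and reducer run as separate distributed tasks; the generator-plus-Counter version just makes the data flow visible.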

Environment: Hadoop, SQL Server, HP Quality Center

Technologies Used: Hive QL, SQL, JQL, COBOL

Data Analyst



  • Applied master data management techniques to scale to 16 million customer records for an insurance client and automated 80% of manual business processes
  • Created entity models using Backbase for anomaly identification and data cleansing, improving data quality by 20%
  • Interacted with DBAs daily to understand referential integrity constraints and the batch processes that refresh tables overnight, and to ensure replication ran at the same frequency after migration
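The anomaly identification above was done through Backbase entity models; as a deliberately simple illustrative stand-in, a deviation-threshold check on hypothetical premium amounts captures the basic idea of flagging outliers before cleansing:

```python
from statistics import mean, stdev

# Hypothetical premium amounts; one value is far from the rest.
premiums = [100.0, 102.0, 98.0, 101.0, 250.0]

mu, sigma = mean(premiums), stdev(premiums)
# Flag values more than 1.5 standard deviations from the mean.
# (A loose threshold: with tiny samples the outlier inflates sigma,
# so a strict 3-sigma rule would miss it.)
anomalies = [x for x in premiums if abs(x - mu) > 1.5 * sigma]
# anomalies -> [250.0]
```

Production MDM tooling would apply rule-based and cross-field checks rather than a single statistical threshold; this only shows the flag-then-cleanse shape.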

Environment: Backbase, mainframe, DB2 Development Studio, HP Quality Center

Technologies Used: Backbase, DB2 SQL, COBOL, JCL, PL1, IMS.




  • Designed, analyzed, and developed DB2 stored procedures and triggers used to issue payments via cheque or BACS processing, enabling 90% of claims to be processed online
  • Helped the business team solve problems and created a value-delivery framework and cost-effective solutions aligned to business goals, leading to a 5% increase in the total value of insurance products; received Star of the Month for this work
  • Developed, tested, and deployed programs to production for the RDR project on a tight six-month schedule to comply with UK legislation

Environment: Backbase, mainframe, DB2 Development Studio, HP Quality Center

Technologies Used: Backbase, DB2 SQL, COBOL, JCL, PL1.
