
Sr Data Analyst Resume

Eden Prairie, MN


  • 6 years of extensive experience in understanding business and process requirements and translating them into technical requirements, with strong experience in Business and Data Analysis, Data Migration, Data Profiling, Data Quality, Data Conversion, Data Integration, Configuration Management, and Metadata Management services.
  • Ability to interview business stakeholders to thoroughly understand business needs in order to create Functional Requirements Specifications (FRS), Business Requirements Documents (BRDs), wireframes and GUI (UX) designs, business and systems process flow diagrams, and documented business rules.
  • Good exposure to Informatica MDM, performing data cleansing, de-duping, and address correction.
  • Experience with Business Intelligence strategy, Enterprise Data Warehouse concepts and techniques.
  • Experience in creating testing strategies/ plans/scenarios for User Acceptance Testing (UAT).
  • Skilled with Data Analysis concepts of Data Modeling, Data Dictionary, and Normalization; created Data Mapping documents to transfer data from legacy to target systems and to reduce data redundancy and duplication.
  • Interacted with product owners and stakeholders to gather requirements and prepared documentation accordingly.
  • Skilled in Advanced Regression Modeling, Correlation, Multivariate Analysis, Model Building, Business Intelligence tools and application of Statistical Concepts.
  • Primarily used the open-source tools Spyder (Python) and RStudio (R) for statistical analysis and building machine learning models. Involved in defining source-to-target data mappings, business rules, and data definitions.
  • Developed predictive models using Decision Trees, Naive Bayes, Logistic Regression, and Random Forest.
  • Experienced in UML modeling and Business Process Model and Notation (BPMN), including Activity Diagrams, Sequence Diagrams, Use Case Diagrams, and process workflow diagrams; created UI/UX wireframes, mockup screens, and prototypes in MS Visio.
  • Skilled in performing data parsing, data manipulation, data architecture, data ingestion, and data preparation with methods including describing data contents, computing descriptive statistics, regex, split and combine, merge, remap, subset, reindex, melt, and reshape.
  • Worked with the AWS Cloud platform and its features, including EC2, VPC, RDS, EBS, S3, CloudWatch, CloudTrail, CloudFormation, and Auto Scaling.
  • Used AWS command line client and management console to interact with AWS resources and APIs.
  • Hands-on experience with MS Visio, creating data flow/process flow diagrams such as flowcharts, activity charts, sequence diagrams, and use case diagrams per UML.
  • Key strengths include analysis using BI tools, COSMOS with Scope scripts, reporting using Power BI, ETL using SSIS, requirements gathering, RDBMS concepts, SQL queries, and preparation of design documents.
  • Performed User Acceptance Testing (UAT) and manual functional testing of UI and web applications. Extensively worked on creating test procedures, test plans, and test cases and reviewing them for quality assurance.
  • Involved in System Integration Testing (SIT), Regression Testing, GUI Testing, Performance Testing & User Acceptance Testing (UAT).
  • Used Pandas, NumPy, Seaborn, SciPy, Matplotlib, Scikit-learn, and NLTK in Python to develop machine learning models, including linear regression and multivariate regression.
  • Used Amazon IAM to grant fine-grained access to AWS resources and managed user roles and permissions within the AWS account.
  • Performed data profiling using SQL and Informatica to identify patterns in source data, improve data quality, and help the business better understand converted data in order to formulate accurate business rules.
  • Gathered, analyzed, and translated system requirements into technical specifications using UML and the RUP methodology. Responsible for creating sessions and workflows to load data into the Data Warehouse using Confidential Workflow Manager.
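The data-preparation steps listed above (describe, de-duplicate, merge/remap, melt) can be sketched in pandas; the table and column names below are purely illustrative assumptions, not from any actual engagement.

```python
import pandas as pd

# Hypothetical legacy extract and a reference table (names are illustrative only)
legacy = pd.DataFrame({"member_id": [1, 2, 2, 3],
                       "plan": ["A", "B", "B", "A"]})
rates = pd.DataFrame({"plan": ["A", "B"], "rate": [100.0, 150.0]})

# Describe data contents / compute descriptive statistics
stats = legacy.describe(include="all")

# De-duplicate, then merge to remap plan codes to rates
deduped = legacy.drop_duplicates()
merged = deduped.merge(rates, on="plan", how="left")

# Melt (wide -> long) for downstream reshaping
long_form = merged.melt(id_vars="member_id", var_name="field", value_name="value")
```

The `how="left"` merge keeps every de-duplicated legacy row even when a plan code has no matching rate, which makes unmapped codes easy to spot during profiling.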


Databases: MySQL, Oracle, Amazon Redshift, Teradata

Languages: SQL, XML, Python, Confluent Kafka

Business Skills: GAP Analysis, Impact Analysis, Stakeholder Analysis, SWOT Analysis.

Testing Tools: Postman, Amazon API gateway.

Management Tools: JIRA, Rally, Confluence

BI Tools: Tableau, AWS QuickSight, Grafana.

Data Warehousing: Amazon S3, OLTP, OLAP, Slicing/Dicing, Amazon Redshift.

Migration & Transfer: MFT, AWS Transfer Family.

Modeling Tools: Mock-up Screens, MS Visio

SDLC Methodologies: Waterfall, Agile, Scrum.


Confidential, Eden Prairie, MN

Sr Data Analyst


  • Working with Network Measures on Provider Network Management by collecting, transforming, cleaning, and modeling data with the goal of discovering the required information.
  • Developed Sequence Diagrams, Collaboration Diagrams, and Use Case Diagrams using UML and Rational Rose.
  • Gathered information from the business teams regarding their requirements for the systems enhancement and produced Business Requirement Documentation (BRD) based on their feedback.
  • Identified, analyzed, and interpreted trends and patterns in large data sets.
  • Used SQL tools to access data and interfaced with the project manager and data scientists to perform data analysis.
  • Created Report Mockups for SSRS based data and Sample Data Mapping documents for ETL process.
  • Generated reports per ad hoc requests from the client through data extraction, data mapping, transformation, and loading using Informatica.
  • Worked on data cleansing using the cleanse functions in Informatica MDM.
  • Installed Confluent Kafka, applied security to it and monitored it with Confluent control center.
  • Created Kafka Source Connectors and Sink Connectors to push data from Kafka topics to Cassandra Keyspaces.
  • Analyzed technology processes through requirements and performance modeling in UML and prepared the essential groundwork.
  • Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.
  • Reviewed status reports with the Project Manager and the involved System Analysis teams on a weekly basis.
  • Developed automated data pipelines from various external data sources (web pages, APIs, etc.) to the internal data warehouse (SQL Server, AWS), then exported to reporting tools.
  • Worked with the AWS Cloud platform and its features, including EC2, VPC, RDS, EBS, S3, CloudWatch, CloudTrail, CloudFormation, and Auto Scaling.
  • Prepared UAT test scenarios based on use cases and low-level and high-level design documents.
  • Prepared daily score cards, presentations and task list to support Project Management Office (PMO).
  • Used internal SharePoint sites as knowledge-base repositories and for document sharing.
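As one illustration of the Kafka-to-Cassandra sink work above: a Kafka Connect sink is typically registered by POSTing a JSON definition to the Connect REST API (`POST /connectors`). The connector class, topic, keyspace, table, and host names below are assumptions for illustration only, not the actual configuration used.

```json
{
  "name": "cassandra-sink-example",
  "config": {
    "connector.class": "com.datastax.oss.kafka.sink.CassandraSinkConnector",
    "tasks.max": "1",
    "topics": "provider-events",
    "contactPoints": "cassandra-host",
    "loadBalancing.localDc": "datacenter1",
    "topic.provider-events.demo_ks.events.mapping": "id=key, payload=value"
  }
}
```

The per-topic `mapping` entry tells the sink which Kafka record fields land in which Cassandra columns; each topic-to-table route gets its own mapping key.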

Environment: SQL, MFT, AWS, Confluent Kafka, Grafana, SSRS, ETL, BRD, Python, MS SQL, Visual Studio, Excel, SharePoint

Confidential, Atlanta, GA

Data Analyst


  • Involved in validating the web services related to Account, Customer, and Transaction Management using SoapUI.
  • Involved with the entire lifecycle of projects from gathering requirements to application development and maintenance.
  • Involved in executing test cases and test scenarios for Functional, Smoke, and System Testing.
  • Conducted JAD sessions to obtain domain-level information, interviewing stakeholders, asking detailed questions, and carefully recording the requirements in a format that could be reviewed and understood by both business and technical teams.
  • Used Adobe Dynamic Tag Manager to implement third-party tags (DoubleClick, Floodlight etc.), data layers and selected site functionalities.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Built and published customized interactive reports and dashboards, with report scheduling, using Tableau Server.
  • Used knowledge of databases to design, create, modify, and support databases and stored procedures, and performed query analysis of data using SQL Server Management Studio (SSMS), SSIS, and MS Access.
  • Collected, organized, and analyzed data into business and financial reports for clients to use to track and improve their expense/financial decisions and business practices.
  • Created databases and programs using MS Access and Excel to generate charts, graphs, and other visualizations that reflect identifiable trends and data analysis.
  • Contributed to the creation of business requirements and programming specifications, by anticipating business related and data analysis issues, while recommending technical solutions.
  • Provided leadership on business intelligence initiatives by uniting the efforts of all team members and translating these efforts into data and reports that met the client's goals and needs.
  • Worked independently on reporting projects from inception to completion including data ETL, manipulation, and report creation
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Developed normalized Logical and Physical database models for designing an OLTP application.
  • Performed GAP analysis to identify system restrictions, potential enhancements, and decommissioning activities.
  • Developed and auto-deployed content using AWS (Amazon Web Services), Git/Bitbucket, Maven, and Jenkins.
  • Performed SQL testing on Amazon Redshift databases.
  • Performed statistical data analysis and data visualization using Python and R.
  • Involved in Detecting Patterns with Unsupervised Learning like K-Means Clustering.
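A minimal sketch of the K-Means pattern detection mentioned above, using scikit-learn on toy data; the points and cluster count are illustrative assumptions only.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy 2-D points forming two obvious groups (illustrative data only)
X = np.array([[0, 0], [0, 1], [10, 10], [10, 11]])

# Fit K-Means with k=2; random_state makes the run reproducible
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_
```

In practice, k is usually chosen by inspecting inertia (`km.inertia_`) across candidate values rather than fixed in advance.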

Environment: Agile-Scrum, P&C, SDLC, PMS, MS Office, UML, Test Plans, UAT, QA, Defect Tracking System, AWS, Teradata, SQL, SSIS


Data Analyst


  • Provided broad enterprise business system deployment analysis and support of Oracle Database.
  • Followed a structured approach to organize requirements into logical groupings of essential business processes, business rules, and information needs, ensuring that no critical requirements were missed.
  • Extensively used Visual Basic to create macros that streamlined data migration testing and compared results from the legacy databases to the new CRM database.
  • Followed UML-based methods using Microsoft Visio to create Use Case Diagrams, Activity Diagrams, State Chart Diagrams, Sequence Diagrams, and Collaboration Diagrams.
  • Prepared Logical Data Models containing Entity Relationship Diagrams, Data Flow Diagrams, and supporting documents describing the relationships between the data elements, to analyze and document the business data requirements.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Prepared system Test Plan and scenarios for Test Scripts. Responsible for providing analytical support in the Design, Development and Implementation of Project.
  • Developed schemas for extraction, transformation, and loading (ETL) using Solonde Warehouse Workbench to expedite data integration between systems.
  • Created physical and logical models and used Erwin for Dimensional Data Modelling.
  • Created ER (Entity Relationship) Diagrams, function relationship diagrams, data flow diagrams and enforced all referential integrity constraints using ER Studio.
  • Loaded staging tables on Teradata and further loaded target tables on SQL Server via DTS transformation packages.
  • Involved in the data collection and treatment phase of analytics using R, Python, and Jupyter Notebook: analyzed existing internal and external data, worked on entry errors and classification errors, and defined criteria for missing values.
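The data-treatment phase above (entry errors, missing-value criteria) can be sketched in pandas; the columns, sentinel values, and error rules below are hypothetical examples, not the project's actual criteria.

```python
import numpy as np
import pandas as pd

# Hypothetical raw extract with entry errors and missing values
raw = pd.DataFrame({"age": [34, -1, 27, None],
                    "state": ["MN", "GA", "??", "MN"]})

clean = raw.copy()
# Defined criteria: a negative age is an entry error, treat it as missing
clean.loc[clean["age"] < 0, "age"] = np.nan
# Defined criteria: a non-alphabetic state code is an entry error
clean.loc[~clean["state"].str.isalpha(), "state"] = np.nan

# Missing-value counts per column feed the treatment decisions
missing_report = clean.isna().sum()
```

Flagging errors as missing (rather than dropping rows immediately) keeps the record count stable while imputation or exclusion criteria are decided.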

Environment: RUP, MS Office 2010, MS Visio, Rational Rose, Rational Clear Case, Oracle 9i, JIRA, TOAD, UML, MS Excel, Snagit, SharePoint 2007, MS Project, Windows XP.
