
Data Analyst Resume


Greensboro, NC

SUMMARY

  • 6+ years of experience in Data Analysis, Data Profiling, Data Integration, Data Migration, Data Governance, Metadata Management, Master Data Management, and Configuration Management.
  • Expertise in data modeling, database design and development, data mining, and segmentation techniques.
  • Strong experience in Data Analysis, Data Profiling, Data Cleansing and Quality, Data Migration, and Data Integration.
  • Experienced data modeler with strong Conceptual, Logical, and Physical Data Modeling skills, along with experience in data profiling, maintaining data quality, creating data mapping documents, and writing functional specifications and queries.
  • Expert in backend testing using SQL to verify data integrity and perform data validation for client-server and web-based applications.
  • Experience in creating reporting views involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses, and outer joins, as per functional needs (a sketch of such a view follows this list).
  • Good knowledge of Teradata, Business Objects, Crystal Reports, ad hoc reporting, and MS Excel.
  • Working experience in agile delivery environments and all phases of Software Development Life Cycle (SDLC).
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
  • Knowledge of statistics and experience using statistical techniques for analyzing datasets.
  • Experience with data visualization tools such as ggplot2, Matplotlib, and Seaborn, and data libraries such as NumPy and pandas.
  • Experience in R (data exploration, manipulation, summarization, visualization, and analysis).
  • Built various statistical models to solve complex business problems.
  • Knowledge of statistical techniques such as descriptive modeling, linear and non-linear models, classification, data reduction techniques, and predictive analysis.
  • Excellent understanding of machine learning algorithms such as decision trees, Naive Bayes, SVM, random forests, clustering, PCA, KNN, ANN, and CNN.
  • Built a fuzzy matching algorithm using k-nearest neighbors to identify non-exact duplicates (a k-nearest-neighbors sketch also follows this list).
  • Applied variable reduction techniques to remove insignificant variables before building models.
  • Committed and self-confident, with a positive attitude and the ability to learn new technologies.
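
A minimal sketch of the kind of reporting view described above, written in Python with SQLite so it runs self-contained; the orders/customers schema and the regional_sales view name are illustrative assumptions, not from an actual engagement:

```python
import sqlite3

# Hypothetical schema; table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL, order_date TEXT);
CREATE TABLE customers (customer_id INTEGER, region TEXT);
INSERT INTO customers VALUES (1, 'East'), (2, 'West');
INSERT INTO orders VALUES (10, 1, 25.0, '2020-01-01'), (11, 1, 75.0, '2020-02-01');

-- Reporting view combining a WITH clause, a sub-query, and an outer join.
CREATE VIEW regional_sales AS
WITH totals AS (
    SELECT customer_id, SUM(amount) AS total_amount
    FROM orders
    GROUP BY customer_id
)
SELECT c.region,
       COUNT(DISTINCT c.customer_id)    AS customers,
       COALESCE(SUM(t.total_amount), 0) AS revenue
FROM customers c
LEFT OUTER JOIN totals t ON t.customer_id = c.customer_id
GROUP BY c.region;
""")
print(conn.execute("SELECT * FROM regional_sales").fetchall())
```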
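And a minimal sketch of the KNN-based fuzzy duplicate detection mentioned above, assuming scikit-learn; the sample names and the 0.3 cosine-distance cutoff are illustrative assumptions:

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

# Illustrative records; real matching would run over full entity profiles.
names = pd.Series(["John A. Smith", "Jon Smith", "Jane Doe", "J. Doe", "Alice Brown"])

# Character n-grams tolerate typos and abbreviations better than whole words.
vectors = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 3)).fit_transform(names)

# For each record, find its single nearest neighbor other than itself.
nn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(vectors)
distances, indices = nn.kneighbors(vectors)

for i, (dist, j) in enumerate(zip(distances[:, 1], indices[:, 1])):
    if dist < 0.3:  # small cosine distance -> likely non-exact duplicate
        print(f"possible duplicate: {names[i]!r} ~ {names[j]!r} (distance {dist:.2f})")
```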

TECHNICAL SKILLS

Database: Teradata, Oracle, MS SQL Server, MS Access, DB2

Programming: SQL, Python & R.

Business Intelligence: Cognos, Tableau.

Data Science: SPSS, R & R-Studio.

MS-Office: Excel, PowerPoint, Word

Statistical Analysis: Descriptive Modeling, Exploratory Data Analysis, Correlation, Data Quality, Sampling Distributions, Hypothesis Testing, ANOVA, MANOVA, Factorial and Block Designs, Regression Models, PCA, Dimensionality Reduction Techniques, Forecasting and ARIMA Models, Generalized Linear Models, Survival Analysis, Optimization Techniques

Machine Learning: Naive Bayes, Decision Trees, KNN, ANN, CNN, SVM, Random Forest, PCA, K-Means, K-Medoids, K-Prototypes, DBSCAN, Markov Models

PROFESSIONAL EXPERIENCE

Confidential, Greensboro, NC

Data Analyst

Responsibilities:

  • Developed and implemented databases, data collection systems, data analytics, and other strategies that optimized statistical efficiency and quality.
  • Utilized SQL to extract data from statewide databases for analysis.
  • Acquired data from primary and secondary data sources and maintained databases/data systems.
  • Performed Data Analysis using visualization tools such as Tableau and SharePoint to provide insights into the data.
  • Configured Azure platform offerings for web applications and business intelligence, using Power BI and Azure Data Factory.
  • Managed and completed ad-hoc requests for data reports.
  • Performed basic data exploration and checks to identify missing values and outliers.
  • Applied different types of data manipulation functions to clean the data.
  • Worked extensively on Claims operational controls, Policy Admin, Financial Statement Closing, and Treasury Payments, providing good insights into the raw data.
  • Worked extensively on summarizing data to support decision making.
  • Grouped the data by respondent, life insurance type, and sales and offer type.
  • Worked on classification models to segment policies by insurance type.
  • Applied cluster analysis for grouping the data and detecting outliers (see the sketch after this list).
  • Implemented hierarchy reports in Tableau.
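
A minimal sketch of the clustering-based grouping and outlier detection described above, assuming scikit-learn; the synthetic features, 3-cluster count, and 95th-percentile cutoff are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for policy features such as premium, claim count, tenure.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
X_scaled = StandardScaler().fit_transform(X)

# Group records into clusters (segments).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_scaled)

# Distance of each record to its assigned cluster center; records far
# from every center are flagged as potential outliers.
dist = np.linalg.norm(X_scaled - kmeans.cluster_centers_[kmeans.labels_], axis=1)
outliers = np.where(dist > np.quantile(dist, 0.95))[0]
print(f"{len(outliers)} potential outliers out of {len(X)} records")
```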

Environment: Teradata, Tableau, Power BI, Oracle, MS Office Tools, R, Windows.

Confidential, Fort Worth, TX

Data Analyst

Responsibilities:

  • Participated in all phases including Analysis, Design, Coding, Testing and Documentation. Gathered Requirements and performed Business Analysis.
  • Developed Entity-Relationship diagrams and modeled Transactional Databases and Data Warehouses using ER/Studio and PowerDesigner.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and SQL Server.
  • Provided inputs to the development team in performing extraction, transformation, and load for data marts and data warehouses.
  • Documented the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
  • Comfortable manipulating and analyzing complex, high-volume, high-dimensionality data from varying data sources.
  • Designed high level ETL architecture for overall data transfer from the OLTP to OLAP with the help of SSIS.
  • Facilitated meetings with the business and technical team to gather necessary analytical data requirements.
  • Involved in user support and setting up the user environment on development and production Teradata databases.
  • Used the openpyxl module in Python to format Excel files.
  • Wrote Python scripts to pull data from a Redshift database, manipulate the data as per requirements using the necessary conditional functions, and store it in data frames (see the sketch after this list).
  • Deployed naming standards to the Data Model and followed company standards for project documentation.
  • Involved in building the ETL architecture and Source to Target mapping to load the data into Data Warehouse in SSIS.
  • Involved in mapping spreadsheets that provided the Data Warehouse Development (ETL) team with source-to-target data mappings, inclusive of logical names, physical names, data types, domain definitions, and corporate metadata definitions.
  • Used Erwin's Model Mart for effective model management, sharing, dividing, and reusing model information and designs for productivity improvement.
  • Designed the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Business Units in tables.
  • Conducted several Physical Data Model training sessions with the ETL Developers. Worked with them on day-to-day basis to resolve any questions on Physical Model.
  • Wrote complex SQL queries in Oracle/SQL Server to test data in the data warehouse and to provide expected results to users.
  • Performed gap analysis to identify the gap between the optimized allocation and integration of the inputs and the current level of allocation.
  • Performed web analytics and reporting via Google Analytics.
  • Involved in reverse engineering the existing data model to understand the data flows and business flows.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
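
A minimal sketch of the Redshift-to-DataFrame-to-Excel workflow described above, assuming psycopg2, pandas, and openpyxl; the connection details, the sales_summary table, and the revenue threshold are placeholders, not real values:

```python
import pandas as pd
import psycopg2
from openpyxl import Workbook
from openpyxl.styles import Font

# Placeholder connection details; never hard-code real credentials.
conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                        port=5439, dbname="analytics",
                        user="analyst", password="***")

# Pull data into a DataFrame and apply a conditional transformation.
df = pd.read_sql("SELECT region, revenue FROM sales_summary", conn)
df["tier"] = df["revenue"].apply(lambda r: "high" if r > 100000 else "standard")

# Write the result to Excel with a bold header row via openpyxl.
wb = Workbook()
ws = wb.active
ws.append(list(df.columns))
for cell in ws[1]:
    cell.font = Font(bold=True)
for row in df.itertuples(index=False):
    ws.append(list(row))
wb.save("sales_summary.xlsx")
```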

Environment: MS SQL Server 2014/2012/2008 R2, SSRS, Power BI, SSIS, SSDT, Oracle 10g, Tableau, MS Excel, Windows XP, SSMS, Google Analytics, BigQuery, Python, TFS, Agile, Waterfall.

Confidential, Salt Lake City, UT

Data Analyst

Responsibilities:

  • Designed and developed tools, techniques, metrics, and dashboards for insights and data visualization.
  • Drove an understanding of, and adherence to, the principles of data quality management, including metadata, lineage, and business definitions.
  • Built and executed tools to monitor and report on data quality.
  • Performed source-to-target data analysis and data mapping.
  • Created SQL queries to validate data transformations and ETL loading (see the sketch after this list).
  • Defined the list codes and code conversions between the source systems and the data mart.
  • Developed logical and physical data models that capture current state/future state data elements and data flows using Erwin.
  • Involved in data mapping and data cleanup.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Coded PL/SQL packages to perform Application Security and batch job scheduling.
  • Created a reconciliation report for validating migrated data.
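
A minimal sketch of the load-validation and reconciliation checks described above, written in Python with SQLite so it runs self-contained; the src_orders/dw_orders tables and the row-count/amount rule are illustrative assumptions:

```python
import sqlite3

# Stand-in source and target tables; one row is deliberately missing
# from the target to show a reconciliation mismatch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")

# Compare row counts and amount totals between source and target.
src_rows, tgt_rows, src_amt, tgt_amt = conn.execute("""
    SELECT (SELECT COUNT(*)    FROM src_orders),
           (SELECT COUNT(*)    FROM dw_orders),
           (SELECT SUM(amount) FROM src_orders),
           (SELECT SUM(amount) FROM dw_orders)
""").fetchone()
status = "OK" if (src_rows, src_amt) == (tgt_rows, tgt_amt) else "MISMATCH"
print(f"rows {src_rows}/{tgt_rows}, amount {src_amt}/{tgt_amt}: {status}")
```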

Environment: UNIX, Shell Scripting, XML Files, XSD, XML, ERWIN, Oracle, Teradata, Toad.
