
Data Analyst Resume

Peoria, IL

EXPERIENCE SUMMARY:

  • Over 8 years of experience in Software Engineering, with 6+ years of experience in Data Analysis across sectors including Healthcare, Supply Chain Management, Insurance, and Manufacturing (machinery engines).
  • Experienced in working with senior-level managers, business users, and developers across multiple disciplines.
  • Manipulated, cleansed, and processed data using Excel, Access, and SQL.
  • Created data insights in MS Excel by converting data into charts.
  • Skilled at using statistics (mean, median, mode) in MS Excel for better data understanding.
  • Developed logic for multiple embedded SQL queries and generated scenarios per user requirements.
  • Used MS Access for creating databases and for running and testing SQL queries.
  • Imported data from MS Excel to MS Access, validating the structure of database tables.
  • Created relationships between multiple tables and wrote queries to access data from each table and load it into a single table.
  • Experience in Data Acquisition, Data Validation, Predictive modeling, Data Visualization.
  • Adept in statistical programming languages like R and Python, as well as Big Data technologies like Hadoop and Hive.
  • Experienced in Data Cleaning, Model Building, Model Testing, and Model Deployment.
  • Knowledge of the SDLC, including Development, Component Integration, Performance Testing, Deployment, and Support & Maintenance.
  • Worked on the Hadoop platform using Hive and Pig; Hive queries were converted into MapReduce jobs by the Hive framework.
  • Experienced in SQL programming and creation of relational database models. Experienced in creating cutting-edge data processing algorithms to meet project demands.
  • Involved in writing complex structured queries using views, triggers, and joins. Worked with packages like Matplotlib, Seaborn, and pandas in Python.
  • Connected Python with Hadoop Hive and Spark and performed data analytics. Worked on large datasets of structured, unstructured, and semi-structured data.
  • Experienced in Linear Regression, Logistic Regression, Random Forest, Decision Trees, Naïve Bayes, K-Means.
  • Worked with current techniques and approaches in Natural Language Processing. Solid understanding of statistical analysis and modeling, algorithms, and multivariate analysis; familiar with model selection, testing, comparison, and validation.
  • Created different charts in MS Excel, such as pie, bar, and line charts, for better data insight.
  • Uploaded multiple data files into the staging server and tested Data validation on live websites.
  • Evaluated model performance using RMSE score, confusion matrix, ROC, cross-validation, and A/B testing, in both simulated environments and the real world.
  • Experience in improving model accuracy using Boosting and Bagging techniques.
  • Experience in design, development, maintenance and support of Big Data Analytics using Hadoop Ecosystem components like HDFS, MapReduce, HBase, Hive, Impala and Pig.
  • Experience in writing MapReduce programs in Java for data cleansing and preprocessing.
  • Excellent understanding/knowledge in installation, configuration, supporting and managing Hadoop clusters using Amazon Web Services (AWS).
  • Performed Exploratory Data Analysis and visualized data using R, Python, and Mahout.
  • Applied clustering algorithms to segment clients using social media data.
  • Extensive knowledge and work experience in developing Android applications.
  • Participated in daily agile meetings and weekly and monthly staff meetings, and collaborated with various teams to develop and support ongoing analyses.
  • Conducted data accuracy analysis and supported stakeholders in decision-making.
  • Analyzed data using advanced Excel features such as Pivot Tables, charts, and graphs.
  • Sound RDBMS concepts; extensively worked with Oracle 8i/9i/10g/11g, DB2, SQL Server 8.0/9.0/10.0/10.5/11.0, MySQL, and MS Access.
  • Developed interactive dashboards and created various ad hoc reports for users in Tableau by connecting to various data sources.
  • Created many dynamic PHP-based web applications that generated real-time data on the fly.
  • Good with core PHP, CodeIgniter (an MVC-based framework), and WordPress (a Content Management System).
  • Performed client-side JavaScript validation of form data on submission.
  • Built front-end content and styling using HTML and CSS, designing one-column and two-column layout templates.
  • Good understanding of WAMP server, LAMP server, and MS Access.
  • Strong SQL and PL/SQL knowledge. Developed complex queries for data-testing purposes in SQL Developer and SQL Server Management Studio.
  • Used joins and sub-queries to build complex queries involving multiple tables from different databases.
  • Created scenarios, scheduled jobs, and performed troubleshooting; resolved upgrade issues to ensure daily loads run efficiently.
  • Good understanding of developing analytical and operational reports using different analytics views and charts.
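Several bullets above name standard evaluation metrics (RMSE, confusion matrix, accuracy). As a minimal illustration on hypothetical labels, they can be computed by hand; real projects would normally use scikit-learn's metrics module instead:

```python
# Hand-rolled versions of two metrics named above, for illustration only;
# the label vectors below are hypothetical sample data.

def confusion_matrix(y_true, y_pred):
    """Return (tp, fp, fn, tn) counts for binary labels 0/1."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, fp, fn, tn

def rmse(y_true, y_pred):
    """Root-mean-square error for regression predictions."""
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

# Hypothetical predictions from a binary classifier:
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
tp, fp, fn, tn = confusion_matrix(y_true, y_pred)
accuracy = (tp + tn) / len(y_true)   # 6 of 8 correct on this sample
```

In practice `sklearn.metrics.confusion_matrix` and `mean_squared_error` replace these helpers; the hand-written forms only make the definitions explicit.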

TECHNICAL SKILLS:

Languages: R, SQL, Python, S-PLUS, Java (Android)

IDE: R Studio, Jupyter, Eclipse, NetBeans

Databases: Oracle 11g, SQL Server, MS Access, MySQL.

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Impala, Sqoop, Flume, Spark MLLib, Mahout, ETL.

Operating Systems: Windows XP/7/8/10, Ubuntu, Unix.

Web Technologies: HTML, CSS, PHP, JavaScript, AJAX

Data Analytics Tools: R console, Python (numpy, pandas, scikit-learn, scipy), SPSS, Weka, TIBCO Spotfire S+.

BI and Visualization: Tableau, SSAS, SSRS, VBA Excel Macros.

PROFESSIONAL EXPERIENCE:

Data Analyst

Confidential, Peoria, IL

Responsibilities:

  • Worked with business analysts to identify and understand requirements, analyzed user reviews, and discussed and clarified views on the data analysis.
  • Defined and stated the necessary facts and data dimensions required to support the whole project.
  • Created draft data models for understanding and to help the data modeler.
  • Manipulated, cleaned, and processed data using Excel, Access, and SQL.
  • Performed data validations using SQL queries by extracting data and running queries on a WAMP server.
  • Responsible for loading, extracting and validation of client data.
  • Designed the dimensional model for warehouse data.
  • Implemented the data model for development.
  • Performed unit testing and implemented best practices in DataStage.
  • Worked with clients to understand their data migration needs and determine any data gaps.
  • Designed project plans and teams to deliver on-time, on-budget, and on-scope results within risk tolerances without undue project management overhead.
  • Applied different methods for plotting and data visualization using Matplotlib and Seaborn.
  • Used statistical methods for analysis: computing means and variances, performing t-tests, etc.
  • Created data mapping documents mapping Logical Data Elements to Physical Data Elements and Source Data Elements to Destination Data Elements. Systems Documentation, change control/defect analysis and updates, Implementation testing.
  • Gathered data and documented it for further use; designed the database using the Erwin data modeler.
  • Experienced in logical and Physical Database design & development, Normalization and Data modeling using Erwin.
  • Used ref cursors and collections with bulk bind and bulk collect to access complex data resulting from joins of a large number of tables when extracting data from the data warehouse.
  • Fine-tuned SQL queries and PL/SQL blocks for maximum efficiency and fast response using Oracle hints and explain plans. Migrated MS Access to SQL Server 2012.
  • Requirements gathering, analysis, Use Cases, data mapping, and workflow diagramming.
  • Data quality analysis and execution of the Data Quality Management (DQM) package.
  • Wrote SQL queries using analytical functions.
  • Documented Business Requirements, Functional Specifications, and User stories.
  • Created UML based diagrams such as Activity diagrams using MS Visio.
  • Performed data extrapolation and validation of reports for analysis and audits.
  • Created T-SQL statements (SELECT, INSERT, UPDATE, DELETE) and stored procedures.
  • Used sensors to monitor stages of production; sensor data analytics warned the maintenance team about faulty patterns on the production floor, which is very helpful in the machine-manufacturing domain.
  • Used data analytics in sales to foresee and avoid warranty or recall issues, potentially saving significant amounts of money in the machine-manufacturing domain.
  • Analyzed supervised and unsupervised data, developed predictive models, statistical modeling by applying machine learning algorithms.
  • Performed Data Analysis using Pandas, NumPy, Seaborn, SciPy, Matplotlib, Scikit-learn and NLTK and developed various machine learning algorithms such as linear regression, multivariate regression, naïve Bayes, K-means, KNN and random forest.
  • Re-built the existing model and increased its accuracy from 68% to 75% using advanced ML algorithms.
  • Managed the entire data science project life cycle and was actively involved in all phases, including data acquisition, data cleaning, data engineering, feature scaling, feature engineering, statistical modeling, dimensionality reduction using Principal Component Analysis and Factor Analysis, and testing and validation using ROC plots, K-fold cross-validation, and data visualization.
  • Performed data parsing, manipulation, and preparation, including describing data contents, computing descriptive statistics, regex, split and combine, remap, merge, subset, reindex, melt, and reshape.
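The preparation steps listed above (describe, merge, melt/reshape) map directly onto pandas operations. A small sketch on a hypothetical dataset; the table and column names are illustrative, not from an actual project:

```python
# Illustrative pandas data-preparation steps: describe, merge, melt.
import pandas as pd

# Hypothetical wide-format sales data and a store lookup table.
sales = pd.DataFrame({
    "store": ["A", "A", "B", "B"],
    "q1": [100, 120, 90, 110],
    "q2": [130, 125, 95, 105],
})
regions = pd.DataFrame({"store": ["A", "B"], "region": ["East", "West"]})

# Descriptive statistics of the numeric columns.
stats = sales[["q1", "q2"]].describe()

# Merge in the lookup table, then melt wide quarters into long form.
merged = sales.merge(regions, on="store")
long = merged.melt(id_vars=["store", "region"],
                   value_vars=["q1", "q2"],
                   var_name="quarter", value_name="revenue")
```

The melted frame has one row per (store, quarter) pair, which is the shape most plotting and modeling tools expect.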

Data Analyst

Confidential

Responsibilities:

  • Participated in user meetings, gathered business requirements and specifications for the data provided, and generated insights in the form of graphs using Matplotlib.
  • Defined and documented the technical architecture of the Data Warehouse, including the physical components and their functionality.
  • Created star-schema dimensional models for the data mart using Visio, and created dimension and fact tables based on business requirements.
  • Analyzed and gathered user requirements and created necessary documentation for data migration stakeholders throughout the lifecycle of multiple projects to ensure adherence to project schedules and budgets.
  • Collaborated with data analysis, information technology, and operations team members to address software and file-run errors and efficiency requests; worked closely with support groups and analysts to assist them with their research and problem recovery.
  • Implemented the complete life cycle of Information Analyzer.
  • Verified information with customers and directed them to their next steps.
  • Worked alongside clients to develop strategies for migrating their business data across platforms using Microsoft SQL Server.
  • Created a data dictionary for the model.
  • Maintained the database and data model for any further requirements and changes.
  • Redesigned and extended the data model for further enhancements.
  • Involved in writing code to extract, clean, and validate data from tables.
  • Estimated schedules for data modeling activities and completed them on time, adhering to predetermined specifications and quality standards.
  • Worked on data mapping, data cleansing, program development for loads, and verification of converted data against legacy data.
  • Skilled at implementing project management processes.
  • Developed SQL Server views for generating metadata reports.
  • Built and published customized interactive reports and dashboards, with report scheduling, using Tableau Server.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Possess expertise with relational database concepts, stored procedures, functions, triggers and scalability analysis. Optimized the SQL and PL/SQL queries by using different tuning techniques like using hints, parallel processing and optimization rules.
  • Performed data analysis and data profiling using complex SQL queries on various source systems, including Oracle 10g/11g and SQL Server 2012.
  • Identified inconsistencies in data collected from different sources.
  • Worked with business owners/stakeholders to assess Risk impact, provided solution to business owners.
  • Experienced in determining trends and significant data relationships using advanced statistical methods.
  • Carried out specified data processing and statistical techniques, such as sampling, estimation, hypothesis testing, time series, correlation, and regression analysis, using R.
  • Applied various data mining techniques: linear regression, logistic regression, classification, and clustering.
  • Created risk scores based on lab testing, biometric data, claims data, and patient-generated health data, enabling predictive modeling to proactively identify the patients at highest risk of poor health outcomes who would benefit most from intervention; this is one solution believed to improve risk management for providers transitioning to value-based payment.
  • Managed the supply chain, which plays a vital role in the health industry, helping healthcare organizations trim unnecessary spending and improve efficiency.
  • Predictive tools are important to the healthcare industry and in high demand among hospital executives looking to reduce variation and gain more actionable insights into ordering patterns and supply utilization. Used analytics tools to monitor the supply chain and make proactive, data-driven decisions about spending, potentially saving the organization almost $10 million per year; descriptive and predictive analytics supported decisions to negotiate pricing, reduce variation in supplies, and optimize the ordering process.
  • Took personal responsibility for meeting deadlines and delivering high quality work.
  • Strived to continually improve existing methodologies, processes, and deliverable templates.
  • Actively sought out opportunities to learn new skills and tools in the fast-paced, changing world of analytics.
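The correlation analysis mentioned above was carried out in R; as a language-neutral illustration of the same computation, Pearson's r can be written out by hand (the samples below are hypothetical):

```python
# Pearson correlation coefficient, written out from its definition;
# the sample values below are hypothetical, for illustration only.

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length numeric samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Perfectly linearly related samples correlate at r = 1.0.
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```

In R this is simply `cor(x, y)`; in Python, `scipy.stats.pearsonr` also returns a p-value for the accompanying hypothesis test.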

Data Analyst

Confidential

Responsibilities:

  • Processed large Excel source data feeds for Global Function Allocations and loaded the CSV files into MS Access.
  • Performed code Inspection and moved the code into Production Release.
  • Documented all the Relative activities in Quality Centre and coordinated with QA team.
  • Performed data filtering and dissemination activities, troubleshot database activities, diagnosed bugs, and logged them in the version control tool.
  • Project management, requirements analysis, software development lifecycle, full lifecycle documentation, and database work.
  • Performed source data analysis and captured metadata, reviewed results with business. Corrected data anomalies as per business recommendation.
  • Involved in migration of mappings from IDQ to PowerCenter.
  • Created ad hoc reports for users in Tableau by connecting various data sources.
  • Created source-to-target mappings for multiple sources from SQL Server to Oracle.
  • Used Excel sheets, flat files, and CSV files to generate Tableau ad hoc reports.
  • Performed the batch processing of data, designed the SQL scripts, control files, batch file for data loading.
  • Performed the Data Accuracy, Data Analysis, Data Quality checks before and after loading the data.
  • Coordinated with the Business Analyst Team for requirement gathering and Allocation Process Methodology, designed the filters for processing the Data.
  • Designed and developed database objects (tables, materialized views, stored procedures, indexes) and SQL statements for executing the Allocation Methodology and creating the CSV and text files for the business.
  • Developed SQL Joins, SQL queries, tuned SQL, views, test tables, scripts in development environment.
  • Used SQL*Loader to load data from external systems and developed PL/SQL programs to dump the data from staging tables into base tables.
  • Developed data migration and data validation scripts from the old system to the new system.
  • Extensively wrote SQL queries (sub-queries, correlated sub-queries, and join conditions) for data accuracy, data analysis, and data extraction needs.
  • Created mapping documents using metadata extracted from the metadata repositories.
  • Resolved data type inconsistencies between the source systems and the target system using the mapping documents.
  • Worked on implementing big data ecosystem components like HDFS, Sqoop, Spark, Spark Core, Spark SQL, Flume, and Kafka.
  • Handled importing data from data sources, transformed it with Hive and Pig, and loaded data from HDFS. Implemented MapReduce jobs in Hive by querying the existing data.
  • Explored sample data in Tableau to understand the shape of the data. Worked with Apache Spark using Scala.
  • Implemented solutions to enable reporting from Cassandra data. Worked with the Spark ecosystem, using Spark SQL and Scala queries on different formats such as text files and CSV files. Automated pulling data from databases and loading it into SQL Server using shell scripts.
  • Worked on MongoDB and HBase (NoSQL) databases alongside relational databases. Used Sqoop to import and export data between other databases and HDFS.
  • Worked on AWS EC2, configuring servers for scaling and elastic load balancing. Experience in SQL, PL/SQL, and database concepts, and performed database operations in PL/SQL.
  • Strong experience using Python with different libraries like Pandas, NumPy, and Matplotlib.
  • Worked on multiple SDLC stages, including Development, Component Integration, Performance Testing, Deployment, and Support & Maintenance.
  • Used predictive modeling in the insurance domain for customer development, such as upselling/cross-selling with individually determined purchase probabilities.
  • Used churn prediction/prevention in the insurance domain: from customer and transaction data as well as other information, determined which customers are likely to cancel contracts soon. Algorithms evaluated customer sentiment and detected changes in mood over time, allowing conclusions about customer satisfaction and the likelihood of churn.
  • Used market basket analysis, a traditional data analysis tool in the retail domain that retailers have been profiting from for years.
  • Used warranty analytics, a data analytics tool used by the retail industry for warranty-claims monitoring, detection of fraudulent activity, reducing costs, and increasing quality.
  • Implemented price optimization in the retail domain using data analysis, benefiting both retailers and customers.
  • Designed a star schema with dimensional modeling; created fact tables and dimension tables.
  • Involved in data analysis and data discrepancy reduction in the source and target schemas.
  • Implemented one-to-many and many-to-many entity relationships in the data modeling of the data warehouse.
Environment: Oracle 11g, Tableau, Data Warehouse, OLAP, SQL Navigator, Visual Studio 2010, SQL Developer, Erwin 4.0.
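The market basket analysis referenced above reduces to support and confidence counts over transactions. A toy sketch with hypothetical baskets; real work would use a library such as mlxtend or Spark MLlib's FP-growth:

```python
# Support and confidence for item pairs over hypothetical transactions.
from itertools import combinations
from collections import Counter

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]

pair_counts = Counter()   # how many baskets contain each item pair
item_counts = Counter()   # how many baskets contain each single item
for basket in transactions:
    for item in basket:
        item_counts[item] += 1
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

n = len(transactions)
# Support of {bread, milk}: fraction of baskets containing both items.
support = pair_counts[("bread", "milk")] / n
# Confidence of the rule bread -> milk: P(milk | bread).
confidence = pair_counts[("bread", "milk")] / item_counts["bread"]
```

A rule is typically kept only when both its support and confidence clear chosen thresholds, which is exactly what the library implementations automate at scale.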

Data Analyst

Confidential

Responsibilities:

  • Created the XML control files to upload the data into the data warehousing system.
  • The business design work involved establishing the reporting layouts for various reports and the frequency of report generation.
  • Identified the information needs within and across functional areas of the organization.
  • Field mapping work involved establishing the relationships between database tables, filter criteria, formulas, etc., needed for the reports.
  • Involved in full Software Development Lifecycle (SDLC) .
  • Responsible for developing, implementing, and testing data migration strategy for overall project in database using SQL 2012 as platform with global resources.
  • Developed database objects, including tables, indexes, views, sequences, packages, triggers, and procedures, to troubleshoot database problems.
  • Developed Informatica mappings and tuned mappings for better performance.
  • Extracted data from different flat files, MS Excel, and MS Access, transformed the data based on user requirements using Informatica PowerCenter, and loaded data into targets by scheduling sessions.
  • Created different source definitions to extract data from flat files and relational tables for Data mart .
  • Used dynamic SQL to perform pre- and post-session tasks required while performing extraction, transformation, and loading.
  • Tuned the performance of queries by working intensively over indexes.
  • Created reusable mapplets and transformations, started concurrent batch processes on the server, and performed backup, recovery, and tuning of sessions.
  • Created, modified, deployed, optimized, and maintained Business Objects Universes using Designer.
  • Created complex mappings to populate the data in the target with the required information.
  • Wrote SQL scripts and PL/SQL scripts to extract data from the database and for testing purposes.
  • Performed testing and QA role: Developed Test Plan, Test Scenarios and wrote SQL plus Test Scripts for execution on converted data to ensure correct ETL data transformations and controls.
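The flat-file extract/transform/validate work described above was done with Informatica PowerCenter and SQL. A minimal standard-library sketch of the same pattern; the file layout and field names are illustrative, not from the actual project:

```python
# Toy extract-transform-validate pass over an inline CSV "flat file":
# rows with missing amounts are rejected, the rest are typed and
# normalized, and a reconciliation check confirms nothing was dropped.
import csv
import io

raw = io.StringIO(
    "id,amount,currency\n"
    "1, 100 ,usd\n"
    "2,250,USD\n"
    "3,,usd\n"       # missing amount: should land in the reject file
)

rows, rejects = [], []
for rec in csv.DictReader(raw):
    amount = rec["amount"].strip()
    if not amount:                      # basic data-quality check
        rejects.append(rec)
        continue
    rows.append({"id": int(rec["id"]),
                 "amount": float(amount),
                 "currency": rec["currency"].strip().upper()})

# Reconciliation: source count must equal loaded plus rejected.
assert len(rows) + len(rejects) == 3
```

The reconciliation assertion mirrors the row-count checks an ETL tool performs between its source, target, and reject outputs.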

Confidential

PHP Developer

Responsibilities:

  • Configured WordPress on a WAMP server and converted HTML and CSS into WordPress form.
  • Experience in creating CRUD functionality from scratch, implementing view, insert, update, and delete functionality in real time.
  • Created complex SQL queries for development of login functionality and registration forms.
  • Worked on an existing application built on CodeIgniter, an MVC-based framework.
  • Used WordPress to build applications from scratch and used WordPress plugins to speed up the development process.
  • Created an e-commerce website in WordPress and used the WooCommerce plugin to implement e-commerce functionality.
  • Experienced in JavaScript, implementing form validation functionality to validate user data.
  • Used jQuery to implement animation functionality in web-based applications.
  • Used AJAX to fetch real-time data from the database without reloading the whole website.
