
Data Analyst Resume


Mountain View, CA

SUMMARY:

  • 6 years of experience in the analysis, design, development, testing, customization, bug fixing, enhancement, support, and implementation of various web and enterprise applications using Python and C across multiple domains.
  • Experienced with the full software development life cycle (SDLC), architecting scalable platforms, object-oriented programming (OOP), database design, and agile methodologies.
  • Experience in developing web-based applications using Python 2.7/3.5.
  • Good experience in software development in Python (libraries used: Beautiful Soup, NumPy, SciPy, matplotlib, python-twitter, pandas, NetworkX, urllib2, and MySQLdb for database connectivity) and IDEs such as Sublime Text, PyCharm, and Jupyter Notebook.
  • Extensive experience in system analysis, design, development, and implementation of web-based applications using HTML, AngularJS, Bootstrap, CSS, JavaScript, XML, and Python.
  • Experienced in MVC frameworks such as AngularJS, along with JavaScript and jQuery.
  • Experienced in web application development using AngularJS and jQuery, with HTML/CSS/JavaScript for server-side rendered applications.
  • Good experience in Linux Bash scripting and following PEP 8 guidelines in Python.
  • Hands-on design and implementation of AI and machine learning algorithms using Python and R.
  • Good experience in extracting and analyzing very large volumes of data, covering a wide range of information from user profiles to transaction history, using machine learning tools.
  • Excellent understanding of both traditional statistical modeling and machine learning techniques and algorithms such as regression, clustering, ensembles (random forest, gradient boosting), and deep learning (neural networks).
  • Very good hands-on experience working with large datasets and deep learning algorithms using Apache Spark and TensorFlow.
  • Performed exploratory data analysis, data visualization, and feature selection using Python and Apache Spark.
  • Highly organized and detail oriented, with a strong ability to coordinate and track multiple deliverables, tasks and dependencies.
  • Good working experience with NoSQL databases such as Cassandra and MongoDB.
  • Good experience in information extraction and NLP algorithms coupled with deep learning.
  • Proficient in writing SQL queries, stored procedures, functions, tables, views, and triggers on databases such as Oracle, DB2, and MySQL.
  • Strong analytical skills; an excellent team player with good leadership qualities and strong oral and written communication skills.
  • Strong communication, collaboration & team building skills with proficiency in grasping new technical concepts quickly.
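As a quick illustration of the clustering techniques listed above, here is a minimal k-means sketch in Python; the data and parameters are hypothetical, not from any project described in this resume:

```python
# Minimal k-means clustering sketch on synthetic 2-D data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two well-separated groups of points.
blob_a = rng.normal(loc=0.0, scale=0.5, size=(50, 2))
blob_b = rng.normal(loc=5.0, scale=0.5, size=(50, 2))
X = np.vstack([blob_a, blob_b])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_            # cluster assignment for each point
centers = km.cluster_centers_  # learned centroids, one per cluster
```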

TECHNICAL SKILLS:

Languages: Python, Perl, C, R, SQL, Spark, Java, HTML, NoSQL.

Web Technologies: HTML, CSS, JavaScript, XML.

Databases: SQLite3, MySQL, MongoDB.

Tools: SAS, Azure ML, AWS ML, MATLAB, Bioconductor, R Markdown, Tableau, Flask.

Python Libraries: Beautiful Soup, NumPy, SciPy, matplotlib, pandas, urllib2, scikit-learn.

Scripting Languages: Python, Perl, Shell scripting, R Shiny.

Environment: HDFS, Pig, Hive, MapReduce, HBase, Eclipse, Docker.

Operating Systems: Windows, Mac, Linux/Unix.

SDLC Methods: Scrum, Agile.

Version Control: SVN, Git, GitHub, Bitbucket.

Bug Tracking tools: JIRA, Buganizer.

PROFESSIONAL EXPERIENCE:

Data Analyst

Confidential, Mountain View, CA

  • Performed data and risk analysis using log and simulation data to identify patterns/trends in thousands of scenarios.
  • Classified, reproduced, prioritized, and simulated software bugs to track autonomous vehicle software fixes and regressions from code changes.
  • Analyzed and managed large data sets and scenarios using the company's proprietary tools along with SQL and Python scripts.
  • Identified, reported, fixed, and tracked bugs, and generated weekly bug-fix status reports in the Buganizer bug tracking system.
  • Executed test cases and standardized the process of reporting test results to the development team.
  • Defined the prioritization of module testing.
  • Reproduced and debugged potential bugs and risks.
  • Actively involved in the analysis, development, and unit testing of the data.
  • Led and mentored new team members on requirements, testing strategies, and product specifications.
  • Researched and evaluated automation testing procedures and tools.
  • Wrote complex SQL queries to validate and verify data for back-end testing.
  • Formulated best-practice documentation for automation testing and Git branching.
  • Conducted audits and analyses of website traffic using Google Analytics at each step of the user journey on the website and mobile apps, demonstrating where leads come from and optimizing the performance of each platform.
  • Leveraged data from different sources to deliver insights to the internal team through a dynamic Shiny dashboard built in R.
  • Deployed R Shiny applications on servers so that they could be shared securely within the organization.
  • Worked closely with product managers to release to production in a timely manner.
  • Shared product knowledge with coworkers across multiple departments to ensure quality releases and high-end product development.
  • Used Python for exploratory data analysis, A/B testing, ANOVA, and hypothesis testing to compare and identify the effectiveness of new collision metrics and test sets.
  • Identified risk level and eligibility of new scenarios with Machine Learning algorithms.
  • Managed the internal A/B test platform and worked with product managers to define and implement all A/B tests.
  • Conducted research to define new statistical approaches and created customized statistical analyses extending the usual realm of A/B test methodologies.
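The A/B and hypothesis testing workflow described above can be sketched with a Welch two-sample t-test in Python; the metric samples, effect size, and alpha below are synthetic and purely illustrative:

```python
# Welch two-sample t-test comparing a control and a variant metric.
import numpy as np
from scipy import stats

def ab_test(control, variant, alpha=0.05):
    """Return the p-value and whether the difference is significant at alpha."""
    t_stat, p_value = stats.ttest_ind(control, variant, equal_var=False)
    return p_value, bool(p_value < alpha)

rng = np.random.default_rng(42)
control = rng.normal(loc=10.0, scale=2.0, size=500)  # baseline metric samples
variant = rng.normal(loc=10.5, scale=2.0, size=500)  # candidate metric samples
p_value, significant = ab_test(control, variant)
```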

Data Programmer

Confidential, San Francisco, CA

  • Extracted data from HDFS and prepared data for exploratory analysis using data munging.
  • Built models using statistical techniques such as Bayesian methods and machine learning classification models such as XGBoost, SVM, and Random Forest.
  • Participated in all phases of data mining, data cleaning, data collection, developing models, validation, visualization and performed Gap analysis.
  • Completed a highly immersive data science program involving data manipulation and visualization, web scraping, machine learning, Python programming, SQL, Git, MongoDB, and Hadoop.
  • Performed data manipulation/wrangling and developed algorithms/models using high-dimensional healthcare data on custom projects.
  • Identified, analyzed, predicted and interpreted trends or patterns in complex data sets.
  • Enhanced data collection procedures to include information that is relevant for building analytic systems.
  • Setup storage and data analysis tools in AWS cloud computing infrastructure.
  • Used pandas, NumPy, seaborn, matplotlib, scikit-learn, and SciPy to develop various machine learning algorithms.
  • Implemented Agile Methodology for building an internal application.
  • Implemented classification using supervised algorithms such as logistic regression, decision trees, Naive Bayes, and KNN.
  • Performed data transformation from various sources, data organization, and feature extraction from raw and stored data.
  • Validated the machine learning classifiers using ROC curves and lift charts.
  • Built several R Shiny applications for analyzing association and clustering of several gene cohorts.
  • Used Git and Bitbucket for R project version control, and Shiny Server to host R Shiny applications.
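A minimal sketch of the classifier-validation step mentioned above, computing the area under the ROC curve for a random forest; the dataset and parameters are synthetic placeholders, not project data:

```python
# Train a random forest and validate it with the area under the ROC curve.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

clf = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]  # predicted probability of the positive class
auc = roc_auc_score(y_te, scores)       # 1.0 is perfect, 0.5 is chance level
```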

Data Analyst

Confidential, Houston, TX

  • Used SAS for data pre-processing, SQL queries, data analysis, report generation, and statistical analyses.
  • Developed and implemented data collection systems and other strategies that optimize statistical efficiency and data quality.
  • Analyzed and interpreted trends or patterns by performing data analysis (data mining) on complex large data sets to generate meaningful recommendations.
  • Performed SAS data manipulation and analysis programming.
  • Worked with statisticians to ensure results were consistent with expectations and that quality control procedures were followed.
  • Performed regular data checks as required to ensure validity, integrity and correctness of data.
  • Modified and developed SAS code for data cleaning and reporting.
  • Used PROC SQL concepts such as indexes, views, joins, and sub-queries.
  • Used SAS ODS to deliver results in required formats such as CSV, RTF, and HTML.
  • Modified SAS code using SAS/Base and the SAS Macro facility.
  • Identified any problems with the data and produced derived data sets, tables, listings, and figures.
  • Analyzed the data and produced quality customized reports using PROC TABULATE, PROC REPORT, and PROC SUMMARY, and provided descriptive statistics using PROC MEANS, PROC FREQ, and PROC UNIVARIATE.
  • Generated reports using various SAS procedural statements and SAS macros.
  • Processed collected data to ensure quality and maintained a daily error log for data cleaning.
  • Responded to ad hoc requests.
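Descriptive summaries of the kind produced above with PROC MEANS and PROC FREQ can be sketched in Python with pandas; the column names and values here are hypothetical stand-ins:

```python
# pandas analogues of PROC MEANS (grouped statistics) and PROC FREQ (counts).
import pandas as pd

df = pd.DataFrame({
    "site": ["A", "A", "B", "B", "B"],
    "value": [1.2, 3.4, 2.2, 5.1, 4.0],
})

# Grouped n / mean / std, similar to PROC MEANS with a CLASS statement.
summary = df.groupby("site")["value"].agg(["count", "mean", "std"])

# One-way frequency table, similar to PROC FREQ.
freq = df["site"].value_counts()
```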

Research Analyst

Confidential, Gainesville, FL

  • Conducted analysis of cognitive study with focus on covert spatial attention with the aid of implemented threshold algorithms.
  • Correlated multimodal data from fMRI, eye movement, and pupillometry using MATLAB, processing large volumes of data.
  • Enhanced pattern classification machine learning algorithms to train classifiers for greater accuracy.
  • Wrote Python code for rapid analysis and automatic report generation of sensor test data.
  • Built programs in MATLAB to process, quantify, and analyze results more efficiently.
  • Built models using statistical techniques and machine learning classification models such as SVM and Random Forest.
  • Participated in all phases of data mining: data cleaning, data collection, developing models, validation, and visualization.
  • Used pandas, NumPy, seaborn, matplotlib, scikit-learn, and SciPy to develop various machine learning algorithms.
  • Implemented classification using supervised algorithms such as logistic regression, decision trees, Naive Bayes, and KNN.
  • Performed data transformation from various sources, data organization, and feature extraction from raw and stored data.
  • Validated the machine learning classifiers using ROC curves and lift charts.

Web Developer

Confidential, Gainesville, FL

  • Conceptualized, designed and maintained webpages that are aesthetically appealing.
  • Developed the complete HTML, CSS and AngularJS of the pages with emphasis on performance and accessibility.
  • The project entailed gathering a great deal of information and using it to create a user-friendly display that enabled complex testing regimes in a simple fashion.
  • Primarily tasked with fixing broken code; other responsibilities included reformatting the main page with the new look and feel and performing many maintenance upgrades to existing pages.
  • Researched and architected approach to implement a high performance website that met the business requirements using development best practices.
  • Communicated empathetically and comfortably with team members to achieve the objectives.

Verification and Design Engineer

Confidential

  • Built 56 test cases and Perl scripts to analyze and ensure 100% functionality of various modules on the chip.
  • Participated in ASIC/FPGA design using Verilog, VHDL, and Synopsys tools, working with development engineers on ASIC/FPGA simulation and verification code.
  • Enhanced the efficiency of the team through automated scripts and test cases, cutting run time by 30%.
  • Worked closely with disciplines such as the digital, RTL, and analog groups within the company, sharing knowledge of layout requirements to develop timing constraints, data flow in physical preparation, analog IP preparation and implementation, and DFT strategy.
  • Studied clock gating methodology and its use in low-power flop design; verified design function and compared performance against a regular D flip-flop.
  • Developed functional coverage models in SystemVerilog/UVM covering all types of coverage metrics for the modules under verification; developed sequences for PHY power-up; wrote SystemVerilog assertions to implement checkers for various functionality.

  • Ran regressions with different configurations and debugged the failures.
  • Developed and managed regression suites, created exclusion files and performed code coverage analysis to measure verification progress.
  • Developed UVM callbacks for error injection.
