
Data Science Engineer Resume


Windsor, CO

SUMMARY

  • Experienced software professional with 7 years in software development and data science/machine learning/deep learning.
  • Experienced in handling AWS Lambda, Step Functions, S3, EC2, VPC, IAM, RDS, Secrets Manager, CloudWatch, API Gateway, and Route 53, as well as Azure Databricks, Data Factory, and Blob Storage.
  • In-depth knowledge of machine learning algorithms such as Naïve Bayes, Random Forest, boosted trees, SVM, K-Means, K-Nearest Neighbors, multiple linear regression, non-linear regression, hierarchical mixed models, and neural networks (RNN, LSTM, CNN, DNN).
  • In-depth knowledge of core computer science principles including algorithms, data structures, predictive analysis, data mining, data interpretation, and data pipelines.

TECHNICAL SKILLS

Programming Skills: C#, Python, Spark, MATLAB, SQL, PL/SQL, JavaScript, Java, JSON, XML

Technology: RESTful APIs, Dynamic Link Libraries, MS Service Bus, WCF, Windows Forms, Lambda, State Machines, S3, ECR, ECS, EC2 instances, IAM, RDS, Route 53, Nginx

Tools: MS Power BI, Hadoop ecosystem (Hive, Pig, HBase), Apache Kafka, Azure Data Factory, Azure Databricks, Postman, Apache Airflow, Git, Apache Superset, MLflow, AWS SageMaker, Snowflake (on AWS)

OS: Linux, Macintosh, Windows

Python libraries: NumPy, pandas, scikit-learn, Matplotlib, TensorFlow, Keras, Camelot, definer, pyocr, OpenCV, Flask, Plotly, SciPy, PyTorch, pytest, MySQL, boto3, PyQt

Code Repos & Collaboration Tools: Git, GitLab, VSTS, TFS, ClearCase, Bitbucket, Jira, and Confluence

Cloud Platform: Google Cloud Platform, Microsoft Azure, Amazon Web Services

Database: MongoDB, SQL Server, MySQL, PostgreSQL, SQLite, DynamoDB

IDE: MS Visual Studio, Jupyter Notebook, Anaconda, Spyder, PyCharm, MATLAB 2020b, VSCode, Atom

ML: Natural Language Processing, Text Understanding, Classification, Pattern Recognition, Recommendation Systems, Computer Vision, Time-series forecasting

PROFESSIONAL EXPERIENCE

Confidential, Windsor, CO

Data Science Engineer

Responsibilities:

  • HVAC Mixer: Re-architected the code, cutting runtime from 20+ hours to 15 minutes, using AWS Lambda, Step Functions, and S3 extensively, and managed the pipeline using Argo CD.
  • Used Elasticsearch and Kibana to query user devices, login/logout events, and locations; hosted the stack on a server and connected it to PostgreSQL.
  • Web scraping API: Created APIs to fetch weather data for multiple weather stations across Canada, stitch the results together, and store them on S3 (see the first sketch after this list).
  • ML structural model: Created a deep learning model that predicts bounding boxes across images and cumulative wind load on tall buildings set up in various configurations, using wind tunnel data, Google Earth images, MLflow, OpenCV, and PostgreSQL.
  • Created an ML model that forecasts wind load using ARIMA on data collected from various airport weather stations, and built pipelines to feed in new data in batches (see the second sketch after this list).
  • Developed key performance indicators (KPIs) using Apache Superset and presented them to management, which led to the execution plan.
  • Developed plots for weather interpolation using Kibana and generated weekly reports.
  • Ground Mount Calculator: Developed API services that calculate wind load on solar cells using MATLAB containers on multiple EC2 instances, cutting runtime from 2 hours to 15 minutes.
  • Evaluated the performance of ML models using A/B feature testing.
  • Ped Wind Analysis: Created a Python script that transforms text input into the desired formatted output.
  • Extreme Wind Analysis: Created a GUI using PyQt, pandas, NumPy, and Matplotlib that enables interactive graphs and plots.
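
As an illustration of the web-scraping bullet above, a minimal sketch of fetching per-station weather data and storing it on S3 with requests and boto3 is shown below. The station IDs, API URL, and bucket name are placeholders, not details from this role.

```python
import json

import boto3
import requests

# Placeholder station IDs, endpoint, and bucket; the real service is not named in the resume.
STATIONS = ["YYC", "YVR", "YYZ"]
API_URL = "https://example-weather-api.ca/v1/stations/{station}/observations"
BUCKET = "example-weather-data-bucket"

def fetch_and_store(s3_client, station: str) -> str:
    """Fetch observations for one station and write them to S3 as JSON."""
    response = requests.get(API_URL.format(station=station), timeout=30)
    response.raise_for_status()
    key = f"raw/{station}/observations.json"
    s3_client.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(response.json()))
    return key

if __name__ == "__main__":
    s3 = boto3.client("s3")
    stored = [fetch_and_store(s3, station) for station in STATIONS]
    print(f"Stored {len(stored)} station files on S3")
```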
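
The ARIMA wind-load forecasting bullet could look roughly like the sketch below, assuming the station data has already been assembled into a time-indexed series; the ARIMA order and the toy data are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def forecast_wind_load(series: pd.Series, steps: int = 24) -> pd.Series:
    """Fit an ARIMA model and forecast the next `steps` observations."""
    model = ARIMA(series, order=(2, 1, 2))  # order is illustrative, not from the resume
    fitted = model.fit()
    return fitted.forecast(steps=steps)

if __name__ == "__main__":
    # Toy hourly series standing in for airport weather-station measurements.
    index = pd.date_range("2021-01-01", periods=200, freq="H")
    values = 10 + 0.05 * np.arange(200) + np.random.randn(200)
    print(forecast_wind_load(pd.Series(values, index=index)).head())
```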

Confidential, Broomfield, CO

Data Science and Machine Learning Engineer

Responsibilities:

  • Attained 95% accuracy in predicting sentences from Exempt Generator filings using NLP (NLTK, Gensim, pyocr) and Python text mining, topic modelling, and lemmatization.
  • Deployed a web service using the Python Flask framework to serve the data-mined sentences produced by the model (see the Flask sketch after this list).
  • Gained recognition for forecasting LMP prices using Prophet and CNNs with an error rate lower than that of the MISO Day-Ahead LMP on time-series data.
  • Created a machine learning model that forecasts zonal load using multiple deep neural networks, with hyperparameter tuning and Keras callbacks for a good fit (see the Keras sketch after this list).
  • Good understanding of Statistical methodologies, frameworks, and tools.
  • Deployed web services and MongoDB in Docker containers and managed them with Kubernetes.
  • Extensively used TensorFlow and PyTorch to model machine learning algorithms in imperative and symbolic styles.
  • Extracted data from multiple data sources and transformed it using advanced SQL queries (T-SQL).
  • Published a dashboard of historical and forecast LMP prices in MS Power BI with visual charts and reports.
  • Monitored parameters for multiple ML models using MLflow deployed on a server (see the MLflow sketch after this list).
  • Good understanding of CI/CD pipelines managed through Azure DevOps.
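
A minimal sketch of the Flask web service mentioned above, assuming a pre-trained scikit-learn pipeline that accepts raw sentences; the model path, route, and payload shape are illustrative, not taken from this role.

```python
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder artifact: assumed to be a scikit-learn Pipeline (vectorizer + classifier).
with open("sentence_model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    """Classify the sentences posted as JSON and return the predictions."""
    sentences = request.get_json().get("sentences", [])
    predictions = model.predict(sentences).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```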
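
The zonal-load forecasting bullet could be set up roughly as below with Keras callbacks; the layer sizes, feature count, and callback settings are assumptions, and the toy data stands in for engineered weather and calendar features.

```python
import numpy as np
from tensorflow import keras

def build_model(n_features: int) -> keras.Model:
    """Small dense network for zonal load forecasting (sizes are illustrative)."""
    model = keras.Sequential([
        keras.layers.Input(shape=(n_features,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1),  # predicted load
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

if __name__ == "__main__":
    X, y = np.random.rand(1000, 12), np.random.rand(1000)
    callbacks = [
        keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True),
        keras.callbacks.ReduceLROnPlateau(factor=0.5, patience=3),
    ]
    build_model(12).fit(X, y, validation_split=0.2, epochs=50,
                        callbacks=callbacks, verbose=0)
```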
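
Parameter monitoring with a remote MLflow server, as in the bullet above, typically looks like the snippet below; the tracking URI, experiment name, and logged values are placeholders.

```python
import mlflow

# Placeholder tracking server; the resume only states that MLflow was deployed on a server.
mlflow.set_tracking_uri("http://mlflow.example.internal:5000")
mlflow.set_experiment("zonal-load-forecast")

with mlflow.start_run():
    mlflow.log_param("model", "dense-nn")
    mlflow.log_param("learning_rate", 0.001)
    mlflow.log_metric("val_mae", 0.042)  # illustrative value
```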

Confidential, Denver, CO

Research Assistant

Responsibilities:

  • Studied, collected, and analysed models that provide experimental data for the faculty researcher and/or supervisor.
  • Assisted in the development of an algorithm that predicts cryptocurrency prices using time-series data and machine learning.
  • Developed a tool that predicts network attacks from Wireshark logs using an LSTM neural network (see the sketch after this list).
  • Created a tool that addresses the big-data problem of storing real-time traffic data in MongoDB and predicts travel time using an ARIMA model.
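
A minimal sketch of the LSTM-based attack detector, assuming packet-level features have already been extracted from the Wireshark logs into fixed-length sequences; the sequence length, feature count, and toy labels are assumptions.

```python
import numpy as np
from tensorflow import keras

SEQ_LEN, N_FEATURES = 50, 8  # assumed window of 50 packets with 8 numeric features each

def build_detector() -> keras.Model:
    """LSTM binary classifier: 1 = attack traffic, 0 = benign."""
    model = keras.Sequential([
        keras.layers.Input(shape=(SEQ_LEN, N_FEATURES)),
        keras.layers.LSTM(32),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Toy sequences standing in for features parsed from Wireshark capture logs.
    X = np.random.rand(200, SEQ_LEN, N_FEATURES)
    y = np.random.randint(0, 2, size=200)
    build_detector().fit(X, y, epochs=3, validation_split=0.2, verbose=0)
```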

Confidential

Software Engineer

Responsibilities:

  • Developed RESTful APIs using WCF and fixed bugs every sprint on .NET Windows Forms.
  • Used different data stores to read, write, and transform large amounts of data, and performance-tuned them using indexes and stored procedures.
  • Increased data-fetch performance by refactoring the code of the Confidential integration service.
  • Implemented and maintained multiple software products using Agile methodology; actively participated in sprint planning and sprint reviews.
  • Designed microservices with message queues in the Confidential integration service, resolving the problem of requests being lost when a server crashed (see the sketch after this list).
  • Used RabbitMQ to handle request authorization.
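
The queue-based design in the last two bullets was built on the WCF/.NET stack; purely to illustrate the pattern (a durable queue with explicit acknowledgements so requests survive a crash), here is a Python sketch using pika with placeholder host and queue names.

```python
import json

import pika

PARAMS = pika.ConnectionParameters(host="localhost")  # placeholder broker host
QUEUE = "auth_requests"                                # placeholder queue name

def publish(request: dict) -> None:
    """Publish an authorization request to a durable queue so it survives broker restarts."""
    connection = pika.BlockingConnection(PARAMS)
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)
    channel.basic_publish(
        exchange="",
        routing_key=QUEUE,
        body=json.dumps(request),
        properties=pika.BasicProperties(delivery_mode=2),  # persist the message to disk
    )
    connection.close()

def consume() -> None:
    """Process requests one at a time, acknowledging only after successful handling."""
    connection = pika.BlockingConnection(PARAMS)
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)

    def handle(ch, method, properties, body):
        print("authorizing:", json.loads(body))
        ch.basic_ack(delivery_tag=method.delivery_tag)  # ack only after the work is done

    channel.basic_consume(queue=QUEUE, on_message_callback=handle)
    channel.start_consuming()

if __name__ == "__main__":
    publish({"user": "example", "action": "login"})
```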
