
Data Scientist Resume


Irving, TX

SUMMARY:

  • 4+ years of experience in Data Analysis, compiling, analyzing, validating, and modeling data sets and developing Machine Learning models, including neural network models, to solve business problems.
  • Worked on projects involving Deep Learning, Machine Learning algorithms, Natural Language Processing, statistical modeling, data transformation, sentiment analytics, and large datasets.
  • Experienced with Object-Oriented Programming, the Software Development Life Cycle, database design, Agile methodologies, and coding and testing of enterprise applications.
  • Experienced in developing web-based applications using Python, Django, XML, CSS, HTML, JavaScript, and jQuery.
  • Experienced with LAMP (Linux, Apache, MySQL, and Python/PHP) and WAMP (Windows, Apache, MySQL, and Python/PHP) architectures.
  • Skilled in developing RESTful microservices with Akka Actors and the Akka HTTP framework in Scala to handle high concurrency and high traffic volumes.
  • Experience in configuring and implementing various AWS components such as Elastic IPs, EBS, ElastiCache, Elastic Beanstalk, DynamoDB, Redshift, and CloudFormation.
  • Strong development skills on Amazon Web Services (AWS).
  • Familiar with JSON-based REST web services and Amazon Web Services.
  • Strong operational skills with AWS services: EC2, S3, VPC, CloudFormation, CloudWatch, RDS, DynamoDB, SQS, SNS, and API Gateway.
  • Experience in writing test scripts using Gherkin syntax with the Behave framework in Python (a minimal step-definition sketch follows this summary).
  • Experienced in writing subqueries, stored procedures, triggers, cursors, subroutines, and functions in SQL, PL/SQL, and PostgreSQL databases.
  • Refactored and restructured code (Struts, JSP, and JavaScript) for better performance.
  • Proficient in front-end development using Python 3.6/2.7, Django 1.7/1.8, HTML, XML, CSS, JavaScript, Bootstrap, jQuery, JSON, Angular.js, and Node.js.
  • Good knowledge of SOAP and REST web services and of servers such as Apache Tomcat and WebLogic.
  • Implemented user interface guidelines and standards throughout the development and maintenance of the website using HTML, CSS, JavaScript, and jQuery.
  • Hands-on experience with SVN, Git, JIRA, and Bugzilla; worked with relational databases (MS SQL Server, Oracle) and NoSQL databases (Apache Cassandra, MongoDB).
  • SQL and PL/SQL programming: developed complex code units and database triggers and used the latest features to optimize performance (bulk binds, materialized views, inline views, global temporary tables).
  • Leveraged queueing architectures with RabbitMQ for scalability and performance.
  • Proficient in unit testing, Test-Driven Development (TDD), load testing, and integration testing.
  • Experienced in using SVN, Eclipse, PyCharm, PyScript, Spyder, JIRA, and Git.
  • Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.
  • Wrote Python code using the Ansible Python API to automate the cloud deployment process.
  • Good knowledge of the strategy and implementation of AWS technologies such as EC2, S3, and EBS.
  • Experience with Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.
  • Experienced in various types of testing, such as unit testing, integration testing, user acceptance testing, and functional testing.
  • Able to work on own initiative and as part of a team; willing to learn new technologies, open to new ideas, and quick to learn.
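
As a companion to the Gherkin/Behave item above, the following is a minimal step-definition sketch. The feature text, step wording, and apply_discount() helper are hypothetical illustrations, not code from an actual project.

    # features/discount.feature (hypothetical)
    #   Feature: Discount calculation
    #     Scenario: Gold customers get 10% off
    #       Given a cart total of 200.00
    #       When a gold discount is applied
    #       Then the payable amount is 180.00

    # features/steps/discount_steps.py
    from behave import given, when, then

    def apply_discount(total, rate=0.10):
        # Hypothetical helper used only for this sketch.
        return round(total * (1 - rate), 2)

    @given("a cart total of {total:f}")
    def step_cart_total(context, total):
        context.total = total

    @when("a gold discount is applied")
    def step_apply_discount(context):
        context.payable = apply_discount(context.total)

    @then("the payable amount is {expected:f}")
    def step_check_amount(context, expected):
        assert context.payable == expected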

TECHNICAL SKILLS:

Operating Systems: Unix, Linux, Windows and Mac

Programming Languages: Java, C, C++, Python 3.6/3.3/2.7/2.4, Perl, Ruby

Scripting Languages: CSS, JavaScript, jQuery, Shell scripting

Markup languages: HTML, XML, JSON

Analytics Tools: JMP Pro, SAS, Tableau, UCINET, NodeXL, MVC3

Python Libraries: Beautiful Soup, NumPy, Matplotlib, python-twitter, pandas (DataFrame), urllib2

Databases: Oracle 10/11g, MySQL, SQL Server and PostgreSQL

Servers: Apache Tomcat, IBM WebSphere, RESTful web services

Tools: IntelliJ, PyCharm, FileZilla, PL/SQL Developer, and TOAD

Integration Tools: Jenkins and Web Builder

Version Control: GitHub and SVN

Defect Tracking: JIRA, Git, and VersionOne

Methodologies & tools: Object Oriented Programming, UML, Agile Scrum

Cloud Services: AWS, Azure

PROFESSIONAL EXPERIENCE:

Confidential, Irving, TX

Data Scientist

Responsibilities:

  • Supported the assigned project manager and lead data scientist by creating detailed project plans and assisting in developing, scheduling, and tracking project timelines.
  • Conducted RFM analysis for a client and split the customer database into different RFM segments, which were then used in a Direct Marketing Campaign (see the pandas sketch after this list).
  • Developed a test plan for conducting the Direct Marketing Campaign, analyzed its outcome, and suggested the direction in which the client should proceed.
  • Evaluated overall marketing campaign performance and success over short- and long-term periods (post-campaign analysis).
  • Created campaign reports by comparing the test and control groups' Key Performance Indicators (KPIs), such as Average Sales, Click-Through Response Rate, Success Rate, and ROI.
  • Built machine learning regression models in Python to predict sales.
  • Created a web application using R Shiny for project stakeholders.
  • Recommended solutions to improve future campaign performance.
  • Built machine learning algorithms to empower sales teams in the field by identifying and prioritizing leads for cross-selling different products.
  • Developed customer profiling and analyzed shopping patterns across different channels.
  • Applied different segmentation methodologies and identified the best customer segments and product-affinity groups in the client's database.
  • Created dashboards to present critical company data in a single report.
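
To illustrate the RFM segmentation item above, here is a minimal pandas sketch of rolling transactions up into quartile-based R/F/M scores. The DataFrame and its column names (customer_id, order_date, amount) are assumptions for illustration, not the client's actual schema.

    import pandas as pd

    def rfm_segments(transactions: pd.DataFrame, as_of: pd.Timestamp) -> pd.DataFrame:
        # Roll transactions up to one row per customer.
        rfm = transactions.groupby("customer_id").agg(
            recency=("order_date", lambda d: (as_of - d.max()).days),  # days since last order
            frequency=("order_date", "count"),                         # number of orders
            monetary=("amount", "sum"),                                # total spend
        )
        # Quartile scores 1-4; recency is inverted so fewer days -> higher score.
        rfm["r_score"] = pd.qcut(rfm["recency"], 4, labels=[4, 3, 2, 1]).astype(int)
        rfm["f_score"] = pd.qcut(rfm["frequency"].rank(method="first"), 4, labels=[1, 2, 3, 4]).astype(int)
        rfm["m_score"] = pd.qcut(rfm["monetary"], 4, labels=[1, 2, 3, 4]).astype(int)
        # Concatenated score, e.g. "444" for the best customers.
        rfm["segment"] = rfm["r_score"].astype(str) + rfm["f_score"].astype(str) + rfm["m_score"].astype(str)
        return rfm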

Environment: DB2, Netezza, SQL Server, Python, R, Shiny, SPSS, Hadoop, Tableau, MS Office Suite (Word, Excel, Access and PowerPoint), VBA, PeopleSoft, HP Quality Center.

Confidential, Ashburn, VA

Data Scientist

Responsibilities:

  • Translated business challenges into mathematical/statistical hypotheses and prepared analysis plans to highlight potential value for the business.
  • Developed a mechanism to re-run the model on historical data to compare and evaluate the effectiveness of changes to the existing model-building algorithm, using PySpark.
  • Researched and applied machine learning and natural language processing for text analytics and live fraud detection on distributed systems supporting massive rates of data transfer.
  • Mined scanned PDFs to extract, process, and load data in a structured format.
  • Used natural language processing to analyze and find intent in the extracted text.
  • Managed and analyzed large data sets using statistical tools and techniques.
  • Worked closely with the data engineering team to integrate the Machine Learning solution into the production environment.
  • Worked on big data using PySpark and performed machine learning using the ML and MLlib packages.
  • Trained machine learning/deep learning models on Google Cloud Platform.
  • Implemented an LSTM neural network for forecasting revenue using Keras/TensorFlow (a minimal sketch follows this list).
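
To make the LSTM forecasting item above concrete, here is a minimal one-step-ahead sketch with tf.keras. The window length, layer sizes, and the synthetic series standing in for scaled revenue data are assumptions for illustration, not the production model.

    import numpy as np
    import tensorflow as tf

    WINDOW = 12  # predict the next period from the previous 12

    def make_windows(series: np.ndarray, window: int = WINDOW):
        # Slice a 1-D series into (samples, window, 1) inputs and next-step targets.
        X, y = [], []
        for i in range(len(series) - window):
            X.append(series[i:i + window])
            y.append(series[i + window])
        return np.array(X)[..., np.newaxis], np.array(y)

    def build_model(window: int = WINDOW) -> tf.keras.Model:
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(window, 1)),
            tf.keras.layers.LSTM(32),
            tf.keras.layers.Dense(1),  # one-step-ahead forecast
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    # Usage with a synthetic series in place of real, scaled revenue:
    revenue = np.sin(np.linspace(0, 20, 200)).astype("float32")
    X, y = make_windows(revenue)
    model = build_model()
    model.fit(X, y, epochs=5, batch_size=16, verbose=0)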

Environment: Oracle, Cassandra, Linux, SQL, Hadoop, SPSS, Spark, Keras, TensorFlow, AWS, Pig, Hive, Windows 10, MS Excel, Tableau, VBA, R, Python, Google Cloud, GitHub, HP Quality Center, HP Unified Functional Testing.

Confidential, San Diego, CA

Python Developer

Responsibilities:

  • Developed web applications and implemented Model-View-Controller (MVC) architecture using server-side frameworks such as Django and Flask.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Migrated the platform from a physical server to a virtual environment and took it from development into production by wiring up an Nginx stack.
  • Consumed external APIs and wrote RESTful APIs using Django REST Framework and AngularJS.
  • Worked within the Django ORM as well as writing native SQL in SQL Server.
  • Developed a Python script for moving files from Dropbox to an Amazon server.
  • Built the pipelines, ran the tests in Jenkins, and deployed the application in AWS.
  • Created API keys and usage plans in AWS API Gateway and used the keys across regions.
  • Developed the notification service by posting JSON requests to AWS API Gateway, validating the response in Lambda by fetching the data from DynamoDB, and sending the notification through AWS SNS (see the sketch after this list).
  • Used AWS Secrets Manager to store properties across regions, retrieved the secrets, and wrote a shell script to automate the process.
  • Created an AWS Kinesis Firehose delivery stream attached to an S3 bucket to capture log information and store it as event data in the S3 bucket.
  • Wrote Python scripts for loading data from CSV files into database tables.
  • Documented tasks in Confluence and related them to the JIRA tickets.
  • Used Python and Perl extensively to code and design various phases of the data processing pipeline.
  • Worked with version control systems such as Git and Apache SVN to maintain a consistent state throughout the application development process.
  • Involved in database-driven web application development using frameworks such as Django on Python.
  • Designed and maintained complex SQL queries and developed PL/SQL stored procedures.
  • Followed Python best practices such as PEP 8.
  • Developed Python scripts to execute stored procedures and load data from various CSV files into staging tables in a SQL Server database.
  • Involved in back-end development using Python with the Flask framework.
  • Developed an internal Flask project to generate reports from Google Analytics on a daily, weekly, and monthly basis.
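
The notification-service item above lends itself to a short sketch. Below is a minimal Lambda handler for that flow under assumed names: a DynamoDB table identified by a TABLE_NAME environment variable with a customer_id key, a TOPIC_ARN environment variable, and illustrative JSON fields. It is not the actual service code.

    import json
    import os
    import boto3

    dynamodb = boto3.resource("dynamodb")
    sns = boto3.client("sns")

    TABLE_NAME = os.environ.get("TABLE_NAME", "notifications")            # assumed table name
    TOPIC_ARN = os.environ.get("TOPIC_ARN", "arn:aws:sns:us-east-1:...")  # placeholder ARN

    def handler(event, context):
        # API Gateway proxy integration passes the request body as a JSON string.
        body = json.loads(event.get("body") or "{}")
        customer_id = body.get("customer_id")
        if not customer_id:
            return {"statusCode": 400, "body": json.dumps({"error": "customer_id required"})}

        # Validate the request against the DynamoDB record.
        item = dynamodb.Table(TABLE_NAME).get_item(Key={"customer_id": customer_id}).get("Item")
        if item is None:
            return {"statusCode": 404, "body": json.dumps({"error": "unknown customer"})}

        # Publish the notification to the SNS topic.
        sns.publish(TopicArn=TOPIC_ARN,
                    Message=json.dumps({"customer_id": customer_id,
                                        "message": body.get("message", "")}))
        return {"statusCode": 200, "body": json.dumps({"status": "sent"})}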

Environment: Python 3.3, AWS, Flask, JavaScript, Matplotlib, HTML, RESTful API, AngularJS, jQuery, JSON, AJAX, XML, CSS, Oracle 10g, SQL, MySQL, Bootstrap, RESTful Web Services, Beautiful Soup, Jenkins, GitHub, SVN, Linux, PyCharm.
