Data Scientist Resume
Seattle, WA
SUMMARY
- Over 8 years of experience in Information Technology with expertise in Data Science & Analytics, Artificial Intelligence, Machine Learning, and Business Analysis.
- Experience working in an Agile framework with exposure to the entire SDLC. Adept at consulting on best practices and aligning business goals with technology solutions to drive process improvements, competitive advantage, and bottom-line gains.
- As a data scientist, supported business stakeholders worldwide for a major healthcare organization, brainstorming, designing, and implementing solutions pertaining to the people analytics and reporting strategy.
- Translated vision and strategies into roadmaps and projects.
- Influenced business partner adoption of insights and solutions by demonstrating knowledge of business problems, requirements, and context.
- Developed and delivered communications for complex technical projects to non-technical audiences.
- Balanced the need for existing analytic models while also investing in a pipeline of research for future analytic opportunities.
- Led a diverse team of data scientists, data engineers, and business consultants to drive rapid change and ensured proper test-and-learn of data science models.
- Experienced in providing data analytics support and improvements and coordinating research activities to support top management and functional teams with crucial information.
- End-to-end project planning and execution experience, delivering optimal value and implementing the best technological solutions to streamline operational procedures.
- Proven abilities in translating client requirements into technical requirements and recommending complete IT solutions.
- Expert in developing predictive and descriptive analytics per specifications.
- Highly methodical in approach, a quick learner, and keen to upgrade skills by acquiring new knowledge.
- Understood, implemented, managed, and maintained analytical solutions and techniques independently.
- Developed logical data architecture with adherence to enterprise architecture.
- Team player with strong logical reasoning, coordination, and interpersonal skills.
- Team builder with excellent communication, time and resource management, and continuous client relationship development skills.
TECHNICAL SKILLS
Programming Languages: Python, R, Java, PySpark
Libraries and Frameworks: scikit-learn, XGBoost, Keras, TensorFlow
Cloud Technologies: Amazon Web Services, Terraform
Databases: Redshift, Snowflake, PostgreSQL, MySQL
Version Control: SVN, GIT, GitLab, GitHub, Bitbucket
OS & Environment: Windows, UNIX, Linux, macOS
Methodologies: Agile, TDD, Waterfall
Data Pipeline Orchestration: Docker, GitLab CI/CD, Airflow, Jenkins
Data Cleaning: Jupyter Notebook, pandas, NumPy, SciPy
Data Visualization and Statistical Modeling: Matplotlib, Looker, Tableau, Power BI, Qlik Sense
Performance Testing: Visual Studio Team Services, HP LoadRunner & HP Performance Center
Other Tools & Technologies: Hadoop, ETL, Artificial Intelligence, Machine Learning, Deep Learning
PROFESSIONAL EXPERIENCE
Confidential, Seattle, WA
Data Scientist
Responsibilities:
- Worked as part of interdisciplinary teams including Computer Scientists and Network Engineers, and interacted with internal Lines of Business and external business partners.
- Connected various data sources to stage data before visualization, and provided business insights and recommendations to improve performance.
- Identified business problems and developed analytic techniques to provide solutions to external teams.
- Drove multiple technical and data-related projects of varying scopes and complexities.
- Worked with other business units within the enterprise, vendors, and executives to represent and drive the organization's strategy.
- Used data visualization and other data-gathering methods to either provide internal strategic direction or guide decisions for other teams and provided business insights which helped product teams make better decisions.
- Developed prototypes and products by applying knowledge of statistics, machine learning, programming, data modeling, to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries.
- Selected features, built, and optimized models using machine learning techniques such as time series methods, regression, random forests, gradient boosting, and neural networks.
- Used big data technologies on AWS and Microsoft Azure utilizing Databricks and Spark.
- Collaborated with Product Owners, Enterprise Architects, Software Development Managers, and other executives to translate complex human capital management challenges into data science projects.
- Conducted ad-hoc analysis and presented results in a clear manner.
- Created automated data and modeling pipelines and developed custom performance tracking.
- Worked closely with software engineers in an agile product development environment to deliver machine learning software features; used Azure Cognitive Services to convert audio files into text data.
- Created containers to hold production data and pre-existing data.
- Wrote MySQL queries to extract the URLs and audio content stored in the database.
- Presented results to stakeholders at biweekly sprint reviews.
- Used machine learning algorithms to analyze user sentiment from the text data (a minimal sketch of such a pipeline follows this list).
- Generated the outputs in CSV format for clear visualization.
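A minimal sketch of the kind of sentiment-modeling pipeline described above, pairing TF-IDF features with a gradient-boosted classifier in scikit-learn. This is illustrative only: the file name, column names, and hyperparameters are hypothetical rather than the production configuration.

```python
# Illustrative sketch only: file name, column names, and hyperparameters are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Assume a table of transcribed audio text with a labeled sentiment column.
df = pd.read_csv("transcripts.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["sentiment"], test_size=0.2, random_state=42
)

# TF-IDF features feeding a gradient-boosted classifier.
model = Pipeline([
    ("tfidf", TfidfVectorizer(max_features=20000, ngram_range=(1, 2))),
    ("gbm", GradientBoostingClassifier(n_estimators=200, learning_rate=0.1)),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

# Export predictions as CSV for the downstream reporting step.
pd.DataFrame(
    {"text": X_test, "predicted_sentiment": model.predict(X_test)}
).to_csv("sentiment_predictions.csv", index=False)
```

An XGBoost or neural model could drop in for the classifier without changing the surrounding pipeline.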
Environment: PostgreSQL, Oracle, Teradata, R, Python, SPSS, JMP, SAS, Tableau, matplotlib, D3.js, Hadoop, MapReduce, Pig/Hive, Spark, H2O, AWS, Google Cloud, Qlik, Splunk
Confidential, Rolling Meadows, IL
Sr. Data Scientist
Responsibilities:
- Worked with cross-functional Agile teams of Data Analysts, Data Engineers, and Data Scientists that support various business units within the organization to achieve impact.
- Developed and validated supervised and unsupervised learning models, created machine learning and AI models, and used other statistical techniques to solve business problems.
- Contributed to the development of a strategy on how to use the model/solution to generate impact
- Collaborated with business stakeholders to understand their challenges and develop an analytical solution that creates business impact
- Contributed to the effort for team to democratize data by consolidating, integrating, cleansing, and minimally curating operational production system data and external data for insight generation
- Performed exploratory analysis which includes understanding the key statistical metrics on data columns, determining the granularity of the data, & assessing any related merge keys for combining the data
- Developed a production, end to end solution for deployment, working closely with Data Engineers
- Contributed to the development of monitoring framework to ensure the tool is performing as intended
- Supported stakeholders after deployment to ensure impact is achieved
- Contributed to the development and maintenance of best practices
- Used fuzzy matching, text mining, and data reduction (see the fuzzy-matching sketch after this list).
- Applied statistical and predictive modeling techniques, such as machine learning, decision trees, probability networks, association rules, clustering, regression, GLMs, and neural networks, and their application to business decisions.
- Supported the data needs of multiple teams, systems, and products, and ensured the optimal data delivery architecture remained consistent throughout ongoing projects.
- Built the infrastructure required for optimal ETL/ELT pipelines to ingest data from a wide variety of data sources using Microsoft Azure technologies such as Azure Data Factory and Databricks.
- Designed solutions for data analytics and data science team members to assist them in building and optimizing the product into an innovative industry leader.
- Designed analytics solutions that utilize the data pipeline to deliver actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Worked with partners including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Orchestrated large, complex data sets that meet functional/non-functional business requirements.
- Designed and implemented internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
- Worked with data and analytics team to strive for greater functionality in data systems.
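The fuzzy-matching step referenced above could look roughly like the sketch below, which uses only the Python standard library; the record values and the 0.85 threshold are hypothetical, and a production version would typically add blocking and a dedicated matching library.

```python
# Illustrative fuzzy-matching sketch; record values and the 0.85 threshold are hypothetical.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_records(source_names, reference_names, threshold=0.85):
    """Pair each source name with its best-scoring reference name above the threshold."""
    matches = []
    for name in source_names:
        best = max(reference_names, key=lambda ref: similarity(name, ref))
        score = similarity(name, best)
        if score >= threshold:
            matches.append((name, best, round(score, 3)))
    return matches

# Example: reconciling vendor names pulled from two operational systems.
print(match_records(["ACME Corp", "Globex LLC"], ["Acme Corp.", "Globex, LLC"]))
```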
Environment: SQL, Python, Scala, PowerShell scripting, Azure Data Factory, ADLS Gen 2, Logic Apps, Azure Functions, Databricks, Apache Spark, Azure SQL DW, Microsoft Azure Data Lake.
Confidential, East Hanover, NJ
Data Science Engineer
Responsibilities:
- Built a resume-screening algorithm using semantic and syntactic analysis, adopting deep learning NLP techniques with BERT (see the embedding-similarity sketch after this list).
- Created a plan for a bias-mitigation framework and fairness policy; maintained a transparent structure between stakeholders and the solution using explainable AI.
- Collaborated and coordinated with data, content, and modeling teams and provided analytical assistance on various NLP datasets.
- Drove and maintained high-quality processes for delivering projects in a collaborative Agile team environment.
- Worked closely with various stakeholders to collect, clean, model and visualize datasets.
- Created regression models using predictive data modeling, analyzed data mining algorithms to deliver insights, and implemented action-oriented solutions to complex business problems.
- Extracted hidden insights and improved the accuracy of data sets using various statistical models.
- Leveraged technology and automated workflows to create modernized operational processes aligned with the team strategy.
- Established a high-performing, globally integrated, and innovative mindset in the team culture.
- Worked across countries including Italy, Czech Republic, Spain, UAE, Australia, USA, Germany, Singapore, and Peru.
- Mentored and guided graduate interns in their data science journey.
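A minimal sketch of the semantic resume-screening idea from the first bullet, using the sentence-transformers library as a stand-in for the BERT-based analysis; the model name, texts, and ranking logic are assumptions rather than the actual system.

```python
# Illustrative sketch of semantic resume screening with a BERT-family sentence encoder.
# The model name, texts, and ranking logic are assumptions, not the production system.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # compact BERT-family encoder

job_description = "Data scientist with NLP, Python, and deep learning experience."
resumes = {
    "candidate_a": "Built NLP pipelines in Python using BERT and PyTorch.",
    "candidate_b": "Managed retail inventory and supplier relationships.",
}

# Embed the job description and each resume, then rank by cosine similarity.
job_vec = model.encode(job_description, convert_to_tensor=True)
scores = {
    name: util.cos_sim(job_vec, model.encode(text, convert_to_tensor=True)).item()
    for name, text in resumes.items()
}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.3f}")
```

Bias mitigation and explainability, as described in the second bullet, would sit on top of a ranking like this.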
Environment: Power BI, Snowflake, ETL, Cloud, Python, R, SAS, DevOps systems, GitLab.
Confidential, San Antonio, TX
Business Intelligence Developer
Responsibilities:
- Gathered user requirements, then analyzed and designed software solutions based on those requirements.
- Played a critical role in establishing best practices for Tableau reporting by conducting meetings with clients to gather reporting requirements, identify KPIs, and develop POC dashboards.
- Worked extensively with advanced analysis features such as actions, calculations, parameters, background images, maps, trend lines, statistics, and log axes; used groups, hierarchies, and sets to create summary-level and detailed reports.
- Built and published customized interactive reports and dashboards, and scheduled reports using Tableau Server.
- Created relationships, actions, data blending, filters, parameters, hierarchies, calculated fields, sorting, groupings, live connections, and in-memory extracts in both Tableau and Excel.
- Tested dashboards to ensure data matched business requirements and to catch any changes in the underlying data.
- Created organized, customized analysis and visualized projects and dashboards to present to executive leadership.
- Created complicated calculations based on the LOD (Level of Detail) feature.
- Designed and developed data warehouse and Amazon Redshift-based BI solutions (see the Redshift query sketch after this list).
- Involved in managing the application's users, groups, and integration with Active Directory
- In-depth knowledge of T-SQL, SSAS, SSRS, SSIS, OLAP, BI suite, Reporting and Analytics.
- Evaluated database performance issues and executed database optimization.
- Prepared backups using tabadmin and tabcmd commands.
- Developed Scatter Plots and Area Maps to show states utilizing the Mobile App.
- Compiled interactive dashboards in Tableau Desktop and published them to Tableau Server, enabling storytelling behind Mobile App usage with quick filters for on-demand information at the click of a button.
- Participated in weekly meetings, reviews, and user group meetings, as well as communicating with stakeholders and business groups.
- Prepared project reports for management and assisted the project manager in the development of weekly and monthly status reports, policies, and procedures.
- Experienced in publishing reports and dashboards to Tableau Server.
- Found ways to make graphs and tables visually engaging and aesthetically pleasing while maintaining the accuracy of the core information being communicated.
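The Amazon Redshift BI work referenced above might begin with a query like the sketch below, which pulls a mobile-app usage summary into pandas and writes a CSV that Tableau can use as a data source; the cluster endpoint, credentials, table, and column names are placeholders.

```python
# Illustrative sketch: the cluster endpoint, credentials, table, and columns are placeholders.
import pandas as pd
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="report_user",
    password="********",
)

query = """
    SELECT state, COUNT(DISTINCT user_id) AS active_users
    FROM mobile_app_events
    WHERE event_date >= CURRENT_DATE - 30
    GROUP BY state
    ORDER BY active_users DESC;
"""

# Load the summary into a DataFrame and export a CSV for a Tableau data source.
usage_by_state = pd.read_sql(query, conn)
usage_by_state.to_csv("mobile_app_usage_by_state.csv", index=False)
conn.close()
```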
Environment: Tableau (Desktop and Server), SQL Server 2008/2008 R2, JIRA, SharePoint, Netezza, Tableau Prep, MS Visio, Spark, Agile.