
SAS Analytics Consultant Resume


Madison, WI

SUMMARY

  • Seven-plus (7+) years of hands-on experience in Advanced Analytics, Data Mining, Predictive Modeling, Web Analytics, Statistical Modeling, and Social Media Mining.
  • Published technical papers in the fields of query processing and data mining.
  • Experience in solving business problems pertaining to Data, Text, Information Retrieval and Pattern Recognition on voluminous data.
  • Hands-on experience in tracking and reporting operational statistics, analyzing results, and forecasting trends for management.
  • Experience in designing and implementing Recommendation Engine, Supervised and Unsupervised learning algorithms on structured and unstructured data.
  • Experience in Customer Profiling, Web journey analysis, Social Media Mining, Market Basket Analysis, Trend Analysis, Competitor Analysis, and Network Analysis.
  • Experience in enabling high level of Customer Satisfaction with reporting, analysis and compelling business recommendations.
  • Experience in model building and developing code in Python, Perl, R, SAS/EG, SAS/BASE, SAS/SQL, SAS/ODS, SAS/MACROS and STATISTICA.
  • Experience in data mapping, extraction, transformation, and validation from heterogeneous sources (Excel, CSV, Oracle, flat file, Text Format Data) to SQL Server Database using ETL Tools like Pentaho Kettle, Informatica, Python and MySQL.
  • Hands-on experience with data mining tools like WEKA, NLTK, KNIME, and Rapid Miner.
  • Experience in using relational databases such as PostgreSQL, MySQL, Oracle, and DB2.
  • Experience in analyzing data to determine primary drivers, improve processes, and identify areas of improvement, working with business stakeholders to uncover suitable solutions.
  • Experience in working with different types of data like financial, credit card, retail business, e-commerce and health care data.
  • Experience in working with distributed, large-scale and scalable databases and data marts.
  • Proficient in creating efficient Cursors, Stored Procedures, Functions, Performance tuning and Query optimization using PL/SQL following best practices and industry standards.
  • Excellent verbal and written communication skills, combined with interpersonal and conflict-resolution skills and strong analytical and leadership abilities.
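
For illustration, the market basket analysis mentioned above reduces to support and confidence computation over transactions; a minimal pure-Python sketch (the baskets, items, and thresholds here are invented, not client data):

```python
from itertools import combinations
from collections import Counter

# Hypothetical transactions; real engagements used retail point-of-sale feeds.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def frequent_pairs(transactions, min_support=0.5):
    """Return item pairs whose support (fraction of baskets) meets min_support."""
    n = len(transactions)
    counts = Counter(
        pair
        for basket in transactions
        for pair in combinations(sorted(basket), 2)
    )
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    has_a = [b for b in transactions if antecedent in b]
    return sum(consequent in b for b in has_a) / len(has_a)

pairs = frequent_pairs(transactions)
```

Production rule mining (e.g. Apriori in WEKA or KNIME) prunes the candidate space rather than counting all pairs, but the support/confidence definitions are the same.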

TECHNICAL SKILLS

Tools: SAS/Enterprise Guide, SAS/Enterprise Miner, SAS/BASE, SAS/SQL, SAS/ODS, SAS/MACROS, R, STATISTICA, Pentaho Kettle, Informatica, WEKA, NLTK, KNIME, Rapid Miner, MS Office, Vim, LaTeX, UML, Rational Rose

Languages: SQL, PL/SQL, Python, Perl, Shell Script, HTML, XML, C, C++

Databases: PostgreSQL, MySQL, Oracle (9i/8i), SQL Server, IBM DB2

Related Skills: Text Mining, Data Mining, Machine Learning, Modeling, Association Rule Mining, Algorithms, Structured Data, Semi-Structured Data, Unstructured Data

Operating Systems: UNIX, Windows, Linux

PROFESSIONAL EXPERIENCE

Confidential, Madison, WI

SAS Analytics Consultant

Responsibilities:

  • Worked on financial, policy center (Auto, Property, and Health), and claims data using SAS.
  • Participated in requirement analysis by gathering client requirements, business needs and project objectives, via feedback sessions and client meetings, in collaboration with all stakeholders.
  • Worked on Confidential reporting and analysis requests on underwriting issues, such as measuring the timeframe to resolve an issue and root cause analysis for cancellation of a policy due to an underwriting issue, using SAS Enterprise Guide.
  • Developed a solution to determine at which stage of a policy life cycle an underwriting issue occurs, using SQL and SAS.
  • Worked on root cause analysis for the cancellation of policy using SAS and SQL.
  • Worked on text mining solution to find claims assigned to field claim handlers that could be worked in the office using SAS Enterprise Miner.
  • Worked on drive time optimization: allocated claims adjuster drive times to the zip codes serviced, identified zips where drive time exceeded one hour, and re-routed those claims to office-based adjusters using SAS.
  • Developed different statistics for the analysis and reduction of policy Quote processing time using SAS.
  • Participated in internal training courses like “Advance Business Analytics for Research Analyst” and “Data Governance”.
  • Participated in SAS user group meetings and SAS training courses covering a range of topics, from “What’s new in SAS v9.4” and “Optimization of SAS code” to “Deploying SAS code in production servers”.
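
The core routing logic behind the drive time optimization bullet above amounts to a threshold rule over adjuster drive times; a minimal sketch (the zip codes, minutes, and one-hour threshold are illustrative stand-ins for the SAS implementation):

```python
# Hypothetical drive times (minutes) from field adjusters to serviced zip codes.
drive_minutes = {"53703": 35, "53818": 95, "54601": 140, "53511": 55}

THRESHOLD = 60  # re-route claims when the drive time exceeds one hour

def route(zip_code):
    """Return 'field' or 'office' for a claim in the given zip code."""
    return "office" if drive_minutes[zip_code] > THRESHOLD else "field"

routed = {z: route(z) for z in drive_minutes}
```

The real analysis also had to aggregate historical claim volumes per zip before choosing the cutoff, but the final assignment step is this simple comparison.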

Environment: Windows 7, Linux, SAS Enterprise Miner, SAS Enterprise Guide, Oracle, Mainframes, Informatica.

Confidential, Plano, TX

Analytics Consultant

Responsibilities:

  • Developed regression models for retail business data; built Boosted Tree and MARSplines models using STATISTICA.
  • Worked on PMML models generated by STATISTICA using data mining tools like KNIME, WEKA, the WEKA Scoring plugin, R, and Rapid Miner.
  • Provided knowledge transfer sessions on PMML models to the clients.
  • Published a technical document using LaTeX on “Usage of PMML models with various Open source data mining tools”.
  • Developed mappings to extract, merge, map, transform, and validate data from sources such as Oracle, Excel, and DB2 into the Oracle Demantra system using Informatica and Pentaho Kettle.
  • Developed test scenarios and unit tested the mappings built in Informatica for various internal systems.
  • Worked on migration and deployment of Informatica mappings on production system (Linux).

Environment: PostgreSQL, Oracle, DB2, Informatica, Pentaho Kettle, STATISTICA, PMML models, KNIME, WEKA, WEKA Scoring plugin (PDI), R, Rapid Miner, Windows 7, and Linux.

Confidential, Durham, NC

Analytics Consultant

Responsibilities:

  • Developed data mining solutions for the marketing and consumer market domains using Python and SAS.
  • Developed text mining solutions for Customer Interaction Analytics: wrote code to identify the add-on offers agents were making and whether each offer was accepted, which was used for agent training. Mined the text data (chats) to provide real-time recommendations using Python, Informatica, SQL, and SAS.
  • Developed models to detect the drop rate in the customer journey, which helped optimize chat stages and predict customer intent, using Python, SQL, and Informatica.
  • Developed code to extract, merge, map, transform, and validate data from source databases such as MySQL and PostgreSQL using Informatica; involved in data analysis and reporting using SAS.
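
The add-on offer detection described above can be sketched as a keyword pass over chat transcripts; the phrases, speakers, and sample chat below are invented for illustration (the production pipeline used Python with SAS and Informatica, and richer text mining than keyword matching):

```python
import re

# Hypothetical offer and acceptance phrases; real lexicons were mined from transcripts.
OFFER_PATTERN = re.compile(r"\b(would you like|can i add|upgrade to)\b", re.I)
ACCEPT_PATTERN = re.compile(r"\b(yes|sure|sounds good|go ahead)\b", re.I)

def analyze_chat(turns):
    """Return (offer_made, offer_accepted) for a list of (speaker, text) turns."""
    offer_made = accepted = False
    for speaker, text in turns:
        if speaker == "agent" and OFFER_PATTERN.search(text):
            offer_made = True
        elif speaker == "customer" and offer_made and ACCEPT_PATTERN.search(text):
            accepted = True
    return offer_made, accepted

chat = [
    ("agent", "Would you like to add device protection for $5/month?"),
    ("customer", "Sure, go ahead."),
]
result = analyze_chat(chat)
```

Aggregating these per-chat flags over agents gives the offer and acceptance rates used for training.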

Environment: Python, SAS, SQL Server, PostgreSQL, MySQL, Informatica, STATISTICA, PMML models, Windows 7/XP, and Linux.

Confidential, San Jose, CA

Sr. Analytics Consultant

Responsibilities:

  • Developed text mining solutions for scoring text based interactions for quantifying operational excellence of various agents in the customer service domain using SQL and Informatica.
  • Developed natural language processing methodologies for sentiment analysis and voice-of-the-customer solutions using Kettle, Python, and SAS.
  • Worked on query categorization of customer interactions using Data mining and machine learning algorithms.
  • Developed modules using Python and NLTK that improved the performance of the existing system.
  • Prototyped solutions for Social Media Analytics (Facebook and Twitter Data).
  • Developed enterprise Data mining solutions for the detection of different stages of customer textual interactions. Used classification algorithms in WEKA, R.
  • Aided in the invention of patent-pending solutions in the domain of customer data science and analytics, using text mining, machine learning, and ensemble feature selection methods.
  • Developed algorithms for analysis, tracking (Report generation) of web activity to link it with customer behavior using SQL.
  • Developed algorithms for segmentation and profiling of Web visitors using SQL, Python, SAS, and WEKA. Developed SQL procedures for tracking online campaign management.
  • Developed algorithms to predict customer behavior and customer purchase patterns in web log data using Python.
  • Developed algorithms for analysis and tracking of agent behavior; worked on models to detect whether a given chat was transferred to other departments. This analysis was used to route chats to trained agents more effectively.
  • Designed competitor analysis algorithms for product recommendation, trend analysis, and time series analysis using SQL.
  • Designed and implemented Data mining, machine learning algorithms like clustering, classification, association rule mining, and semi-supervised learning algorithms on structured, semi-structured, and unstructured Data.
  • Provided useful actionable analytical insights for clients to enhance their customer lifecycle management.
  • Developed business rules and extracted trends from Web log Data using SQL, Python and SAS.
  • Developed data mining solutions like classification, clustering, and regression for financial, credit card (Capital One, Bank of America), retail business, and e-commerce data.
  • Worked on Data mapping, extraction, transformation, and validation of Data from heterogeneous sources (Excel, CSV, Oracle, Flat Files, Text Format Data) to SQL Server Database using ETL tools like Informatica, Python, and MySQL.
  • Wrote complex cursors, queries involving complex joins, SQL functions for application processing and high-performance reporting purposes.
  • Delivered actionable customer insights, recommendations using Data and text mining tools like WEKA, Knime.
  • Delivered reports on the data using PROC MEANS, PROC UNIVARIATE, PROC FREQ, and PROC REPORT.
  • Participated in requirement analysis by gathering client requirements, business needs, and project objectives via feedback sessions and client meetings.
  • Worked with product engineering teams to implement these models into products. Transitioned the predictive solutions to the delivery teams.
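
A minimal sketch of the lexicon-based sentiment scoring behind the voice-of-the-customer work above (the lexicon, weights, and sample sentences are toy stand-ins; the original used NLTK and far larger vocabularies):

```python
# Toy sentiment lexicon; production lexicons are far larger and weighted.
LEXICON = {"great": 1, "helpful": 1, "love": 1,
           "slow": -1, "terrible": -1, "frustrated": -1}

def sentiment_score(text):
    """Sum lexicon weights over lowercase tokens; >0 is positive, <0 negative."""
    tokens = text.lower().split()
    return sum(LEXICON.get(t.strip(".,!?"), 0) for t in tokens)

score = sentiment_score("The agent was great and very helpful!")
```

Scoring every customer turn this way, then aggregating per interaction, is the simplest version of the interaction-scoring pipeline; the production version added negation handling and phrase-level features.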

Environment: Python, Perl, SAS, R, SQL Server, PostgreSQL, MySQL, MS Excel, WEKA, Pentaho Kettle, Informatica, Eclipse, Windows 7, and Linux.

Confidential, Sunnyvale, CA

Analytics Consultant

Responsibilities:

  • Designed research plans for data gathering and analysis; manipulated data sets using the SAS SET and MERGE statements and PROC SORT.
  • Built statistical models and forecasting tools using R, Python.
  • Performed Data collection, Data transformation, and Data validation.
  • Applied advanced SAS skills, including SAS/MACRO, SAS/GRAPH, SAS/BASE, and SAS/STAT to develop statistical models for predicting damage in plants (Agricultural Data).
  • Converted SAS Data sets to various file types, such as HTML, RTF, PDF, and Excel using SAS/ODS.
  • Wrote SQL queries and procedures against Oracle databases for Data analysis.
  • Developed Data validation and Data reconciliation SQL queries to confirm appropriate transformations of Data and to identify issues in source Data for Data cleanup using Informatica.
  • Analyzed and mined business Data to identify patterns and correlations among the various Data points.
  • Analyzed the Data flow and Data processing rules for a complex Data store.
  • Documented research papers, analysis reports, and presented research results with MS Word, Excel, PowerPoint, and SAS.
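
The statistical forecasting above was built in R and Python; a bare-bones ordinary least-squares trend fit over a univariate series illustrates the idea (the series values are synthetic, not actual agricultural data):

```python
def fit_trend(series):
    """Ordinary least-squares slope and intercept for y over t = 0..n-1."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    sxy = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    slope = sxy / sxx
    return slope, y_mean - slope * t_mean

def forecast(series, steps_ahead):
    """Extrapolate the fitted trend steps_ahead periods past the last observation."""
    slope, intercept = fit_trend(series)
    return intercept + slope * (len(series) - 1 + steps_ahead)

damage_index = [2.0, 2.5, 3.0, 3.5]  # hypothetical crop-damage index by season
next_season = forecast(damage_index, 1)
```

SAS/STAT's PROC REG produces the same slope and intercept; the pure-Python version just makes the normal-equation arithmetic explicit.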

Environment: Python, SAS, R, SQL Server, Oracle, Informatica, IBM DB2, and Linux.

Confidential

Data Scientist

Responsibilities:

  • Involved in Data collection, Data transformation, and Data validation from heterogeneous sources.
  • Developed code to access the Data from multiple (PostgreSQL, Oracle and MySQL) databases.
  • Created SQL queries to regulate and maintain complex Data. Forecasted changes in climatic trends.
  • Extensively used SAS report generating procedures including: PROC REPORT, PROC SQL, PROC FREQ, and PROC MEANS.
  • Implemented various data mining and image processing techniques to extract patterns from the satellite images obtained.
  • Implemented the approach from the paper “Titan: a High-Performance Remote-sensing Database”.
  • Involved in analysis and design; developed Use Case, Sequence, and Class diagrams using Rational Rose.
  • Designed and developed the User Interfaces using HTML.

Environment: Python, SAS, HTML, SQL Server, Oracle, and Windows XP.

Confidential

Research Analyst

Responsibilities:

  • Designed and implemented the “Distributed Database Management System.”
  • Handled synchronization of data during implementation, i.e., updates and deletes across the distributed sites.
  • Implemented Remote Procedure Calls using Perl.
  • Implemented standard algorithms of distributed databases like two-phase commit and two-phase locking.
  • Maintained the system and made it fault tolerant.
  • Implemented Data Mining algorithms to mine the patterns in the e-Sagu Data set (Agriculture Dataset).
  • Applied Attribute Oriented Induction (AOI) to predict and analyze the kind of diseases that affected the crops in different seasons and at different levels of growth of the crop.
  • Worked on a Hadoop system.
  • Used SAS/STAT procedures for univariate analysis, regression, and ANOVA, along with graph and plot procedures.
  • Used SAS report generating procedures like PROC REPORT, PROC SQL, PROC FREQ, and PROC MEANS.
  • Implemented the paper “The Accelerated K-means” published in ICPR05, analyzing and comparing the results with ordinary K-means on various kinds of data sets.
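
A plain Lloyd's-iteration K-means in pure Python gives the baseline that accelerated variants such as the one in the paper are compared against (the 1-D data, k, and iteration count below are illustrative only):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on 1-D points; returns sorted centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # Update step: move each centroid to its cluster mean (keep it if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
centers = kmeans(data, 2)
```

The accelerated algorithm avoids most of the distance computations in the assignment step via triangle-inequality bounds; the comparison in the bullet above measured that saving against this exhaustive version.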

Environment: Python, Perl, PostgreSQL, MySQL, LaTeX, Gnuplot, MS Office, and Linux.

Confidential

Junior Research Analyst

Responsibilities:

  • Focused on finding efficient algorithms to compute inverse skyline queries.
  • Proposed two efficient algorithms to calculate the skyline points in all subspaces of the K-dimensional space.
  • Introduced the notion of the inverse skyline query, extended the work with skyline membership queries and maximal skyline membership queries, and developed algorithms to compute each of them efficiently.
  • Worked as a teaching assistant for the courses Database Management Systems, Data Warehousing and Data Mining, and Distributed Database Management Systems.
  • Implemented clustering and classification methods for image retrieval along with normal image retrieval methods.
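
The skyline work above rests on Pareto dominance; a naive skyline computation over 2-D points (minimizing both coordinates, with invented sample data) shows the core dominance test that the efficient algorithms avoid applying exhaustively:

```python
def dominates(a, b):
    """True if a is at least as good as b everywhere and strictly better somewhere (minimizing)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    """Points not dominated by any other point (O(n^2) baseline)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# e.g. hotels as (price, distance); lower is better in both dimensions
hotels = [(100, 5), (80, 7), (120, 3), (105, 6), (110, 4)]
sky = skyline(hotels)
```

The inverse skyline question asked in the research flips this: given a point, find the queries (subspaces or preference weightings) under which it belongs to the skyline, rather than computing the skyline for a fixed query.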

Environment: C++, Python, SQL Server, LaTeX, Gnuplot, MS Office, Windows XP, and Linux.
