- Over 6 years of experience with large structured and unstructured datasets, data visualization, data acquisition, predictive modeling, and data validation.
- Developed, maintained, and taught new tools and methodologies related to data science and high-performance computing.
- Excellent knowledge of relational data warehouse/OLAP concepts, database design, and methodologies.
- Experience with Big Data technologies such as Hadoop and Spark.
- Extracted data from various database sources, including SQL Server, Oracle, and DB2.
- Experience working in Pricing and/or Revenue Management.
- Familiarity with agile principles (e.g., Scrum), facilitating workshops, and prototyping.
- Hands-on experience with RStudio for data preprocessing and building machine learning models on different datasets.
- Good knowledge of NoSQL databases such as HBase and MongoDB.
- Time series analysis (ARIMA), neural networks, sentiment analysis, forecasting, and text mining.
- Cluster analysis, principal component analysis, association rules, and recommender systems.
- Inferential statistics, hypothesis testing, descriptive statistics, and sampling.
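As an illustration of the principal component analysis listed above, here is a minimal NumPy sketch; the data is purely synthetic and not drawn from any client engagement:

```python
import numpy as np

def pca(X, n_components=2):
    """Project X onto its top principal components.

    X: (n_samples, n_features) array; returns (scores, explained_variance).
    """
    Xc = X - X.mean(axis=0)                 # center each feature
    cov = np.cov(Xc, rowvar=False)          # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]       # re-sort descending by variance
    components = eigvecs[:, order[:n_components]]
    return Xc @ components, eigvals[order[:n_components]]

# Toy data: 100 points stretched mostly along one axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * np.array([5.0, 1.0, 0.2])
scores, var = pca(X, n_components=2)
```

The first returned variance dominates because the synthetic data is deliberately stretched along one direction.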
Languages: C, C++, PL/SQL, SQL, T-SQL, XML, HTML, DHTML, HTTP, Matlab, Python
Databases: SQL Server 2017/2016/2014, MS-Access, Oracle 11g/10g/9i, Sybase 15.02 and DB2 2016.
Big Data Tools: Hadoop, Hive, Spark2.1.1, Pig, HBase, Sqoop, Flume
DWH / BI Tools: Microsoft Power BI, SSIS, SSRS, SSAS, Business Intelligence Development Studio (Visual Studio), SAP Business Objects 14.1 and Informatica 6.1.
Operating Systems: Microsoft Windows 7/XP, Linux and UNIX
Data Modeling Tools: Erwin r9.6/9.5, ER/Studio, Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables
Confidential, Memphis, Tennessee
- Implemented and developed models for analytical solutions.
- Developed and designed the approach and methodology for the application, identifying the apt algorithms for each business problem.
- Created ETL packages using SSIS to extract data from relational databases, then transform and load it into the data mart.
- Transformed and merged all weekly client data into a yearly file using SSIS ETL.
- Used Team Foundation Server for version control, source control, and reporting.
- Conducted knowledge-transfer (KT) sessions with the client to understand their various data management systems and the underlying data.
- Ran SQL scripts and created indexes and stored procedures for data analysis.
- Applied data lineage methodology for data mapping and maintaining data quality.
- Mapped the flow of trade-cycle data from source to target and documented the mappings.
- Performed QA on data extracted, transformed, and exported to Excel.
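The weekly-to-yearly merge described above can be sketched in Python; the SSIS package itself is not shown, and the file contents and column names here are hypothetical:

```python
import csv
import io

def merge_weekly(weekly_files):
    """Concatenate weekly CSV extracts (same header) into one yearly table.

    weekly_files: iterable of file-like objects; returns (header, rows).
    """
    header, rows = None, []
    for f in weekly_files:
        reader = csv.reader(f)
        file_header = next(reader)          # each extract repeats the header
        if header is None:
            header = file_header
        elif file_header != header:
            raise ValueError("weekly extracts have mismatched columns")
        rows.extend(reader)                 # append this week's records
    return header, rows

# Two hypothetical weekly extracts merged into one yearly set.
week1 = io.StringIO("client,amount\nacme,100\n")
week2 = io.StringIO("client,amount\nacme,250\nglobex,75\n")
header, rows = merge_weekly([week1, week2])
```

The header check mirrors the schema validation an SSIS data-flow task would enforce before loading the data mart.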
Environment: MS Office 2013, Microsoft reporting tools, Power BI, SSMS, SSIS 2016, Oracle 12c/11g, Big Data, Hadoop 2.8.1.
Confidential, Birmingham, Alabama
- Implemented Agile Methodology for building an internal application.
- Completed a highly immersive data science program involving data manipulation and visualization, web scraping, machine learning, Python programming, SQL, Git, Unix commands, NoSQL, MongoDB, and Hadoop.
- Experience with Hadoop ecosystem components (MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, Flume), including their installation and configuration.
- Validated the machine learning classifiers using ROC Curves and Lift Charts.
- Used Teradata 15 utilities such as FastExport and MultiLoad (MLOAD) for data migration/ETL tasks from OLTP source systems to OLAP target systems.
- Transformed data from various sources, organized data, and extracted features from raw and stored data.
- Updated Python scripts to match training data with our database stored in AWS Cloud Search, so that we would be able to assign each document a response label for further classification.
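Validating classifiers with ROC curves, as mentioned above, can be sketched without any ML library by ranking prediction scores; the labels and scores below are illustrative, not real model output:

```python
def roc_points(labels, scores):
    """Return (fpr, tpr) points of the ROC curve for binary labels/scores."""
    pairs = sorted(zip(scores, labels), reverse=True)  # most confident first
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for score, label in pairs:
        if label == 1:
            tp += 1                          # true positive at this threshold
        else:
            fp += 1                          # false positive at this threshold
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2
    return area

labels = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
points = roc_points(labels, scores)
```

Sweeping the threshold from most to least confident traces the curve; an AUC near 1.0 means the classifier ranks positives above negatives almost perfectly.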
Environment: Unix, Python 3.5, MLlib, SAS, NoSQL, Teradata, regression, logistic regression, Hadoop 2.7, OLTP, random forest, OLAP, HDFS, ODS, NLTK, SVM, JSON, XML, and MapReduce.
Confidential, Bartlesville, OK
- Used DOM and SAX parsers to parse the raw XML documents
- Used RAD as Development IDE for web applications.
- Used Factory, Singleton, and DAO design patterns based on the application requirements
- Involved in fixing bugs and minor enhancements for the front-end modules.
- Prepared and executed unit test cases
- Implemented the project in a Linux environment
- Used the Log4J logging framework to write log messages at various levels
- Created test plan documents for all back-end database modules
- Supported the testing team in system, integration, and UAT testing
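The DOM/SAX parsing mentioned above was done in Java; the same event-driven SAX pattern can be sketched with Python's standard library, with hypothetical XML content:

```python
import xml.sax

class TagCounter(xml.sax.ContentHandler):
    """SAX handler that counts occurrences of each element name."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def startElement(self, name, attrs):
        # Called once per opening tag as the document streams past.
        self.counts[name] = self.counts.get(name, 0) + 1

raw = "<orders><order id='1'/><order id='2'/></orders>"
handler = TagCounter()
xml.sax.parseString(raw.encode("utf-8"), handler)
```

Unlike DOM, SAX never builds the whole tree in memory, which is why it suits large raw XML documents.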
Confidential, Austin, Texas
- Responsible for creating the screens with table-less designs meeting W3C standards
- Designed table-less layouts using CSS and appropriate HTML tags as per W3C standards
- Designed CSS based page layouts that are cross-browser compatible on all the major browsers like Safari, Firefox, Chrome and IE
- Used JSON and AJAX to make asynchronous calls to the project server to fetch data on the fly
- Applied jQuery scripts for basic animation and end user screen customization purposes
- Implemented Service Oriented Architecture using JMS for sending and receiving messages while creating web services
- Developed web pages applying best standards
- Worked with System Analyst and the project team to understand the requirements
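The asynchronous-call pattern behind the AJAX work above (fetching JSON without blocking) can be sketched server-side with Python's asyncio; the endpoints and payloads here are simulated, not a real project API:

```python
import asyncio
import json

async def fetch_json(endpoint):
    """Simulate an asynchronous call that returns a parsed JSON payload."""
    await asyncio.sleep(0)  # stand-in for network latency
    payload = {"endpoint": endpoint, "status": "ok"}
    return json.loads(json.dumps(payload))  # round-trip, as a real API would

async def load_all(endpoints):
    # Issue all requests concurrently, as AJAX calls do in the browser.
    return await asyncio.gather(*(fetch_json(e) for e in endpoints))

results = asyncio.run(load_all(["/users", "/orders"]))
```

`gather` preserves request order in its results while letting the calls overlap in time.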
Confidential, Dallas, Texas
- Involved in the system study, analysis, and design of the project.
- Worked on developing a suitable GUI for the ATM locator.
- Developed stored procedures, triggers, and database tables in the SQL database.
- Participated in Code review and Quality Assurance.
- Participated in daily meeting for enhancing the features for the portal.
- Preparation and review of Unit Testing, Test Results review and other quality related work.
- Used JDBC and the application server's transaction API to access data from Oracle using standard statements.
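The JDBC access described above has a direct analogue in Python's DB-API; this sketch uses an in-memory SQLite database, and the table and city names are hypothetical:

```python
import sqlite3

def find_atms(conn, city):
    """Query ATM locations with a parameterized statement (the DB-API
    analogue of a JDBC PreparedStatement)."""
    cur = conn.execute(
        "SELECT name FROM atm_locations WHERE city = ? ORDER BY name",
        (city,),  # bound parameter, never string-concatenated
    )
    return [row[0] for row in cur.fetchall()]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE atm_locations (name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO atm_locations VALUES (?, ?)",
    [("Main St", "Dallas"), ("Elm Ave", "Dallas"), ("5th St", "Austin")],
)
names = find_atms(conn, "Dallas")
```

Binding parameters instead of concatenating strings is the same injection-safety discipline JDBC prepared statements enforce.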