Machine Learning Engineer passionate about cutting-edge technology and solving real-world problems. My experience includes solid programming skills and domain knowledge in the Finance and Healthcare industries.
AREAS OF EXPERTISE:
- Deep Learning
- Effective Communication
- Master Data Management
- Requirement Analysis
- Data Analysis
- Hyperparameter Tuning
- Big Data
- Design and Architecture
- Data Integration
- Quality Improvement
- Data Quality
- Data Modelling
- Data Processing
- Economics and Statistics
- Data Mining
- Healthcare and Banking Analytics
TECHNICAL SKILL SUMMARY:
Languages: C, C++, Java, SQL, HTML, XML and Python
Software: Core Java, J2EE, Web Services, IBM InfoSphere MDM Server and Informatica
Operating Systems: Windows, UNIX, Linux and Mac
Database and Server: Oracle, MySQL, DB2, SQL Server, MongoDB, Hadoop, Greenplum, WebSphere Application Server
Machine Learning: Python, NumPy, SciPy, scikit-learn, Pandas, BeautifulSoup, Seaborn, VADER, NLTK, ELMo, Google Cloud, Matplotlib, Plotly, Keras, TensorFlow, PySpark, Kafka, Speech Recognition, Word Embeddings, Regression and Classification Models
Tools/IDE: Eclipse, Rational Software Architect, Informatica PowerCenter, DBeaver, SQL Developer, Aginity Workbench for Greenplum, PyCharm, Anaconda, Hue, Hive, Spyder and Jupyter
Confidential, Jacksonville, Florida
- Gathered data from various sources such as MongoDB, SQL Server and Oracle for analysis.
- Integrated data collected from these sources using Python.
- Processed, cleansed, and verified the integrity of the data used for analysis.
- Analyzed the data using Python libraries, identified significant patterns and displayed them using Seaborn and Plotly.
- Worked on Natural Language Processing (NLP) to convert audio recordings and live microphone/speech data to text.
- Worked on Sentiment Analysis using technologies such as VADER, NLTK and ELMo.
- Worked on Transfer Learning, using pre-trained models with Deep Learning to train on low volumes of data.
- Used Keras to train a model that finds the sentiment score of speech-to-text data.
- Built the model using LSTM and Recurrent Neural Networks.
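The kind of Keras LSTM sentiment scorer described above can be sketched as follows. This is a minimal illustration, not the actual project code: the vocabulary size, sequence length, layer widths, and the random stand-in data are all assumptions.

```python
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE, MAXLEN = 5000, 50  # illustrative vocabulary and sequence length

model = Sequential([
    Embedding(VOCAB_SIZE, 64),       # token ids -> dense vectors
    LSTM(32),                        # recurrent layer over the token sequence
    Dense(1, activation="sigmoid"),  # sentiment score in [0, 1]
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy data standing in for tokenized speech-to-text transcripts and labels.
x = np.random.randint(0, VOCAB_SIZE, size=(8, MAXLEN))
y = np.random.randint(0, 2, size=(8, 1))
model.fit(x, y, epochs=1, verbose=0)
scores = model.predict(x, verbose=0)  # one sentiment score per transcript
```

The sigmoid output keeps each score in [0, 1], which is convenient when a single continuous sentiment value per utterance is wanted rather than a hard class label.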
Confidential, Tampa, Florida
Senior Software Engineer
- Defined the problem statement and explored the possibilities of using Deep Learning algorithms for solution identification on an A/B simulated dataset.
- Developed Data Preprocessing activities such as File Loading, Separation, Constant Removal, Missing-Value Handling, Dataset Balancing, Feature Scaling, Shuffling, Feature Extraction using Principal Component Analysis (PCA) and Batch Configuration for Gradient Descent.
- Developed Deep Neural Networks using the Activation Functions ReLU, Softmax, Tanh and Sigmoid.
- Developed a Deep Neural Network using the TensorFlow module in Python with 3 hidden layers, achieving a maximum accuracy of 87%.
- Developed a Deep Neural Network using the Keras module in Python with a maximum of 2 hidden layers, achieving an accuracy of 82%.
- Performed Hyperparameter Tuning using hidden-layer depth and width, Activation Functions (ReLU, Sigmoid, Tanh and Softmax), Cross-Validation, Dropout, Early Stopping, Epochs, Batch Size and Learning Rate.
- Performed hyperparameter tuning with node optimization and activation functions (ReLU in each hidden layer) and found the most accurate prediction.
- Applied Principal Component Analysis for Dimensionality Reduction and to identify the key Features to be used in the model.
- Visualized the individual and cumulative variance of the Principal Components using Plotly.
- Configured single and multiple Learning Rates in the model to speed up training.
- Developed a Framework to fine-tune hyperparameters by trying different combinations.
- Generated confusion matrices based on the outcomes of the TensorFlow and Keras models.
- Configured Topics and Partitions in Apache Kafka for Real-Time and Batch Data Streaming.
- Built real-time streaming data pipelines that reliably stream online/offline data to Kafka Topics.
- Developed Java and Python code to publish and consume messages from Kafka Topics.
- Developed and used Spark Streaming in Python to consume messages from Kafka Topics.
- Developed Spark code in Python to consume and process real-time unstructured data (log files) and save the processed data to .csv files for use in the Deep Learning model.
- Developed and used Spark SQL in Python to connect to Hadoop Hive and retrieve data.
- Developed Python code for algorithms such as Regression, Classification and Neural Networks.
- Used the Informatica tool to improve Data Quality via Cleansing, Standardization and Validation.
- Consumed data from the Hadoop Data Lake using Hue/Hive and from Greenplum.
- Fetched data from multiple sources via complex SQL queries and used Big Data sets for the Neural Network.
- Documented Data Quality and other associated processes, including research and findings.
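The per-record transformation at the heart of the log-processing pipeline above (raw log line in, CSV row out) can be sketched as below. The log-line format, field names, and regular expression are invented for illustration; the actual Spark job would apply the same kind of parsing to each streamed record.

```python
import csv
import io
import re

# Hypothetical log-line format, invented for illustration:
# "2023-01-05 12:00:01 INFO user=42 action=login"
LOG_RE = re.compile(r"^(\S+ \S+) (\w+) user=(\d+) action=(\w+)$")

def parse_line(line):
    """Return a CSV row for one raw log line, or None if it is malformed."""
    m = LOG_RE.match(line.strip())
    if not m:
        return None
    timestamp, level, user_id, action = m.groups()
    return [timestamp, level, user_id, action]

def logs_to_csv(lines):
    """Convert an iterable of raw log lines into CSV text with a header row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["timestamp", "level", "user_id", "action"])
    for line in lines:
        row = parse_line(line)
        if row is not None:  # silently drop malformed records
            writer.writerow(row)
    return buf.getvalue()
```

Keeping the parsing logic in a pure function like `parse_line` makes it easy to unit-test and to hand to Spark as a map function over the stream.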
- As a solution, the Project helped the Business identify duplicate Providers and Vendors across multiple sources by creating a Golden Record.
- Provided technical expertise in requirements solicitation, System Analysis, Architecture, Technical Design, Development and Documentation of the Provider MDM Project.
- Prepared detailed Functional Specifications and designed Data Models and System Workflows from which the application was built and implemented.
- Acted as a leader in the Analysis, Design, Development, Configuration, Documentation, Data Modeling, Performance Tuning and Implementation of the Provider MDM Project.
- Interacted with Business Data Stewards and Business Data Governors to gather Requirements and articulate them into Technical Design Documents and Business Requirement Documents.
- Interacted with Business Stewards and Data Governors to understand and define Data Quality, Cleansing and Standardization rules.
- Interacted with Business Stewards to understand issues encountered in MDM and provided the desired solutions.
- Assessed customer problems, articulated them and proposed solutions.
- Prepared Design Documents, Requirement Documents and Database Model Designs, and performed Code Reviews.
- Managed an Onshore/Offshore development model.
- Designed the Architecture of the Project, which includes technologies such as DataStage, QualityStage, MDM, DB2 and WebSphere Application Server.
- Performed the Analysis, Design, Development and Implementation of a major release of the Project.
- Proposed solutions to MDM problems.
- Worked as the Lead of the MDM project.
- Customized the SIF Parser and Maintenance Framework in MDM.
- Contributed to the CMR/MDM Application for the APAC, China, EMEA and G2C regions.
- Implemented MDM communication with TIBCO and ODSC through JMS and RMI respectively.
- Applied the concepts of WebSphere Clustering, Load Balancing, High Availability, Failover and Oracle RAC in line with MDM.
- Applied the concepts of Wide IP/Big IP in the context of the Pitney Bowes Application.
- Supported and troubleshot environments such as DIT, SIT, UAT, STRESS, COB, PAT and Production for MDM and Pitney Bowes.
- Served as the key resource for all MDM and Pitney Bowes Infrastructure, Implementation, Deployment, Production Issues and Performance Tuning for the entire APAC, China and EMEA regions.
- Involved in the Architecture, Design and Modeling of the Project; trained new resources on MDM.
- Solely responsible for MDM Installation, Configuration, Tuning, Implementation and Testing.
- Received Star of the Month for the successful implementation of Web Services in MDM.
- Customization and Configuration of Business Proxies, Web Services, Batch Framework and SDP.
- Involved in the analysis and design of a couple of modules with the Project Lead.
- Involved in development activities for the GUI Interface, Spring Injections, DAOs, etc.
- Tested the application, fixed bugs and helped junior developers.
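The hyperparameter-tuning framework mentioned in the Tampa role (trying combinations of hidden-layer depth, activation function, learning rate, etc.) amounts to an exhaustive grid search, which can be sketched as below. The parameter grid and the `mock_evaluate` scoring function are entirely hypothetical; in practice `evaluate` would train a network with the given parameters and return its validation accuracy.

```python
from itertools import product

def tune(param_grid, evaluate):
    """Exhaustively try every combination in the grid; return the best one."""
    best_params, best_score = None, float("-inf")
    keys = sorted(param_grid)
    for values in product(*(param_grid[key] for key in keys)):
        params = dict(zip(keys, values))
        score = evaluate(params)  # e.g. validation accuracy of a trained model
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# A stand-in scoring function; a real one would train and validate a network.
def mock_evaluate(params):
    score = 0.1 * params["hidden_layers"] + params["learning_rate"]
    if params["activation"] == "tanh":
        score -= 0.05
    return score

grid = {
    "hidden_layers": [1, 2, 3],
    "activation": ["relu", "tanh"],
    "learning_rate": [0.01, 0.1],
}
best_params, best_score = tune(grid, mock_evaluate)
```

Separating the search loop from the scoring function is what makes such a framework reusable: the same `tune` works for TensorFlow models, Keras models, or anything else that can be scored.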