Developer Resume
SUMMARY:
- An IT professional with 8+ years of progressive experience in software engineering, database engineering, and data analytics. Strong collaborator who learns quickly and adapts to new technologies.
- Experience using Python for large-scale, high-throughput data processing and database programming for data storage, retrieval, and processing.
- Experience automating processing and deployment workflows with Python.
- Experience in transforming and warehousing data from different data sources.
- Experience in all phases of database and data warehouse development.
- Experience in data integration, profiling, validation, cleansing, and transformation using Python and R.
- Extensive experience handling large volumes of unstructured data, including transformation and database storage.
- Experience in database development from scratch, including database design, object creation, and migration from Oracle, MySQL, and DB2 to MS SQL Server.
- Hands-on experience in the design, management, and visualization of databases using Oracle, MySQL, DB2, and SQL Server.
- Experience with Apache Spark (PySpark) for large-scale data processing.
- Experience with Microsoft Azure SQL.
- In-depth understanding of NoSQL databases such as MongoDB and Cassandra.
- Experience with cloud databases such as Amazon DynamoDB and Azure Cosmos DB.
- Strong experience provisioning virtual clusters on AWS and Azure, including AWS services such as EC2, S3, and EMR.
- Good knowledge of dimensional data modelling.
- Excellent database administration (DBA) skills, including user authorization and the creation of databases, tables, indexes, and backups.
TECHNICAL SKILLS:
Languages: Python 3, C++, Java, R, Perl
Python and R: NumPy, SciPy, pandas, PGP encryption, Python database connectivity, pytest, Jupyter, Anaconda, PyCharm, scikit-learn, Flask, Django
Cloud: Google Cloud Platform, AWS (EC2, S3, EMR), Azure, Azure SQL Database, Azure Analysis Services, Azure Data Warehouse, Redis, CloudWatch
Web Technologies: HTML5, XML, CSS3, Web Services, WSDL, Twitter Bootstrap
Data Modelling Tools: Erwin r9.6/9.5/9.1/8.x, Rational Rose, ER/Studio, MS Visio, SAP PowerDesigner
Big Data Technologies: Spark
Query Languages: SQL, Spark SQL
Databases: SQL Server, MySQL, PostgreSQL, Teradata, MongoDB, Cassandra
Reporting Tools: MS Office (Word/Excel/PowerPoint/Visio), Tableau, Crystal Reports XI, Business Intelligence, SSRS, Business Objects 5.x/6.x, Cognos 7.0/6.0
ETL Tools: Informatica PowerCenter, SSIS
Version Control Tools: SVN, GitHub
BI Tools: Tableau, Amazon Redshift, Power BI
Operating System: Windows, Linux, Unix, macOS, Red Hat
PROFESSIONAL EXPERIENCE:
Confidential
Developer
Responsibilities
- Involved in data transformation using Python and R.
- Used Python to visualize data and implement machine learning algorithms.
- Developed and designed an API (RESTful Web Service) for the chatbot integration.
- Worked on speech-to-text and text-to-speech services in Python for voice conversation data on Google Cloud Platform.
- Built CI/CD pipelines using Docker, Jenkins, and Marathon.
- Worked on message data storage in Kafka and Cassandra.
- Coded Scala programs to process message streams.
- Performed data cleaning, feature scaling, and feature engineering using Python.
- Created data quality scripts in SQL to validate successful data loads and the quality of the data. Created various types of data visualizations using Python and R.
- Monitored the application using AppDynamics and Google Analytics.
- Created REST APIs to make models production-ready (a minimal serving sketch follows this section).
- Communicated results to the operations team to support decision-making.
- Collected data needs and requirements by interacting with other departments.
- Managed the company's virtual servers on Amazon EC2 and S3.
- Worked on web traffic analytics using Google Analytics.
- Automated the daily and weekly build process, enabling two builds a day for faster turnaround.
- Automated the code release process, reducing the total time for a code release from 20 hours to 10 minutes.
- Developed a fully automated continuous integration system using Bitbucket, Docker, Jenkins, and custom tools developed in Python and Bash.
- Implemented a review process for integration, re-training, and re-deployment automation using Python programs, eliminating the need for hourly monitoring.
Environment: Python 3
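A minimal sketch of the kind of model-serving REST endpoint mentioned above, using Flask; the model file name, request payload, and route are illustrative assumptions, not the actual production code.

# Hypothetical Flask endpoint that serves predictions from a pickled model.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load a previously trained model from disk (file name is an assumption).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body such as {"features": [1.2, 3.4, 5.6]}.
    payload = request.get_json(force=True)
    prediction = model.predict([payload["features"]])
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)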
Confidential
Programmer Analyst
Responsibilities
- Worked on Oracle database object creation and stored procedures (a short sketch of calling a procedure from Python follows this list).
- Worked on Python automation scripts and reporting modules.
- Worked on chatbot development to provide relevant product information to customers.
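A minimal sketch of a Python automation step that calls an Oracle stored procedure; the connection details and the procedure name are hypothetical and shown only to illustrate the pattern.

# Hypothetical reporting automation step using the cx_Oracle driver.
import cx_Oracle

connection = cx_Oracle.connect(user="report_user", password="secret",
                               dsn="dbhost:1521/ORCLPDB1")
try:
    cursor = connection.cursor()
    # Call a stored procedure that refreshes a reporting table (hypothetical name).
    cursor.callproc("refresh_daily_report", ["2020-01-31"])
    connection.commit()
    cursor.close()
finally:
    connection.close()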
Confidential
Data analytics engineer
Responsibilities
- Developed Python (PySpark) programs for big data processing; processed a large number of zipped data files, 3.03 TB each (a minimal sketch follows this list).
- Developed shell scripts for process automation: setting up virtual machines on Azure, retrieving data from Azure Blob Storage, running production jobs, and transferring data between virtual machines using Azure File Share.
- Designed and developed databases, tables, and indexes on Microsoft SQL Server 2016 for storing processed data on an Azure Windows VM.
- Worked on Azure Analysis Services.
- Coded Scala programs for processing tick history files.
- Migrated on-premises infrastructure to Azure and set up a Microsoft High Performance Computing (HPC) cluster on the Azure cloud.
- Migrated data from an Azure Ubuntu VM and performed ETL into Microsoft SQL Server 2016.
- Built and configured an Ubuntu server cluster.
- Set up Azure SQL Database and Azure SQL Data Warehouse.
- Implemented discretization and binning, and data wrangling: cleaning, transforming, merging, and reshaping data frames (a pandas sketch follows this list).
- Tuned the Spark cluster to achieve maximum parallelism.
- Studied the business and its functionality through communication with business analysts.
- Analyzed the existing database for performance and suggested ways to redesign the model to improve system performance.
- Supported ad-hoc, standard reporting.
- Designed and implemented many standard processes that are maintained and run on a scheduled basis.
- Created reports using MS Access and Excel, applying filters to retrieve the best results.
- Developed stored procedures, SQL joins, and SQL queries for data storage and retrieval, accessed data for analysis, and exported it to CSV and Excel files.
- Developed data mapping specifications to create and execute detailed system test plans; the mappings specify what data is extracted from an internal data warehouse, transformed, and sent to an external entity.
- Analyzed business requirements, system requirements, and data mapping requirement specifications.
- Documented functional requirements and supplementary requirements in Quality Center.
- Tested Complex ETL Mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Wrote and executed unit, system, integration and UAT scripts in a data warehouse project.
- Developed SQL scripts, stored procedures, and views for data processing, maintenance, and other database operations.
- Performed SQL tuning, optimized the database, and created technical documents.
- Imported Excel sheets, CSV and delimited data, and ODBC-compliant data sources into the database, using advanced Excel features, for data extraction, data processing, and business needs.
- Designed and optimized SQL queries and exported the data into the database server.
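A minimal PySpark sketch of the kind of large-file processing referred to in this role; the storage path, schema inference, and column names are illustrative assumptions.

# Hypothetical PySpark job that reads gzipped CSV files and writes daily aggregates.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("zipped-file-processing")
         .getOrCreate())

# Spark reads .gz files transparently; each gzipped file becomes one input split.
raw = spark.read.csv("/data/ticks/*.gz", header=True, inferSchema=True)

# Example transformation: daily tick counts and average price per instrument
# (the column names are assumptions for this sketch).
daily = (raw.groupBy("instrument", F.to_date("timestamp").alias("trade_date"))
            .agg(F.count("*").alias("tick_count"),
                 F.avg("price").alias("avg_price")))

daily.write.mode("overwrite").parquet("/output/daily_aggregates")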
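A small pandas sketch of the discretization/binning and reshaping steps mentioned above; the column names and bin edges are assumptions chosen only for illustration.

# Hypothetical example of binning a continuous column and pivoting the result.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "customer": ["a", "b", "c", "d"],
    "region":   ["east", "west", "east", "west"],
    "spend":    [120.0, 430.5, 89.9, 250.0],
})

# Discretize the continuous column into labeled bins.
df["spend_band"] = pd.cut(df["spend"],
                          bins=[0, 100, 300, np.inf],
                          labels=["low", "medium", "high"])

# Reshape: average spend per region and band.
summary = df.pivot_table(index="region", columns="spend_band",
                         values="spend", aggfunc="mean", observed=False)
print(summary)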
Confidential
Research Assistant
Responsibilities:
- Designed and developed a machine-learning, intent-based chatbot (NLP and NLU tools and technologies) using Python 3, spaCy, support vector machines, and the Flask framework.
- Worked on web scraping using Beautiful Soup.
- Built the application interface and scripts using OO design, UML modeling, and dynamic data structures.
- Worked with Dialogflow and Microsoft Azure Bot Framework to explore Bot Features.
- The chatbot is intent-based and uses the NLP library spaCy to convert the user's free-form text into sentence vectors (a sketch follows this list).
- Used cosine similarity as the similarity measure between the user's text and the text in the training set.
- Created entity and intent training sets in JSON format to train machine learning models.
- Used machine learning techniques to match the user's text to an intent.
- Used Rasa NLU as the library for chatbot development.
- Developed the chatbot's knowledge base (engine) and client application using Python 3.
- The chatbot also uses the Wikipedia web API or the Wolfram Alpha web API to support responses.
- Developed the chatbot's front end using the Flask framework and AngularJS.
- Hardened database queries to prevent SQL injection attacks.
- Worked with MySQL for storing bot conversation data.
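A minimal sketch of intent matching with spaCy sentence vectors and cosine similarity, as described above; the intents, example utterances, and model name are illustrative assumptions.

# Hypothetical intent matcher: pick the training example whose vector is
# closest (by cosine similarity) to the user's text.
import numpy as np
import spacy

# A model with word vectors (for example en_core_web_md) is needed for useful similarity.
nlp = spacy.load("en_core_web_md")

training_set = {
    "store_hours": ["what time do you open", "when do you close"],
    "order_status": ["where is my order", "track my package"],
}

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def match_intent(text):
    user_vec = nlp(text).vector
    best_intent, best_score = None, -1.0
    for intent, examples in training_set.items():
        for example in examples:
            score = cosine(user_vec, nlp(example).vector)
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent, best_score

print(match_intent("when does the store open"))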
Confidential
Database Administrator (DBA)
Responsibilities:
- Involved in managing data inventory on Oracle and DB2 databases.
- Involved in performance tuning for faster data retrieval.
- Worked on data migration from different data sources.
- Coded programs for networking and telecom data retrieval and processing.
Confidential
Test Lead
Responsibilities:
- Worked on health care claim processing modules.
- Ensured the quality of health-care claims data.
- Mentored junior team members on program design and application architecture.
Confidential
Software Engineer
Responsibilities:
- Maintained Fidelity’s network modules written in Python.
- Worked on Fidelity intranet written using Flask and Django.
- Worked on enhancement and maintenance of Bankway product modules (Fidelity's banking suite).
Capgemini
Associate Consultant
Responsibilities:
- Developed a fully automated continuous integration system using Git, DB2 LUW and custom tools developed in Python and Bash.
- Implemented an automated bank-letter data-fetching process in Python to publish bank letters during off-hours, with a checkpoint mechanism and an automated process-completion e-mail to stakeholders (a minimal sketch follows this list).
- Ported reporting system programs from legacy systems to Python.
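A minimal sketch of an off-hours batch job with a checkpoint file and a completion e-mail, in the spirit of the process described above; the file paths, fetch function, addresses, and SMTP host are hypothetical.

# Hypothetical checkpointed fetch-and-publish job with an e-mail notification.
import json
import smtplib
from email.message import EmailMessage
from pathlib import Path

CHECKPOINT = Path("bank_letters.checkpoint.json")

def load_checkpoint():
    return json.loads(CHECKPOINT.read_text()) if CHECKPOINT.exists() else {"last_id": 0}

def save_checkpoint(state):
    CHECKPOINT.write_text(json.dumps(state))

def fetch_letters(after_id):
    # Placeholder for the real data-fetching call (database or service).
    return []  # e.g. [{"id": 101, "content": "..."}, ...]

def notify(count):
    msg = EmailMessage()
    msg["Subject"] = f"Bank letters job finished: {count} letters published"
    msg["From"] = "batch@example.com"
    msg["To"] = "stakeholders@example.com"
    msg.set_content("The off-hours bank letters job completed successfully.")
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)

state = load_checkpoint()
letters = fetch_letters(state["last_id"])
for letter in letters:
    # ... publish the letter, then advance the checkpoint so a rerun resumes here.
    state["last_id"] = letter["id"]
    save_checkpoint(state)
notify(len(letters))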
Confidential
Member Technical
Responsibilities:
- Conducted cost benefit analysis of various ETL tools and technologies.
- Created UNIX shell scripts and processes to extract data from various sources such as DB2 and Oracle.
- Implemented complex transformations by creating custom transformations in addition to using built-in ones.
- Oversaw unit and system tests and assisted users with acceptance testing.
- Conducted peer design and code reviews and extensive documentation of standards, best practices, and ETL procedures.
- Played a role in the design of scalable, reusable, and low-maintenance ETL templates.
- Worked with logical and physical designs for large databases.
- Worked on performance tuning of database objects; analyzed Oracle and DB2 queries and tuned them using multi-row fetch cursors, temporal tables, and other tuning techniques.
- Worked with temporal database tables.
- Worked on technical data conversions between Autopay and Workaday systems.