AWS Data Engineer Resume
Glendale, CA
SUMMARY
- 7+ years of experience in software development, including design and development of enterprise and web-based applications.
- Hands-on technical experience in Python, Java, Q++ (Mastercraft), DB2 SQL, and R programming, with primary exposure to the P&C Insurance domain.
- Experience with Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS, Elastic Load Balancing, Amazon SQS, AWS Identity and Access Management, Amazon SNS, AWS CloudWatch, Amazon EBS, Amazon CloudFront, VPC, DynamoDB, Lambda, and Redshift).
- Worked on projects such as storage prediction, price prediction, speech recognition, text classification, and handwritten digit recognition using R, Python, NLP, data analysis, data management, neural networks, and programming skills.
- Experience in web/application development using Python, Django, HTML, XML, CSS, JavaScript, jQuery, MySQL, PostgreSQL, and SQLite.
- Architected server-side applications using Django and Flask.
- Working knowledge of Kubernetes to deploy, scale, load balance, and manage Docker containers.
- Good knowledge of data Extraction, Transformation, and Loading (ETL) using tools such as SQL Server Integration Services (SSIS) and Data Transformation Services (DTS).
- Experience in database design and development with Business Intelligence using SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), OLAP cubes, Star schema, and Snowflake schema.
- Data ingestion to Azure services and processing of the data in Azure Databricks.
- Created and enhanced CI/CD pipelines to ensure business analysts can build, test, and deploy quickly.
- Hands-on in the design, development, testing, and implementation of web applications using Python, Django, HTML, XML, CSS, JavaScript, Bootstrap, jQuery, JSON, AngularJS, and Node.js.
- Built data warehouses using Star and Snowflake schemas.
- Extensive knowledge of exploratory data analysis, big data analytics using Spark, and predictive analysis using linear and logistic regression models, with a good understanding of supervised and unsupervised algorithms.
- Worked with statistical techniques such as linear/logistic regression, correlation tests, ANOVA, chi-square analysis, and k-means clustering.
- Hands-on experience visualizing data using Power BI, Tableau, R (ggplot), and Python (pandas, matplotlib, NumPy, SciPy); see the sketch at the end of this summary.
- Experience in Python software development using libraries such as Beautiful Soup, python-twitter, NumPy, pandas DataFrames, urllib2, SciPy, NetworkX, unittest, matplotlib, and MySQLdb for database connectivity, and IDEs including Spyder, Sublime Text, Emacs, and PyCharm.
- Experienced in design, management and visualization of databases using Oracle and MySQL.
- Deep understanding of front-end engineering principles and experience with front-end technologies and frameworks.
- Proficient in SQL databases (MS SQL Server, MySQL, Oracle DB, Postgres) as well as DynamoDB and MongoDB.
- Performed different testing methodologies including unit testing, integration testing, web application testing, and Selenium testing.
- Good knowledge of writing Data Analysis Expressions (DAX) in tabular data models.
- Hands-on knowledge of designing database schemas with normalization.
- Proficient in all phases of the Software Development Life Cycle (SDLC), including requirements gathering, analysis, design, reviews, coding, unit testing, and integration testing.
- Well versed with Scrum methodologies.
- Analyzed the requirements and developed use cases, UML diagrams, class diagrams, and sequence and state machine diagrams.
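A minimal sketch of the kind of data visualization work described in this summary, using pandas and matplotlib; the CSV file and column names (claims.csv, policy_type, claim_amount) are hypothetical, for illustration only.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical dataset and column names, for illustration only
df = pd.read_csv("claims.csv")

# Basic exploratory summary of the numeric columns
print(df.describe())

# Mean claim amount per policy type as a bar chart
df.groupby("policy_type")["claim_amount"].mean().plot(kind="bar")
plt.xlabel("Policy type")
plt.ylabel("Mean claim amount")
plt.title("Mean claim amount by policy type")
plt.tight_layout()
plt.show()
```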
TECHNICAL SKILLS
Languages: Python, Java, R, C, C++
Tools: PyCharm, Visual Studio, RStudio, Power BI, Tableau, SAS Studio, Gephi, Eclipse, PuTTY, Mainframes, Excel, Jupyter Notebook, Azure Databricks
Operating System: Windows, Unix, Linux
Databases: Oracle, MySQL, SQL, NoSQL (MongoDB), PostgreSQL
Methodologies: Waterfall, Agile
Cloud Services: Amazon Web Services (AWS)
PROFESSIONAL EXPERIENCE
Confidential, Glendale, CA
AWS Data Engineer
Responsibilities:
- Developed RESTful microservices using Django and deployed them on AWS using EBS and EC2.
- Worked on Flask and Snowflake queries.
- Used Jenkins as the continuous integration and delivery platform over Git.
- Automated most daily tasks using Python scripting.
- Involved in CI/CD pipeline management for the weekly releases.
- Built web applications using Python, Django, AWS, J2EE, PostgreSQL, MySQL, Oracle, and MongoDB.
- Developed test automation framework scripts using Python and Selenium WebDriver.
- Made recommendations to the team on appropriate testing techniques and shared testing tasks.
- Defined Django API profiling techniques to render information faster.
- Later migrated applications from Django to Flask and from NoSQL (DynamoDB) to SQL (Snowflake).
- Used Jenkins, AWS, and Bitbucket environments.
- Worked on automating, configuring, and deploying instances on AWS, Azure, and data-center environments; familiar with EC2, CloudWatch, CloudFormation, and managing security groups on AWS.
- Hands-on in the design, development, testing, and implementation of web applications using Python, Django, HTML, XML, CSS, JavaScript, Bootstrap, jQuery, JSON, AngularJS, and Node.js.
- Troubleshot production issues related to AWS cloud resources and application infrastructure.
- Developed shell scripts for cleansing and validating data files using utilities such as AWK, sed, grep, and other UNIX commands.
- Developed shell scripts implementing PL/SQL queries for data migration and batch processing.
- Worked with JSON-based REST web services.
- Built RESTful web services using a Python REST API framework.
- Worked on improving existing machine learning algorithms to extract data points accurately.
- Responsible for setting up a Python REST API framework using Flask; see the sketch at the end of this role.
- Engaged in Design, Development, Deployment, Testing and Implementation.
Environment: Python 3.6, Django Framework 1.3, Flask Framework, AngularJS, CSS, DynamoDB, MySQL, HTML5/CSS, Snowflake, Amazon Web Services (AWS), Azure, S3, EC2, PyCharm, Visual Studio Code, Linux, Shell Scripting.
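A minimal sketch of a Flask-based REST API setup of the kind referenced in this role; the route, payload fields, and in-memory store are hypothetical, not the production service.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store standing in for the real backend
ITEMS = {}

@app.route("/items/<item_id>", methods=["GET"])
def get_item(item_id):
    # Return 404 if the item is unknown
    if item_id not in ITEMS:
        return jsonify({"error": "not found"}), 404
    return jsonify(ITEMS[item_id])

@app.route("/items/<item_id>", methods=["PUT"])
def put_item(item_id):
    # Store the JSON request body under the given id
    ITEMS[item_id] = request.get_json()
    return jsonify({"status": "stored"}), 201

if __name__ == "__main__":
    app.run(debug=True)
```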
Confidential - New York, NY
AWS Data Engineer
Responsibilities:
- Developed a Python/Django application for Google Analytics aggregation and reporting.
- Used Django configuration to manage URLs and application parameters.
- Worked on Python OpenStack APIs.
- Used Python scripts to update and manipulate content in the database.
- Hands-on experience with IAM to set up user roles with corresponding user and group policies using JSON, and to add project users to the AWS account with multi-factor authentication enabled and least-privilege permissions.
- Utilized the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS, and to create nightly AMIs of mission-critical production servers as backups; see the sketch at the end of this role.
- Experience with EC2, CloudWatch, Elastic Load Balancing, and managing security on AWS.
- Used AWS Lambda to run code without provisioning servers; ran queries from Python using the Python-MySQL connector and MySQL database package.
- Added support for Amazon AWS S3 and RDS to host files and the database in the Amazon cloud.
- Designed a high-availability environment for application and database servers on EC2 using ELB and Auto Scaling.
- Performed data mapping from source systems to target systems and logical data modeling; created class diagrams and ER diagrams, and used SQL queries to filter data.
- Worked on data preprocessing and cleaning to perform feature engineering, and applied data imputation techniques for missing values in the dataset using Python.
- Designed the front end and back end of the application using Python on the Django web framework.
- Configured and networked Virtual Private Cloud (VPC) and CloudFront.
- Extensive experience focusing on services such as IAM and S3.
Environment: Python 2.6/2.7, JavaScript, Django Framework 1.3, CSS, SQL, MySQL, jQuery, Apache web server, GitHub, PostgreSQL, Amazon Web Services (AWS), Azure, S3, EC2, EBS, PyCharm, Visual Studio Code, Linux, Shell Scripting.
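A minimal sketch of the nightly backup automation described in this role. The resume mentions the AWS CLI; boto3 is used here so the example is a self-contained Python script. The bucket name, file path, and instance ID are hypothetical.

```python
import datetime
import boto3

s3 = boto3.client("s3")
ec2 = boto3.client("ec2")

# Hypothetical names and IDs, for illustration only
BUCKET = "nightly-backups"
DATA_FILE = "/var/data/ephemeral_store.db"
INSTANCE_ID = "i-0123456789abcdef0"

stamp = datetime.date.today().isoformat()

# Copy the ephemeral data store to S3 under a dated prefix
s3.upload_file(DATA_FILE, BUCKET, f"stores/{stamp}/ephemeral_store.db")

# Create a nightly AMI of the production server without rebooting it
ec2.create_image(
    InstanceId=INSTANCE_ID,
    Name=f"nightly-backup-{stamp}",
    NoReboot=True,
)
```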
Confidential
Data Engineer
Responsibilities:
- Responsible for gathering requirements, system analysis, design, development, testing, and deployment. Developed tools for collecting and extracting data maintained by hospitals, healthcare providers, or federal and state agencies. Collected and analyzed information such as medical records, insurance claims, and billing information, then identified patterns or trends in the data and recommended ways to use that information to improve industry efficiency and performance.
- Developed and tested many dashboard features using React, Next.js, Flask, CSS, and JavaScript.
- Involved in the analysis, specification, design, implementation, and testing phases of the Software Development Life Cycle (SDLC) and used Agile methodology for developing applications.
- Upgraded the existing UI with HTML, CSS, jQuery, and Bootstrap.
- Worked as an application developer experienced with controllers, views, and models in Django.
- Used SaltStack to configure and manage the infrastructure.
- Built RESTful web services using a Python REST API framework.
- Used AWS CloudWatch to monitor and store logging information.
- Implemented the application using Python Spring IoC (Inversion of Control) and the Django framework, and handled security using Python Spring Security.
- Participated in the complete SDLC process and created business logic using Python/Django.
- Created a MySQL database and wrote several queries to extract data from it.
- Designed and developed RESTful web services for both consumer and producer using Django.
- Used Django framework for application development.
- Created a MySQL database and wrote several queries and Django APIs to extract data from it.
- Worked with JSON-based REST web services.
- Added support for Amazon AWS S3 and RDS to host static/media files and the database in the Amazon cloud.
- Developed microservices using AWS EMR, Lambda, API Gateway, DynamoDB, and RDS as appropriate for each scenario.
- Ingested data into Azure services and processed it in Azure Databricks.
- Responsible for setting up a Python REST API framework using Flask.
- Wrote Python scripts to extract data from HTML files; see the sketch at the end of this role.
- Effectively communicated with the external vendors to resolve queries.
- Used Git for version control.
Environment: Python, Django, HTML5, CSS, Bootstrap, JSON, JavaScript, RESTful web services, MySQL, SQLite, Cassandra, AWS (EC2, S3), PyUnit, Jenkins.
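A minimal sketch of the HTML data-extraction scripts mentioned in this role, using Beautiful Soup (listed in the summary section); the input/output file names and table layout are hypothetical.

```python
import csv
from bs4 import BeautifulSoup

# Hypothetical input file, for illustration only
with open("report.html", encoding="utf-8") as fh:
    soup = BeautifulSoup(fh, "html.parser")

rows = []
for tr in soup.select("table tr"):
    # Collect the text of every header or data cell in the row
    cells = [td.get_text(strip=True) for td in tr.find_all(["th", "td"])]
    if cells:
        rows.append(cells)

# Write the extracted rows to a CSV file
with open("report.csv", "w", newline="", encoding="utf-8") as out:
    csv.writer(out).writerows(rows)
```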
Confidential
Data Engineer
Responsibilities:
- Identified and documented the business rules implemented across the complete project; this information is used by the business team to write up the requirements.
- Validated the UI specs and corrected them to ensure they are in line with the existing system.
- Responsible for getting updates from the onshore team, conducting stand-up meetings, and providing the updates to the Scrum Master.
- Designed and developed the UI of the website using HTML, AJAX, CSS, and JavaScript.
- Implemented SQLAlchemy, a Python library that provides full access to SQL; see the sketch at the end of this role.
- Moved the mappings from the development environment to the test environment.
- Designed the ETL process using Informatica to load data from flat files and Excel files into the target Oracle data warehouse database.
- Interacted with the business community and database administrators to identify the business requirements and data realities.
- Analyzed and documented the details of various external reports obtained from external systems within the company and from third-party vendors.
- Responsible for tracking story updates in RTC, which helps prepare the burndown chart that gives clients a graphical representation of the work remaining versus time.
- Implemented and enhanced CRUD operations for the applications using the MVC (Model-View-Controller) architecture of the Django framework and Python, and conducted code reviews.
- Wrote Python modules to extract and load asset data from the MySQL source database.
- Analyzed the requirements and developed use cases, UML diagrams, class diagrams, and sequence and state machine diagrams.
Environment: Python, Django, JavaScript, RESTful web services, MySQL, PyUnit, Jenkins, RTC, Visio
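A minimal sketch of a SQLAlchemy-based extract module of the kind described in this role (SQLAlchemy 1.4+ style); the connection string, table, and column names are hypothetical.

```python
from sqlalchemy import create_engine, text

# Hypothetical MySQL connection string and schema, for illustration only
engine = create_engine("mysql+pymysql://user:password@localhost/assets")

def fetch_assets(min_value):
    """Extract asset rows at or above a given value from the source database."""
    query = text("SELECT id, name, value FROM asset WHERE value >= :min_value")
    with engine.connect() as conn:
        # Bind the parameter and return plain dicts for downstream loading
        return [dict(row._mapping) for row in conn.execute(query, {"min_value": min_value})]

if __name__ == "__main__":
    for asset in fetch_assets(1000):
        print(asset)
```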