
AWS Data Engineer Resume


Glendale, CA

SUMMARY

  • 7+ years of experience in software development, including design and development of enterprise and web-based applications.
  • Hands-on technical experience in Python, Java, Q++ (Mastercraft), DB2 SQL, and R programming, with primary exposure to the P&C Insurance domain.
  • Experience with Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS, Elastic Load Balancing, Amazon SQS, AWS Identity and Access Management (IAM), Amazon SNS, Amazon CloudWatch, Amazon EBS, Amazon CloudFront, VPC, DynamoDB, Lambda, and Redshift).
  • Worked on projects such as storage prediction, price prediction, speech recognition, text classification, and handwritten-digit recognition using R, Python, NLP, data analysis, data management, neural networks, and programming skills.
  • Experience in web/application development by using Python, Django, HTML, XML, CSS, JavaScript, jQuery, MySQL, PostgreSQL and SQLite.
  • Architected server-side applications using Django and Flask.
  • Working knowledge of Kubernetes to deploy, scale, load-balance, and manage Docker containers.
  • Good knowledge in Data Extraction, Transforming and Loading (ETL) using various tools such as SQL Server Integration Services (SSIS), Data Transformation Services (DTS).
  • Experience in Database Design and development with Business Intelligence using SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), OLAP Cubes, Star Schema and Snowflake Schema.
  • Ingested data into Azure services and processed the data in Azure Databricks.
  • Creating and enhancing CI/CD pipeline to ensure Business Analysts can build, test, and deploy quickly.
  • Hands-on design, development, testing, and implementation of web applications using Python, Django, HTML, XML, CSS, JavaScript, Bootstrap, jQuery, JSON, AngularJS, and Node.js.
  • Building Data Warehouse using Star and Snowflake schemas.
  • Extensive knowledge on Exploratory Data Analysis, Big Data Analytics using Spark, Predictive analysis using Linear and Logistic Regression models and good understanding in supervised and unsupervised algorithms.
  • Worked on different statistical techniques like Linear/Logistic Regression, Correlational Tests, ANOVA, Chi-Square Analysis, K-means Clustering.
  • Hands-on experience visualizing data using Power BI, Tableau, R (ggplot2), and Python (pandas, matplotlib, NumPy, SciPy).
  • Experience in Python software development using libraries such as Beautiful Soup, python-twitter, NumPy, pandas, urllib2, SciPy, NetworkX, unittest, matplotlib, and MySQLdb for database connectivity, and IDEs such as Spyder, Sublime Text, Emacs, and PyCharm.
  • Experienced in design, management and visualization of databases using Oracle and MySQL.
  • Deep understanding of front-end engineering principles and experience with front-end technologies and frameworks.
  • Proficient in SQL databases MSSQL Server, MySQL (RDBMS), Oracle DB, Postgres, DynamoDB and MongoDB.
  • Performed different testing methodologies such as unit testing, integration testing, web-application testing, and Selenium testing.
  • Good knowledge of writing Data Analysis Expressions (DAX) in the Tabular data model.
  • Hands on knowledge in designing Database schema by achieving normalization.
  • Proficient in all phases of Software Development Life Cycle (SDLC) including Requirements gathering, Analysis, Design, Reviews, Coding, Unit Testing, and Integration Testing.
  • Well versed with Scrum methodologies.
  • Analyzed the requirements and developed Use Cases, UML Diagrams, Class Diagrams, Sequence and State Machine Diagrams.
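As an illustration of the clustering techniques listed above, the following is a minimal, self-contained K-means sketch in pure Python; the data and function name are hypothetical examples, not from any project described here:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal K-means over a list of numeric tuples."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # pick k distinct starting points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, cluster in enumerate(clusters):
            if cluster:
                centroids[i] = tuple(sum(d) / len(cluster) for d in zip(*cluster))
    return centroids, clusters
```

In practice this work would use scikit-learn or Spark MLlib; the sketch only shows the assign/update loop that those libraries implement.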

TECHNICAL SKILLS

Languages: Python, Java, R, C, C++

Tools: PyCharm, Visual Studio, RStudio, Power BI, Tableau, SAS Studio, Gephi, Eclipse, PuTTY, Mainframes, Excel, Jupyter Notebook, Azure Databricks

Operating System: Windows, Unix, Linux

Databases: Oracle, MySQL, SQL, NoSQL (MongoDB), PostgreSQL

Methodologies: Waterfall, Agile

Cloud Services: Amazon Web Services (AWS)

PROFESSIONAL EXPERIENCE

Confidential, Glendale, CA

AWS Data Engineer

Responsibilities:

  • Developed Restful Micro Services using Django and deployed on AWS servers using EBS and EC2.
  • Worked with Flask and wrote Snowflake queries.
  • Used Jenkins for continuous integration and delivery platform over GIT.
  • Automated most daily tasks using Python scripting.
  • Involved in the CI/CD pipeline management for managing the weekly releases.
  • Built web application by using Python, Django, AWS, J2EE, PostgreSQL, MySQL, Oracle, and MongoDB.
  • Developed test-automation framework scripts using Python and Selenium WebDriver.
  • Made recommendations to the team on appropriate testing techniques and shared testing tasks.
  • Defined different Django API profiling techniques for faster rendering of information.
  • Later migrated applications from Django to Flask and from NoSQL (DynamoDB) to SQL (Snowflake).
  • Used Jenkins, AWS, Bitbucket environments.
  • Worked on automating, configuring, and deploying instances on AWS, Azure, and data-center environments; familiar with EC2, CloudWatch, CloudFormation, and managing security groups on AWS.
  • Hands-on design, development, testing, and implementation of web applications using Python, Django, HTML, XML, CSS, JavaScript, Bootstrap, jQuery, JSON, AngularJS, and Node.js.
  • Troubleshot production issues related to AWS cloud resources and application infrastructure.
  • Developed Shell scripts for cleansing and validating data files using utilities like AWK, sed, grep and other UNIX commands.
  • Developed Shell scripts implementing PL/SQL queries for data migration & batch processing.
  • Worked with JSON based REST Web services.
  • Built RESTful web services using a Python REST API framework.
  • Worked on improving existing machine-learning algorithms to extract data points accurately.
  • Responsible for setting up Python REST API framework using Flask.
  • Engaged in Design, Development, Deployment, Testing and Implementation.
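As a hedged sketch of the shell-based data-file cleansing and validation described above (done there with AWK, sed, and grep), a Python equivalent might look like the following; the column rules and sample data are hypothetical:

```python
import csv
import re

def cleanse(raw_text, expected_cols=3):
    """Drop blank and comment lines, trim whitespace around fields,
    and reject rows with a bad column count or a non-numeric ID.
    Returns (clean_rows, rejected_line_numbers)."""
    clean, rejected = [], []
    for lineno, line in enumerate(raw_text.splitlines(), start=1):
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        row = [field.strip() for field in next(csv.reader([line]))]
        if len(row) != expected_cols or not re.fullmatch(r"\d+", row[0]):
            rejected.append(lineno)  # malformed: wrong width or bad ID
            continue
        clean.append(row)
    return clean, rejected
```

Recording the line numbers of rejected rows mirrors what a grep-based validation pass would report.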

Environment: Python 3.6, Django Framework 1.3, Flask Framework, AngularJS, CSS, DynamoDB, MySQL, HTML5/CSS, Snowflake, Amazon Web Services (AWS), Azure, S3, EC2, PyCharm, Visual Studio Code, Linux, Shell Scripting.

Confidential - New York, NY

AWS Data Engineer

Responsibilities:

  • Developed Python/Django application for Google Analytics aggregation and reporting.
  • Used Django configuration to manage URLs and application parameters.
  • Worked on Python OpenStack APIs.
  • Used Python scripts to update and manipulate content in the database.
  • Hands-on experience with IAM, setting up user roles with corresponding user and group policies using JSON and adding project users to the AWS account with multi-factor authentication enabled and least-privilege permissions.
  • Utilized AWS CLI to automate backups of ephemeral data-stores to S3 buckets, EBS and create nightly AMIs for mission critical production servers as backups.
  • Experience with EC2, Cloud Watch, Elastic Load Balancing and managing securities on AWS.
  • Used AWS Lambda to run code without managing servers; executed queries from Python using the Python-MySQL connector and the MySQL database package.
  • Added support for Amazon AWS S3 and RDS to host files and the database into Amazon Cloud.
  • Designed high availability environment for Application servers and database servers on EC2 by using ELB and Auto-scaling.
  • Performed data mapping between source and target systems, did logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
  • Worked on data pre-processing and cleaning the data to perform feature engineering and performed data imputation techniques for the missing values in the dataset using Python.
  • Designed front end and backend of the application utilizing Python on Django Web Framework.
  • Configured and networked Virtual Private Cloud (VPC) and CloudFront.
  • Extensive experience focusing on services like IAM and S3.
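To illustrate the least-privilege IAM policy work mentioned above, here is a sketch that builds a read-only S3 policy document in Python; the bucket and prefix names are hypothetical, and real policies would be attached via IAM rather than generated like this:

```python
import json

def least_privilege_s3_policy(bucket, prefix):
    """Build an IAM policy document granting read-only access
    to a single S3 prefix (the least-privilege pattern)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Allow reading objects only under the given prefix.
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/{prefix}/*"],
            },
            {
                # Allow listing the bucket, restricted to that prefix.
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket}"],
                "Condition": {"StringLike": {"s3:prefix": [f"{prefix}/*"]}},
            },
        ],
    }

# Serialize for attachment with the AWS CLI or boto3.
policy_json = json.dumps(least_privilege_s3_policy("reports-bucket", "nightly"), indent=2)
```

Scoping `ListBucket` with an `s3:prefix` condition is what keeps the grant narrow; an unconditioned `ListBucket` would expose every key name in the bucket.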

Environment: Python 2.6/2.7, JavaScript, Django Framework 1.3, CSS, SQL, MySQL, jQuery, Apache web server, GitHub, PostgreSQL, Amazon Web Services (AWS), Azure, S3, EC2, EBS, PyCharm, Visual Studio Code, Linux, Shell Scripting.

Confidential

Data Engineer

Responsibilities:

  • Responsible for gathering requirements, system analysis, design, development, testing, and deployment. Developed tools for collecting and extracting data maintained by hospitals, healthcare providers, and federal and state agencies. Collected and analyzed information such as medical records, insurance claims, and billing information, then identified patterns or trends in the data and recommended ways to use that information to improve industry efficiency and performance.
  • Developed and tested many features for a dashboard using React, Next.js, Flask, CSS, and JavaScript.
  • Involved in analysis, specification, design, and implementation and testing phases of Software Development Life Cycle (SDLC) and used Agile methodology for developing applications.
  • Upgraded existing UI with HTML, CSS, jQuery and Bootstrap.
  • Worked as an application developer with controllers, views, and models in Django.
  • Used SaltStack to configure and manage the infrastructure.
  • Built RESTful web services using a Python REST API framework.
  • Used AWS CloudWatch to monitor and store logging information.
  • Implemented the application using Python Spring IOC (Inversion of Control), Django Framework and handled the security using Python Spring Security.
  • Participated in the complete SDLC process and created business Logic using Python/Django.
  • Designed and Developed Restful web-services for both consumer and producer using Django.
  • Used Django framework for application development.
  • Created database using MySQL, wrote several queries and Django API's to extract data from database.
  • Worked with JSON based REST Web services.
  • Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
  • Developed microservices using AWS EMR, Lambda, API Gateway, DynamoDB, and RDS according to the scenario.
  • Ingested data into Azure services and processed the data in Azure Databricks.
  • Responsible for setting up Python REST API framework using Flask.
  • Wrote scripts in Python for extracting data from HTML files.
  • Effectively communicated with the external vendors to resolve queries.
  • Used Git for the version control.
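A small standard-library sketch of the HTML extraction scripts mentioned above; the class name and sample markup are illustrative only:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, link text) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # start collecting text inside this anchor
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

For messier real-world HTML, Beautiful Soup (listed in the summary) would replace this hand-rolled parser.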

Environment: Python, Django, HTML5, CSS, Bootstrap, JSON, JavaScript, RESTful web services, MySQL, SQLite, Cassandra, AWS (EC2, S3), PyUnit, Jenkins.

Confidential

Data Engineer

Responsibilities:

  • Identified the business rules that are implemented in the complete project and documented them; the information is used by the business team to write up the requirements.
  • Validated the UI specs and corrected them to make sure that they are in line with the existing system.
  • Responsible for getting updates from the onshore team, conducting stand up meetings and providing the updates to the Scrum Master.
  • Designed and developed the UI of the website using HTML, AJAX, CSS and JavaScript.
  • Implemented SQLAlchemy, a Python library for complete access over SQL.
  • Moved the mappings from development environment to test environment.
  • Designed ETL Process using Informatica to load data from Flat Files, and Excel Files to target Oracle Data Warehouse database.
  • Interacted with the business community and database administrators to identify the business requirements and data realities.
  • Analyzing and documenting the details on various external reports which are obtained from external systems within the company and third-party vendors.
  • Responsible for tracking the story updates in RTC which helps to prepare the burndown chart which provides the graphical representation to clients on the work left to do versus time.
  • Implemented and enhanced CRUD operations for the applications using the MVC (Model View Controller) architecture of Django framework and Python conducting code reviews.
  • Wrote Python modules to extract/load asset data from the MySQL source database.
  • Analyzed the requirements and developed Use Cases, UML Diagrams, Class Diagrams, Sequence and State Machine Diagrams.
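The extract/load asset-data modules mentioned above might look like the following sketch; sqlite3 stands in for the MySQL source database so the example is self-contained, and the table schema and sample rows are hypothetical:

```python
import sqlite3

def load_assets(conn, assets):
    """Create the asset table if needed and bulk-load (id, name, owner) rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS assets ("
        "id INTEGER PRIMARY KEY, name TEXT, owner TEXT)"
    )
    # Parameterized executemany avoids SQL injection and per-row round trips.
    conn.executemany(
        "INSERT INTO assets (id, name, owner) VALUES (?, ?, ?)", assets
    )
    conn.commit()

def extract_assets(conn, owner):
    """Extract all assets belonging to one owner, ordered by id."""
    cur = conn.execute(
        "SELECT id, name FROM assets WHERE owner = ? ORDER BY id", (owner,)
    )
    return cur.fetchall()
```

Against MySQL, only the connection object changes (e.g. MySQLdb, with `%s` placeholders); the extract/load structure stays the same.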

Environment: Python, Django, JavaScript, RESTful web services, MySQL, PyUnit, Jenkins, RTC, Visio
