
Python/AWS Developer Resume


Irving, TX

SUMMARY:

  • Around 4 years of experience in software development in Python (libraries used: Beautiful Soup, NumPy, SciPy, Matplotlib, Markdown, JsonLogic, ReportLab, Pandas, network, urllib2, MySQLdb for database connectivity) and IDEs such as Sublime Text, Spyder, and PyCharm, along with experience in the analysis, design, and development of various stand-alone, client-server, and web-based software applications using Python 3.7, Django, AngularJS, Node.js, and Express.
  • Worked with core AWS services (S3, EC2, ELB, EBS, Route 53, VPC, Auto Scaling, etc.), deployment services (Elastic Beanstalk, OpsWorks, and CloudFormation), and security practices (IAM, CloudWatch, and CloudTrail).
  • Hands-on experience developing web applications implementing MVT/MVC architecture using the Django, Flask, Webapp2, and Spring web application frameworks.
  • Experienced in MVC frameworks such as Django, AngularJS, JavaScript, jQuery, and Node.js, and in using IAM to create roles, users, and groups that provide additional security for an AWS account and its resources.

TECHNICAL SKILLS:

Programming Languages: Python 3.7/2.7, J2EE, C, C++, JavaScript

Databases: SQLite3, MSSQL, MongoDB, Oracle 11g.

Database Tools: PL/SQL Developer, Toad, SQL Loader, Erwin.

Web Programming: HTML, CSS, DHTML, XML, Java Script.

Deployment tools: MS Azure, Heroku, Amazon Web Services (EC2, S3, EBS, ELB, SES).

Frameworks: Bootstrap, Django, Node.js, Flask, AngularJS

Operating systems: Windows, Mac, Fedora Linux, Red Hat Linux, EMC, Solaris

Technologies/Tools/IDEs: PyCharm, Eclipse, NetBeans, MS Visual Studio, RIDE, iPaaS

PROFESSIONAL EXPERIENCE:

Python/AWS Developer

Confidential - Irving, TX

  • Developed security policies and processes. Developed views and templates with Python and Django's view controller and templating language to create a user-friendly Website interface.
  • Designed and Developed SQL database structure with Django Framework using agile methodology. Developed project using Django, Oracle SQL, Angular, JavaScript, HTML5, CSS3 and Bootstrap.
  • Involved in the complete Software Development Life Cycle including gathering Requirements, Analysis, Design, Implementation, Testing and Maintenance.
  • Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS) using CloudFormation JSON templates.
  • Implemented user interface guidelines and standards throughout the development and maintenance of the website using HTML, CSS, JavaScript, jQuery, and AngularJS.
  • Performed job functions using Spark APIs in Scala for real-time analysis and fast querying; experienced with Agile methodology and the delivery tool VersionOne.
  • Used Python programming and Django for the backend development, Bootstrap and Angular for frontend connectivity and MongoDB for database.
  • Implemented a CI/CD pipeline with Docker, Jenkins, and GitHub, virtualizing the servers with Docker for the Dev and Test environments and meeting automation needs through containerization.
  • Experienced with the AWS cloud platform and its features, including EC2, S3, Route 53, VPC, EBS, AMI, SNS, RDS, and CloudWatch.
  • Used the AWS CLI to suspend an AWS Lambda function and to automate backups of ephemeral data stores to S3 buckets and EBS.
  • Gathered semi-structured data from S3 and relational structured data from RDS, kept the data sets in a centralized metadata catalog using AWS Glue, then extracted the datasets and loaded them into Kinesis streams.
  • Worked as part of an Agile/Scrum based development team and exposed to TDD approach in developing applications.
  • Worked on designing and deploying a multitude of applications utilizing almost all the main services of the AWS stack (EC2, S3, RDS, VPC, IAM, ELB, EMR, CloudWatch, Route 53, Lambda, and CloudFormation), focused on a high-availability, fault-tolerant environment.
  • Introduced new features and solved existing bugs by developing code for a cloud-based integration platform (iPaaS) and Migrated customer data from legacy iPaaS to AWS.
  • Deployed and tested different modules in Docker containers and GIT. Implemented programming automations using Jenkins and Ansible on Unix/Linux based OS over cloud like Docker.
  • Built serverless pipelines with AWS Kinesis Streams, AWS Step Functions, Kinesis Data Analytics streaming SQL, Google TensorFlow, and AWS EKS.
  • Worked as a developer and support engineer where a CA API Gateway exposes Home Depot's API (REST/SOAP) services to outside vendors.
  • Worked with different components of iPaaS solution Azure provides, Service Bus, Functions and Logic Apps to use connectors and create workflows.
  • Installed and configured MongoDB, and set up backup, recovery, upgrades, tuning, and data integrity checks. Responsible for managing the MongoDB environment from high-availability, performance, and scalability perspectives. Extensive experience deploying, managing, and developing MongoDB clusters.
  • Extensive experience automating the build and deployment of scalable projects through GitLab CI/CD, Jenkins, etc.; worked on Docker and Ansible. Used JavaScript for data validations and designed validation modules.
  • Created methods (GET, POST, PUT, DELETE) to make requests to the API server and tested the RESTful API using Postman. Also loaded CloudWatch Logs to S3 and then into Kinesis Streams for data processing.
  • Created Terraform scripts for EC2 instances, Elastic Load balancers and S3 buckets. Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.
  • Implemented integration test cases and developed predictive analytics using Apache Spark Scala APIs. Used REST and SoapUI for testing web services for server-side changes.
  • Wrote ANSIBLE Playbooks with Python, SSH as the Wrapper to Manage Configurations of AWS Nodes and Test Playbooks on AWS instances using Python. Run Ansible Scripts to provision Dev servers.
  • Wrote Python scripts to parse XML documents and load the data in database. Developed and designed an API (RESTful Web Services). Responsible for user validations on client side as well as server side.
  • Development of Python APIs to dump the array structures in the Processor at the failure point for debugging. Handling Web applications - UI security, logging, backend services.
  • Written functional API test cases for testing REST API’s with Postman and Integrated with Jenkins server to build scripts.
  • Representation of the system in hierarchy form by defining the components, subcomponents using Python and developed set of library functions over the system based on the user needs.
  • Used advanced features like pickle/unpickle in Python to share information across applications. Generated dynamic PDF documents using the ReportLab Python library.
  • Used Python and Django creating graphics, XML processing, data exchange and business logic implementation with Spiff workflow development.
  • Handled operations and maintenance support for AWS cloud resources which includes launching, maintaining and troubleshooting EC2 instances, S3 buckets, Virtual Private Clouds (VPC), Elastic Load Balancers (ELB) and Relational Database Services (RDS).
  • Used a test-driven development (TDD) approach for developing the services required by the application.
  • Designed and developed new reports and maintained existing reports using Microsoft SQL Reporting Services (SSRS) and Microsoft Excel to support the firm's strategy and management.
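One of the bullets above describes parsing XML documents and loading the data into a database; a minimal, self-contained sketch using the standard library's `xml.etree` and `sqlite3` follows (the `users` schema and the sample document are hypothetical, standing in for the real data):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical sample document; the real scripts read XML files from disk.
XML_DOC = """
<users>
  <user id="1"><name>Alice</name><email>alice@example.com</email></user>
  <user id="2"><name>Bob</name><email>bob@example.com</email></user>
</users>
"""

def load_users(xml_text, conn):
    """Parse the XML document and load each <user> element as a row."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
    )
    root = ET.fromstring(xml_text)
    rows = [
        (int(u.get("id")), u.findtext("name"), u.findtext("email"))
        for u in root.iter("user")
    ]
    conn.executemany("INSERT OR REPLACE INTO users VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
print(load_users(XML_DOC, conn))  # number of rows loaded
```

In production the same pattern would point at Oracle or MySQL via the appropriate driver rather than an in-memory SQLite database.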

Python Developer

Confidential -San Francisco, CA

  • Primarily used the open-source tools Spyder (Python) and RStudio (R) for statistical analysis and building machine learning models. Involved in defining source-to-target data mappings, business rules, and data definitions.
  • Made extensive use of Python's Matplotlib package and Tableau to visualize and graphically analyze data. Performed data pre-processing and split the identified data set into training and test sets using other Python libraries.
  • Used existing UNIX shell scripts and modified them as needed to process SAS jobs, search strings, execute permissions over directories etc.
  • Implemented Spark MLlib utilities such as classification, regression, clustering, collaborative filtering, and dimensionality reduction.
  • Developed data transition programs from DynamoDB to AWS Redshift (ETL Process) using AWS Lambda by creating functions in Python for the certain events based on use cases.
  • Worked on AWS SQS to consume the data from S3 buckets. Imported the data from different sources like AWS S3, Local file system into Spark RDD.
  • Deployed Airflow (Celery executor) on EC2 instances mounted to EFS as a central directory, with SQS as the broker, metadata stored in RDS, and logs in S3 buckets.
  • Integrated new tools and developed technology frameworks/prototypes to accelerate the data integration process and empower the deployment of predictive analytics by developing Spark modules with R.
  • Designed Data Marts by following Star Schema and Snowflake Schema Methodology, using industry leading Data Modeling tools like Erwin.
  • Synchronized data with the server using Sass, JavaScript, Bootstrap, and AngularJS. Proficient in AWS services such as VPC, EC2, S3, ELB, EMR, Auto Scaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, and CloudFront.
  • Designed and developed ETL jobs to extract data from a Salesforce replica and load it into a data mart in Amazon Redshift; managed Amazon Redshift clusters, launching them by specifying nodes and performing data analysis queries.
  • Streamed data in real time using Spark with SQS. Responsible for handling streaming data from web server console logs.
  • Wrote Bash and Python scripts integrating Boto3 to supplement automation provided by Ansible and Terraform, for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
  • Implemented a full CI/CD pipeline by integrating SCM (Git) with the automated testing tool Gradle; deployed using Jenkins and Dockerized containers in production. Engaged with DevOps tools such as Ansible, Chef, AWS CloudFormation, AWS CodePipeline, Terraform, and Kubernetes.
  • Wrote Terraform modules for automating the creation of VPCs and AWS EC2 instances, modules for creating VPCs and VPN connections from the data center to the production environment, and cross-account VPC peering.
  • Created EC2 instances and implemented large multi-node Hadoop clusters in the AWS cloud from scratch using automated scripts such as Terraform.
  • Design, coding, unit testing of ETL package source marts and subject marts using Informatica ETL processes for Oracle database.
  • Performed Source System Analysis, database design, data Modeling for the warehouse layer using MLDM concepts and package layer using Dimensional Modeling.
  • Involved in importing real-time data to Hadoop using Kafka and implemented Oozie jobs. Responsible for writing unit tests and deploying production-level code with the help of Git version control.
  • Created data presentations designed to reduce bias and tell the true story of the people behind the data, pulling millions of rows using SQL and performing exploratory data analysis.
  • Designed and developed Oracle PL/SQL and shell scripts for data import/export, data conversion, and data cleansing; worked on Spring framework applications and integrated microservices into the existing system architecture.
  • Involved in different phases of the development life cycle, including analysis, design, coding, unit testing, integration testing, review, and release, as per the business requirements.
  • Used Software development best practices for Object Oriented Design and methodologies throughout Object oriented development cycle.
  • Collaborated with data engineers to implement ETL processes; wrote and optimized SQL queries to extract data from the cloud and merge it from Oracle 12c.
  • Data Modelling, Design Application Architecture and Design, create Project Development plan and Unit test plan, requirement gathering and analysis.
  • Developed various QlikView Data Models by extracting and using the data from various sources files, DB2, Excel, Flat Files and Big data.
  • Participated in all phases of data mining; data collection, data cleaning, developing models, validation, visualization and performed Gap analysis.
  • Analyzed large data sets, applied machine learning techniques, and developed predictive and statistical models, enhancing them by leveraging best-in-class modeling techniques.
  • Performed Data management like Merging, concatenating, interleaving of SAS datasets using MERGE, UNION and SET statements in DATA step and PROC SQL.
  • Used Pandas, NumPy, seaborn, SciPy, Matplotlib, Scikit-learn, NLTK in Python for developing various machine learning algorithms and utilized machine learning algorithms such as linear regression and multivariate regression.
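The train/test-split and linear-regression workflow described in the bullets above can be sketched in pure Python (synthetic data stands in for the real datasets; in practice Scikit-learn would do the fitting):

```python
import random

random.seed(42)

# Hypothetical synthetic data: y = 3x + 5 plus a little uniform noise.
data = [(x, 3 * x + 5 + random.uniform(-0.5, 0.5)) for x in range(100)]

# Split the identified data set into a training set and a test set.
random.shuffle(data)
train, test = data[:80], data[80:]

def fit_ols(pairs):
    """Ordinary least squares for simple (one-variable) linear regression."""
    n = len(pairs)
    mean_x = sum(x for x, _ in pairs) / n
    mean_y = sum(y for _, y in pairs) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in pairs)
    var = sum((x - mean_x) ** 2 for x, _ in pairs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = fit_ols(train)
# Evaluate on the held-out test set with mean absolute error.
mae = sum(abs((slope * x + intercept) - y) for x, y in test) / len(test)
print(round(slope, 1), round(intercept, 1))  # recovers roughly 3 and 5
```

With Scikit-learn the same fit would be `LinearRegression().fit(X_train, y_train)`, and multivariate regression simply adds columns to `X`.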

Software Associate

Confidential

  • Translated the customer requirements into design specifications and ensured that the requirements translate into software solution.
  • Developed and designed an API (RESTful Web Service). Used the Python language to develop web-based data retrieval systems.
  • Followed Agile (Scrum) practices, planning sprints and attending daily Scrum meetings and sprint retrospective meetings to produce quality deliverables on time.
  • Worked on RESTful web services that enforced a stateless client-server model and supported JSON; migrated a few services from SOAP to RESTful technology. Involved in detailed analysis based on the requirement documents.
  • Designed and maintained databases using Python and developed Python based API (RESTful Web Service) using Flask, SQL-Alchemy and PostgreSQL.
  • Created complex dynamic HTML UIs using jQuery. Automated regression analysis for determining fund returns based on index returns (Python/Excel). Worked on development of SQL and stored procedures, triggers, and functions on MySQL.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Worked on JIRA tools for issue tracking, reporting versions, epics, sprints, etc. Used Test driven approach for developing the application and Implemented the unit tests using Python Unit test framework.
  • Used Python-based GUI components for front-end functionality such as selection criteria. Connected the continuous integration system with the Git version control repository to continually build as check-ins come from developers.
  • Involved in writing SQL queries implementing functions, triggers, cursors, object types, sequences, indexes, etc. Developed and tested many dashboard features using Python, the Robot Framework, Bootstrap, CSS, and JavaScript.
  • Performed advanced procedures like text analytics and processing using the in-memory computing capabilities of Spark.
  • Developed MapReduce/Spark Python modules for machine learning & predictive analytics in Hadoop on AWS. Implemented a Python-based distributed random forest via Python streaming.
  • Responsible for delivering datasets from Snowflake to the One Lake data warehouse; built a CI/CD pipeline using Jenkins and AWS Lambda, and imported data from DynamoDB to Redshift in batches using AWS Batch with the TWS scheduler.
  • Created and managed all hosted and local repositories through SourceTree's simple Git client interface; collaborated using Git command lines and Stash.
  • Developed and reviewed SQL queries using join clauses (inner, left, right) in Tableau Desktop to validate static and dynamic data.
  • Designed and developed components using Python with Django framework. Implemented code in python to retrieve and manipulate data.
  • Involved in developing an enterprise social network application using Python, Twisted, and Cassandra; responsible for setting up a Python REST API framework using Django.
  • Implemented data visualizations such as stacked bar charts, pie charts, density chart, and geographic visualizations with Tableau.
  • Actively participated in Object-Oriented Analysis and Design sessions of the project, which is based on MVC architecture using the Spring Framework.
  • Maintained and developed complex SQL queries, stored procedures, views, functions and reports that meet customer requirements using Microsoft SQL Server 2008 R2.
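Several bullets above describe building RESTful APIs (GET/POST methods, stateless client-server, JSON). A minimal sketch of that pattern using only the standard library's WSGI utilities in place of Flask/SQLAlchemy follows; the `BOOKS` resource and its fields are hypothetical:

```python
import json

# In-memory stand-in for a database table; the real work used
# Flask, SQL-Alchemy, and PostgreSQL behind the same HTTP verbs.
BOOKS = {1: {"id": 1, "title": "Python 101"}}

def app(environ, start_response):
    """Minimal RESTful WSGI handler: GET lists the collection, POST adds an item."""
    method = environ["REQUEST_METHOD"]
    if method == "GET":
        body = json.dumps(list(BOOKS.values())).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    if method == "POST":
        size = int(environ.get("CONTENT_LENGTH") or 0)
        item = json.loads(environ["wsgi.input"].read(size))
        new_id = max(BOOKS, default=0) + 1
        BOOKS[new_id] = {"id": new_id, **item}
        start_response("201 Created", [("Content-Type", "application/json")])
        return [json.dumps(BOOKS[new_id]).encode()]
    start_response("405 Method Not Allowed", [])
    return [b""]
```

In Flask the same endpoints would be two `@app.route` functions; the stateless structure (each request carries everything the handler needs) is what the bullets above refer to.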
