Sr Python Developer Resume
Dallas, TX
SUMMARY
- 7+ years of experience as a Python Developer in the IT industry, with proficiency in the design and development of enterprise applications using Python, PySpark, Django, Flask, SQL, and Java/J2EE.
- Experience in Data Analysis, Data Mining with large data sets of structured and unstructured data, Data Acquisition, Data Validation, Predictive Modeling, Data Visualization, and Web Scraping. Adept in statistical programming with Python, including Big Data technologies.
- Strong knowledge and hands-on experience with Hadoop architecture and its components such as HDFS, YARN, Spark, MapReduce, Hive, HBase, Sqoop, Flume, and Kafka.
- Good experience using Scala and Python for Spark programming.
- Strong experience working with Spark framework to perform large scale data transformations.
- Good exposure to Spark Core, the Spark DataFrame API, Spark Streaming, Spark SQL, and Spark ML.
- Experience with cloud services such as Google Cloud Platform, Microsoft Azure, and Amazon Web Services (AWS).
- Good Experience in developing Spark streaming application and utilizing Kafka.
- Experience in using Python, Django, PHP, C++, CSS, HTML, XHTML, JavaScript, AngularJS, React, and JSON for developing web applications.
- Strong experience working with relational databases like Oracle, SQLite, PostgreSQL, MySQL, DB2 and non-relational databases like MongoDB and Cassandra.
- Virtualized servers using Docker for test and development environment needs, and automated configuration using Docker containers.
- Strong experience with essential DevOps tools such as Chef, Puppet, Ansible, Docker, Kubernetes, Subversion (SVN), Git, Hudson, Jenkins, Ant, and Maven; migrated VMware VMs to AWS and managed services such as EC2, S3, Route 53, ELB, and EBS.
- Knowledge of setting up a Python REST API framework using Django.
- Experience in using various version control systems such as CVS, Git, and GitHub, as well as AWS services like EC2 and S3.
- Experience working with several Python packages such as Pandas, NumPy, Beautiful Soup, SQLAlchemy, and PyTables.
- Experience in handling database issues and connections with SQL and NoSQL databases such as MongoDB (2.6, 2.4) by installing and configuring various Python packages (Teradata, MySQLdb, MySQL Connector, PyMongo, and SQLAlchemy).
- Hands-on experience in processing large datasets with Spark using PySpark.
- Experience maintaining CI environments with build automation tools such as Jenkins; used Jenkins extensively to streamline the CI/CD process and automated master-slave configurations using temporary slaves.
- Proficient in developing complex SQL Queries, Stored Procedures, Triggers, Cursors, Functions, and Packages along with performing operations on the database.
- Experienced in using agile methodologies including pair programming, SCRUM, and Test-Driven Development (TDD).
- Flexible, enthusiastic, and project-oriented team player with excellent communication skills and leadership abilities to develop creative solutions for challenging client requirements.
TECHNICAL SKILLS
Languages: Python 2.7/3.5, Java/J2EE, SQL, PL/SQL, SAS
Web Technologies: HTML5, JSP, XHTML, CSS3, Bootstrap, XML, JSON, jQuery, Ajax, Web Services, REST APIs
Database Systems: MS SQL Server, MySQL, MongoDB, Cassandra, PostgreSQL, Oracle
Python Libraries: Pandas, NumPy, Shell, unittest, JSON, CSV, XLS, Perl, Bash
Frameworks: Django, Celery, Pyramid, MongoDB, AngularJS, Django Rest Framework, Flask, Hibernate, Spring MVC
RDBMS: Oracle 9i/10g, Sybase 12.5, DB2, MySQL
Development IDEs: Eclipse, RAD, PyCharm, NetBeans
Version Control: Subversion (SVN), ClearCase, CVS, Git
Operating Systems: Windows, UNIX, Linux, Mac OS X
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
Sr Python Developer
Responsibilities:
- Worked in the AWS environment, instrumental in utilizing Compute services (EC2, ELB), Storage services (S3, Glacier, Block Storage, lifecycle management policies), CloudFormation (JSON templates), Elastic Beanstalk, Lambda, VPC, RDS, Trusted Advisor, and CloudWatch.
- Implemented SQL scripts and queries in Python code to handle user requests and work with the data in the database.
- Developed web applications using the Django framework's model-view-controller (MVC) architecture.
- Developed the backend of an application using Python and Django.
- Created and tested Python scripts to automate AWS S3 data uploads/downloads and control instance operations with the AWS API (a representative sketch appears after this list).
- Created and scheduled jobs in Autosys.
- Automated application and MySQL container deployment in Docker using Python.
- Built database models, views, and APIs using Python for interactive web-based solutions.
- Used django-celery to create asynchronous tasks with RabbitMQ as the message queue (see the sketch after this list).
- Utilized unittest, the Python unit testing framework, for all Python applications.
- Worked on Lambda functions that return data from incoming events and store the results in Amazon DynamoDB.
- Expert-level understanding of Actimize SAM and CTR solutions (rules, back-end data model/flow).
- Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
- Used JIRA for bug tracking and issue tracking and used Agile Methodology and SCRUM Process.
- Managed the configurations of multiple servers using Ansible.
- Worked on Swagger specs to define APIs using a JSON or YAML schema that outlines the names, order, and other details of the API.
- Created a database using MySQL and wrote several queries to extract data from the database.
- Deployed microservices, including provisioning AWS environments using Ansible playbooks.
- Designed and developed the user interface using front-end technologies such as HTML, CSS, JavaScript, jQuery, AngularJS, Bootstrap, and JSON.
- Developed and tested many features in an agile environment using Python, Django, HTML5, CSS, JavaScript, and Bootstrap.
- Expert-level understanding of Actimize ERCM (UI features, back-end data model/flow).
- Responsible for automating SOA testing.
- Monitored SOA infrastructure and implemented further enhancements.
- Developed Job dashboard monitor UI using Django/Flask.
- Developed REST APIs and integrated them with cloud products such as AWS API Gateway and AWS Lambda.
- Developed remote integrations with third-party platforms using RESTful web services.
- Implemented RESTful web services for data transport between multiple systems.
- Responsible for onboarding application teams to build and deploy code using GitHub and Jenkins.
- Responsible for fully understanding the source systems, documenting issues, following up on issues, and seeking resolutions.
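A minimal sketch of the S3 upload/download and EC2 control automation described above, assuming boto3; the bucket name, key, paths, and instance ID are hypothetical placeholders, not project values:

```python
# Illustrative sketch only -- bucket, key, and instance IDs are placeholders.
import boto3

s3 = boto3.client("s3")
ec2 = boto3.client("ec2")

def upload_report(local_path, bucket="example-reports-bucket", key="daily/report.csv"):
    """Upload a local file to S3."""
    s3.upload_file(local_path, bucket, key)

def download_report(bucket="example-reports-bucket", key="daily/report.csv",
                    local_path="/tmp/report.csv"):
    """Download an object from S3 to a local path."""
    s3.download_file(bucket, key, local_path)

def stop_idle_instances(instance_ids):
    """Stop the given EC2 instances, e.g. after a test run."""
    ec2.stop_instances(InstanceIds=instance_ids)

if __name__ == "__main__":
    upload_report("/tmp/report.csv")
    stop_idle_instances(["i-0123456789abcdef0"])
```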
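A minimal sketch of the django-celery/RabbitMQ pattern mentioned above, assuming a local RabbitMQ broker and a hypothetical notification task; this illustrates the approach rather than reproducing the original project code:

```python
# Illustrative sketch only -- broker URL and task body are assumptions.
from celery import Celery

app = Celery("tasks", broker="amqp://guest:guest@localhost:5672//")

@app.task
def send_notification(user_id, message):
    """Runs asynchronously on a Celery worker, not in the web request process."""
    # In the real application this would call an email/SMS gateway.
    print("notify user %s: %s" % (user_id, message))

# From Django view code, the task is queued without blocking the request:
# send_notification.delay(42, "Your report is ready")
```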
Environment: Python 3.7, MySQL, JavaScript, AWS, AngularJS, Bootstrap, Swagger, Git, GitHub, PySpark, Linux, Shell Scripting, Azure, PostgreSQL, JIRA.
Confidential, Austin, TX
Sr. Python Developer
Responsibilities:
- Wrote Spark applications to perform data cleansing, transformations, and aggregations, producing datasets per downstream team requirements (see the sketch after this list).
- Involved in data cleansing, event enrichment, data aggregation, de-normalization, and data preparation needed for downstream model training and reporting.
- Wrote Python scripts to parse XML documents and load the data into a database.
- Developed Python scripts to migrate data from Oracle to MongoDB.
- Developed SQL queries, stored procedures, and triggers using Oracle.
- Good understanding of case management, alert generation, the Actimize tool, and its workflow.
- Implemented SQL scripts and queries in Python code to work with various databases and data sources.
- Developed multiple database plugins for Cassandra, MySQL, Oracle, and MS SQL.
- Implemented partitioning (both dynamic and static partitions) and bucketing in Hive.
- Designed and developed transactions and persistence layers to save/retrieve/modify data for application functionalities using Django.
- Wrote Python scripts to perform data transformation and migration from various data sources and built different databases to store raw and filtered data.
- Developed remote integrations with third-party platforms using RESTful web services, and implemented Apache Spark and Spark Streaming applications for large-scale data.
- Expert-level understanding of Actimize SAM and CTR solutions (rules, back-end data model/flow).
- Responsible for onboarding application teams to build and deploy code using GitHub, Jenkins, Nexus, and Ansible.
- Worked on DNS tables to map websites to their IP addresses and mapped them to AWS Route 53.
- Used Amazon Route 53 to manage DNS zones globally and to assign public DNS names to ELBs, with CloudFront for content delivery.
- Experience creating Docker containers leveraging existing Linux containers and AMIs, in addition to creating Docker containers from scratch.
- Deployed and maintained the production environment using AWS EC2 instances and ECS with Docker.
- Designed and implemented a distributed QA infrastructure for automated testing of a multi-process software product.
- Used Django database APIs to access database objects.
- Expert-level understanding of Actimize ERCM (UI features, back-end data model/flow).
- Implemented customer data collection with PySpark/Hadoop analytics.
- Designed and developed the UI of the website using HTML, AJAX, CSS and JavaScript.
- Used JavaScript and Bootstrap for page functionality, popup screens, and sale and discount tags for products.
- Used S3 of Amazon Web Services (AWS) for improved efficiency of storage and fast access.
- Developed an internal project in Flask to generate reports from Google Analytics on a daily, weekly, and monthly basis.
- Designed and built a reporting module that uses Apache Spark SQL to fetch and generate reports.
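A minimal sketch of the kind of Spark cleansing and aggregation job described in the first bullet of this list; the column names and S3 paths are hypothetical, used only to illustrate the pattern:

```python
# Illustrative sketch only -- column names and paths are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cleansing-example").getOrCreate()

# Read raw events, drop malformed rows, and normalize a few fields.
raw = spark.read.json("s3://example-bucket/raw/events/")
clean = (raw
         .dropna(subset=["customer_id", "event_ts"])
         .withColumn("event_date", F.to_date("event_ts"))
         .withColumn("amount", F.col("amount").cast("double")))

# Aggregate per customer per day for the downstream reporting team.
daily = (clean.groupBy("customer_id", "event_date")
              .agg(F.count("*").alias("event_count"),
                   F.sum("amount").alias("total_amount")))

daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_customer/")
```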
Environment: Python 3/2.7, PySpark, Django 1.8, Flask, Oracle, MySQL, PostgreSQL, Microservices, HTML5, CSS3, jQuery, JavaScript, AngularJS, AJAX, Bootstrap, GitHub, Linux, Shell Scripting, Apache Spark, Ansible, AWS, MongoDB, Maven, Jenkins
Confidential, Boston, MA
Python Developer
Responsibilities:
- Responsible for gathering requirements, system analysis, design, development, testing and deployment.
- Worked on data extraction, aggregation, and analysis in HDFS using PySpark and stored the required data in Hive.
- Worked extensively on enrichment/ETL in real-time streaming jobs using Spark Streaming and Spark SQL, loading the results into HBase.
- Developed and tested Spark code using Scala for Spark Streaming/Spark SQL for faster processing of data.
- Developed entire frontend and backend modules using Python on Django Web Framework.
- Worked on designing, coding and developing the application in Python using Django MVC.
- Created a handler function in Python using AWS Lambda that is invoked when the service is executed.
- Used Django database APIs to access database objects.
- Worked on integrating Python with web development tools and web services.
- Good understanding of case management, alert generation, the Actimize tool, and its workflow.
- Wrote and executed various MySQL database queries from Python using the MySQL Connector and MySQLdb packages.
- Worked on the design and implementation of real-time streaming ingestion using Flume, Kafka, and Spark Streaming.
- Developed tools using Python, shell scripting, and XML to automate repetitive tasks; interfaced with supervisors, system administrators, and production to ensure production deadlines were met.
- Designed user interfaces using AngularJS, Node.js, jQuery, Bootstrap, JavaScript, CSS3, XML, and HTML5.
- Used jQuery and AJAX calls for transmitting JSON data objects between the frontend and controllers.
- Created a simple AWS Lambda function using Python for deployment management in AWS.
- Used Python-based GUI components for frontend functionality such as selection criteria.
- Supported Amazon AWS S3 and RDS to host static/media files and the database in the Amazon cloud.
- Involved in frontend development, utilizing Bootstrap and AngularJS for page design.
- Developed web-based applications using Python, Django, XML, CSS, HTML, DHTML, JavaScript, and jQuery.
- Used the Pandas API to organize data in time-series and tabular formats for easy timestamp-based manipulation and retrieval (see the sketch after this list).
- Created business logic in Python for planning and tracking functions.
- Developed a multi-threaded standalone app in Python and PHP to view performance.
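A minimal sketch of the Pandas time-series handling referenced above; the CSV layout, column names, and date range are assumptions for illustration:

```python
# Illustrative sketch only -- file name and columns are assumptions.
import pandas as pd

# Load raw readings and index them by timestamp for time-series operations.
df = pd.read_csv("readings.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp").sort_index()

# Slice by date range and resample to hourly averages for reporting.
january = df.loc["2016-01-01":"2016-01-31"]
hourly = january["value"].resample("H").mean()

print(hourly.head())
```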
Environment: Python 2.7, PySpark, MySQL, Django 1.8, Flask, HTML5, CSS, XML, MS SQL Server, Ansible, JavaScript, AWS, Linux, Shell Scripting, Bootstrap, AJAX, JSON, Jenkins, Unix, MongoDB, jQuery.
Confidential
Python Developer
Responsibilities:
- Hands-on experience with different scripting languages such as Python and shell scripts.
- Used Maven extensively for building JAR files of MapReduce programs and deployed them to the cluster.
- Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files.
- Developed Hive queries in Spark SQL for analyzing and processing data; used Scala to perform transformations and apply business logic.
- Developed a Python-based API (RESTful web service) using Django to track events and perform analysis (see the sketch after this list).
- Involved in development using Python, bug fixing and unit testing of the layout commands.
- Dealt with development of parsers for handling JSON and XML responses and JAXB binding, and worked with JMS (Java Message Service) for asynchronous communication.
- Created a Python/Django-based web application using Python scripting for data processing, MySQL for the database, and HTML/CSS/jQuery and Highcharts for data visualization of the served pages.
- Developed automated process for builds and deployments by using Jenkins, Ant, Maven, Shell Script.
- Used the Pandas API to organize data as time series in a tabular format for easy timestamp-based manipulation and retrieval.
- Published and consumed contract-based SOAP web services and developed corresponding test cases.
- Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
- Developed entire frontend and backend modules using Python on Django Web Framework.
- Used AWS for application deployment and configuration.
- Designed and developed the UI of the website using HTML, AJAX, CSS and JavaScript.
- Performed debugging and troubleshooting of web applications, using the Subversion version control tool to coordinate team development.
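A minimal sketch of a Django event-tracking endpoint along the lines described in this list; the Event model, its fields, and the URL route are assumptions for illustration, not the original code:

```python
# Illustrative sketch only -- the Event model and route are hypothetical.
# views.py
import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_http_methods

from .models import Event  # assumed model with `name` and `payload` fields

@csrf_exempt
@require_http_methods(["POST"])
def track_event(request):
    """Accept a JSON event payload and persist it for later analysis."""
    data = json.loads(request.body)
    event = Event.objects.create(name=data["name"],
                                 payload=json.dumps(data.get("payload", {})))
    return JsonResponse({"id": event.id, "status": "recorded"}, status=201)

# urls.py (assumed route)
# from django.conf.urls import url
# urlpatterns = [url(r"^api/events/$", track_event)]
```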
Environment: Python 2.7, Spark SQL, AWS, Django, HTML5/CSS, MS SQL Server 2013, MySQL, JavaScript, Eclipse, Linux, Shell Scripting, jQuery, AJAX, GitHub, Jira