Python Developer Resume

Charlotte, NC

SUMMARY:

  • Data Engineer with around 8 years of experience interpreting and analyzing complex data sets, with expertise in providing business insights.
  • Expertise in Python 3.5.1 and Unix shell scripting; created numerous scripts for project needs, reducing manual effort.
  • Strong expertise in MySQL 5.7, Oracle 10g, PostgreSQL 9.4 and Teradata 15.10.07.14.
  • Good expertise in creating and maintaining PL/SQL objects such as stored procedures, cursors, triggers and exception handlers.
  • Hands-on experience with processing stages in the Pentaho Data Integration (PDI) 6.1 tool and good exposure to database connectors.
  • Experience developing enterprise data warehouse applications in the Telecommunications, Financial and Human Resources sectors.
  • Experience in the Agile Software Development Lifecycle (SDLC): requirement gathering, analysis, design, development, maintenance, build, code management and testing of enterprise data warehouse applications and complex ETL processes.
  • Good experience in ETL, data warehousing and data integration concepts.
  • Good troubleshooting, debugging and performance tuning skills.
  • Quick learner and strong team player, willing to grow with the organization.
  • Good exposure to data analytics with Hadoop and big data architecture.
  • Sound experience in statistical tools such as R and MS Excel.

TECHNICAL SKILLS:

Data Integration Skills: Pentaho Kettle (PDI) 6.1

Scripting Language: Python 3.5.1 and Shell scripting

Scheduling Tools: Crontab

Databases: MySQL 5.7.16, Oracle 10g, PostgreSQL 9.4 and Teradata 15.10.07.14

OS: Red Hat Enterprise Linux 6.2, CentOS Linux 5.10

Tools: Eclipse for Python, MySQL Query Browser, MS Excel, XML, pgAdmin 4.1, Teradata SQL Assistant 15.10, Spyder 3.0, JIRA

Programming Languages: C, C++ and Python 2.7.10

Monitoring Tool: Zabbix 2.4.5

Big Data Tool: Hortonworks Hadoop

Code Repository: SVN, Git

Statistical Tool: R (RStudio 1.1.463)

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Python Developer

Environment: Red Hat Enterprise Linux (6.2)

Languages: Python 3.5.1, Teradata 15.10.07.14 and Unix shell script

Responsibilities:

  • Developing new code in Python 3.5.1 and Unix shell scripting based on requirements, using both built-in and user-defined functions, classes and objects.
  • Integrating new jobs, i.e., automating the execution of Teradata DDL and DML queries and stored procedures from Python scripts.
  • Maintaining, logging and fixing code across the entire application as part of application support.
  • Worked on code that connects to the Salesforce REST API, fetches data as JSON, generates reports from it and loads them to an SFTP server (see the sketch after this list).
  • Worked on a set of framework scripts, written entirely in Unix shell and Python, that provides various utilities supporting several projects within data management.
  • Scheduling and maintaining jobs in Crontab.
  • Gained hands-on knowledge of the 1-Automation scheduling tool.
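
A minimal sketch of the Salesforce-to-SFTP report flow referenced above, assuming the requests and paramiko libraries. The instance URL, OAuth token, SOQL query, report columns and SFTP credentials are hypothetical placeholders, not details from the actual project.

```python
import csv
import requests
import paramiko

# Hypothetical values for illustration -- real instance URLs,
# tokens and report queries would come from project configuration.
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_TOKEN"
SOQL = "SELECT Id, Name, Amount FROM Opportunity"

# Query Salesforce over its REST API; records come back as JSON.
resp = requests.get(
    INSTANCE_URL + "/services/data/v39.0/query",
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    params={"q": SOQL},
)
resp.raise_for_status()
records = resp.json()["records"]

# Write the fetched records to a local CSV report.
with open("opportunities.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["Id", "Name", "Amount"])
    for rec in records:
        writer.writerow([rec["Id"], rec["Name"], rec["Amount"]])

# Push the report to the downstream SFTP location.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="report_user", password="secret")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("opportunities.csv", "/inbound/opportunities.csv")
sftp.close()
transport.close()
```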

Confidential, Sunnyvale, CA

ETL Technical Analyst

Environment: Pentaho PDI 6.1

Languages: PostgreSQL 9.4

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements, documenting them and creating Trello cards to track tasks for self and other developers.
  • Preparing the technical design, creating table schemas and designing ETL jobs in the Pentaho Data Integration (PDI) 6.1 tool and PostgreSQL 9.4.
  • PostgreSQL scripting involves creating datasets using DDL and DML queries and writing PL/pgSQL stored procedures that execute a set of queries per the design (see the sketch after this list).
  • Involved in monitoring data integration jobs and scripts on a daily basis, and fixing bugs when issues arise.
  • Created PDI jobs that run various data validations and publish the results to the team to ensure data availability and integrity.
  • Documented, in a short span of time, the end-to-end process and data flow of the entire data warehouse, which had not been documented before.
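
A minimal sketch of the PostgreSQL scripting pattern referenced above, driven from Python via psycopg2 (an assumed client; the actual jobs may have run the SQL directly). Connection settings, table and function names are hypothetical.

```python
import psycopg2

# Hypothetical connection settings and object names for illustration.
conn = psycopg2.connect(host="localhost", dbname="dwh",
                        user="etl_user", password="secret")
cur = conn.cursor()

# DDL: staging and summary tables for the dataset being built.
cur.execute("""
    CREATE TABLE IF NOT EXISTS stg_orders (
        order_id  integer PRIMARY KEY,
        amount    numeric(12, 2),
        loaded_at timestamp DEFAULT now()
    )
""")
cur.execute("""
    CREATE TABLE IF NOT EXISTS order_summary (
        order_day    date,
        total_amount numeric(14, 2)
    )
""")

# A small PL/pgSQL function that executes a set of queries as one
# unit, mirroring the stored-procedure pattern described above.
cur.execute("""
    CREATE OR REPLACE FUNCTION refresh_order_summary() RETURNS void AS $$
    BEGIN
        DELETE FROM order_summary;
        INSERT INTO order_summary (order_day, total_amount)
        SELECT loaded_at::date, sum(amount)
        FROM stg_orders
        GROUP BY loaded_at::date;
    END;
    $$ LANGUAGE plpgsql
""")

# Invoke the function from the ETL driver script.
cur.execute("SELECT refresh_order_summary()")
conn.commit()
cur.close()
conn.close()
```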

Confidential, Troy, MI

Data Analyst

Environment: Pentaho Data Integration tool (version 4.2.0), Eclipse

Languages: MySQL 5.7.16, Linux shell scripting, Python 2.7.10

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements.
  • Preparing the technical design, creating table schemas and starting code generation based on the PRD. This involves MySQL 5.7.16, shell and Python 2.7.10 scripting.
  • MySQL scripting involves creating datasets using DDL and DML queries and writing stored procedures that execute a set of queries per the design.
  • Code development is done in Python, which automates MySQL queries using both built-in and user-defined functions, classes and objects (see the sketch after this list).
  • Involved in unit testing by preparing test cases.
  • Documenting the entire process/code for future reference.
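
A minimal sketch of the Python-driven MySQL automation referenced above, using a small user-defined class around a batch of DDL/DML statements. The mysql-connector-python driver, credentials and statements are assumptions for illustration.

```python
import mysql.connector

class QueryRunner(object):
    """Small user-defined class that automates a batch of MySQL queries."""

    def __init__(self, **conn_args):
        self.conn = mysql.connector.connect(**conn_args)

    def run_batch(self, statements):
        # Execute each DDL/DML statement in order, committing at the end.
        cur = self.conn.cursor()
        for stmt in statements:
            cur.execute(stmt)
        self.conn.commit()
        cur.close()

    def close(self):
        self.conn.close()

# Hypothetical credentials and statements for illustration.
runner = QueryRunner(host="localhost", user="etl",
                     password="secret", database="reporting")
runner.run_batch([
    "CREATE TABLE IF NOT EXISTS daily_totals (d DATE, total DECIMAL(12,2))",
    "DELETE FROM daily_totals WHERE d = CURDATE()",
    "INSERT INTO daily_totals SELECT CURDATE(), SUM(amount) FROM orders",
])
runner.close()
```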

Confidential

Data Analytics Engineer

Environment: Pentaho Data Integration tool (version 4.2.0), Eclipse

Languages: MySQL 5.7.16, Linux shell scripting, Python 2.7.10

Responsibilities:

  • Understanding and analyzing business and process requirements.
  • Preparing the low-level design document.
  • Constructing jobs in the Pentaho Kettle tool based on the requirements.
  • Involved in unit testing by preparing test cases.
  • Developed an automated shell script that handles data transfer between master and slave servers in case of a failure in the ETL tool/process (see the sketch after this list).
  • Documenting the entire process/code for future reference.
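
The failover utility itself was a shell script; purely for illustration, the sketch below expresses the same fall-back logic in Python (the primary language elsewhere in this resume), shelling out to rsync. The status file, hosts and paths are hypothetical.

```python
import os
import subprocess
import sys

# Hypothetical hosts, paths and status file; the production utility
# was a shell script with the same fall-back logic.
STATUS_FILE = "/var/run/etl.ok"   # written by the ETL tool on success
MASTER_DIR = "/data/exports/"
SLAVE_DEST = "slave01.example.com:/data/imports/"

if not os.path.exists(STATUS_FILE):
    # ETL failed or never ran: copy the files to the slave directly.
    rc = subprocess.call(["rsync", "-az", MASTER_DIR, SLAVE_DEST])
    sys.exit(rc)
```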

Confidential

Data Analytics Engineer

Environment: Eclipse

Languages: Python 2.7.10, MySQL 5.7.16, Linux shell scripting

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements.
  • Preparing the table schema and starting code generation based on the PRD. This involves both shell and Python scripting.
  • Code generation involves both new development and optimization of existing code in Python 2.7.10 and MySQL 5.7.16.
  • MySQL scripting involves creating datasets using DDL and DML queries and writing stored procedures that execute a set of queries per the design.
  • Code development is done in Python 2.7.10, which automates MySQL queries using both built-in and user-defined functions, classes and objects.
  • Worked on code that parses XML and munges the data into tables for end users (see the sketch after this list).
  • Involved in unit testing by preparing test cases.
  • For code optimizations, new code is deployed to a production-like environment, run over several instances alongside the existing version, and the results are cross-verified.
  • Documenting the entire process/code for future reference.
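
A minimal sketch of the XML parse-and-load step referenced above, using the standard-library xml.etree.ElementTree module. The feed layout, target table and credentials are hypothetical.

```python
import xml.etree.ElementTree as ET
import mysql.connector

# Hypothetical feed layout:
# <items><item id="1"><name>..</name><price>..</price></item>...</items>
tree = ET.parse("feed.xml")
rows = []
for item in tree.getroot().findall("item"):
    rows.append((
        int(item.get("id")),
        item.findtext("name"),
        float(item.findtext("price")),
    ))

# Load the munged rows into a table for end users.
conn = mysql.connector.connect(host="localhost", user="etl",
                               password="secret", database="reporting")
cur = conn.cursor()
cur.executemany(
    "REPLACE INTO items (id, name, price) VALUES (%s, %s, %s)", rows)
conn.commit()
conn.close()
```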

Confidential

Data Analytics Engineer

Environment: RStudio

Languages: R and Python

Responsibilities:

  • Connect to a live social media (Twitter) data stream; extract and store the data as CSV files.
  • Process the data in RStudio: restructure, filter and provide useful insights from it.
  • Perform sentiment analysis by comparing people's sentiment toward a particular item or subject (see the sketch after this list).
  • Provide visualization of the sentiment analytics.
  • Generate graphs and plots based on the data fetched or mined.
  • Documenting the entire process/code for future reference.
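
The sentiment analysis itself was done in R; purely as an illustration in Python (the primary language elsewhere in this resume), a naive lexicon-based scorer over the stored tweet CSVs might look like the sketch below. The CSV column name and the tiny word lists are assumptions.

```python
import csv
from collections import Counter

# Assumed CSV layout: one tweet per row with a "text" column.
# Tiny illustrative lexicons; real runs used proper sentiment lexicons.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "poor", "hate", "terrible", "sad"}

counts = Counter()
with open("tweets.csv") as fh:
    for row in csv.DictReader(fh):
        words = set(row["text"].lower().split())
        score = len(words & POSITIVE) - len(words & NEGATIVE)
        if score > 0:
            counts["positive"] += 1
        elif score < 0:
            counts["negative"] += 1
        else:
            counts["neutral"] += 1

print(counts)  # e.g. Counter({'neutral': 412, 'positive': 231, ...})
```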

Confidential

Data Analytics Engineer

Environment: Eclipse

Languages: Python 2.7, MySQL 5.7.16, Pentaho Kettle, Crontab scheduler

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements.
  • Preparing the technical design, creating the table schema and starting code generation based on the PRD. This involves MySQL, Python and shell scripting.
  • MySQL scripting involves creating datasets using DDL and DML queries and writing stored procedures that execute a set of queries per the design.
  • Code development is done in Python 2.7.10, which automates MySQL queries using both built-in and user-defined functions, classes and objects.
  • Involved in unit testing by preparing test cases.
  • Once all the tables are ready, ETL jobs are created for each table in the Pentaho Kettle tool to replicate the data to slave servers.
  • Documenting the entire process/code for future reference.

Confidential

Data Analytics Engineer

Environment: Eclipse

Languages: Python 2.7, MySQL 5.7.16, Pentaho Kettle 4.2.0, Crontab scheduler

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements.
  • Coordinating with the departments that deal with lead providers, data providers, marketing agencies, collections agencies, merchant service providers and the call center, from whom all data and the corresponding business logic are received.
  • Preparing the technical design, creating the table schema and starting code generation based on the PRD. This involves MySQL, shell and Python scripting.
  • MySQL scripting involves creating datasets using DDL and DML queries and writing stored procedures that execute a set of queries per the design.
  • Code development is done in Python 2.7.10, which automates MySQL queries using both built-in and user-defined functions, classes and objects.
  • Cross-checking the data against the financial numbers, cleansing the data when issues are found, analyzing it and presenting reports to the finance team when needed.
  • Creating ETL jobs for all tables in the Pentaho Kettle tool so that the data can be replicated to the slave servers.
  • Documenting the entire process/code for future reference.

Confidential

Data Analytics Engineer

Environment: Eclipse

Languages: Python 2.7, MySQL 5.7.16, Pentaho Kettle 4.2.0, Crontab scheduler

Responsibilities:

  • On a daily basis, the Confidential website is crawled using a Python script and the conversion values are stored in a database table (see the sketch after this list).
  • Create an ETL job in Pentaho Kettle tool, so that the data can be replicated to the slave servers.
  • Documenting the entire process/code for future reference.
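
A minimal sketch of the daily crawl referenced above. Since the real site is confidential, the page URL, the extraction pattern and the target table are hypothetical.

```python
import re
import requests
import mysql.connector

# Hypothetical URL and page pattern for illustration; the real
# target site and markup are confidential.
PAGE_URL = "https://www.example.com/daily-rates"
PATTERN = re.compile(r'data-conversion="([\d.]+)"')

html = requests.get(PAGE_URL, timeout=30).text
values = [float(v) for v in PATTERN.findall(html)]

# Store the scraped conversion values; scheduled daily via crontab.
conn = mysql.connector.connect(host="localhost", user="etl",
                               password="secret", database="reporting")
cur = conn.cursor()
cur.executemany(
    "INSERT INTO conversion_values (value) VALUES (%s)",
    [(v,) for v in values])
conn.commit()
conn.close()
```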

Confidential

Data Analytics Engineer

Environment: Eclipse

Languages: Python 2.7, MySQL 5.7, Crontab scheduler and Zabbix 2.4.5

Responsibilities:

  • Cross-checking the master data against the production variables to make sure the data is reliable and noise-free for its end users.
  • Developed an automated script to cross-check the dataset on a daily basis and publish a report to end users (see the sketch after this list).
  • These results are also fed into the Zabbix 2.4.5 tool, which triggers SMS alerts for people who are away and alarms for the production support team.
  • This involves identifying the tweaks or issues that emerge during data transformations and alerting the developers to fix them then and there.
  • Estimating the impact that a bug/issue will have on key metrics.
  • Documenting and presenting the issues found to the end users.
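
A minimal sketch of the daily cross-check and its Zabbix feed, pushing the result with the zabbix_sender command-line tool. The table names, Zabbix server, host name and item key are hypothetical.

```python
import subprocess
import mysql.connector

# Daily cross-check: row counts in master vs. production tables.
# Hostnames, table names and the Zabbix item key are hypothetical.
conn = mysql.connector.connect(host="localhost", user="etl",
                               password="secret", database="reporting")
cur = conn.cursor()
cur.execute("SELECT COUNT(*) FROM master_orders")
master_count = cur.fetchone()[0]
cur.execute("SELECT COUNT(*) FROM prod_orders")
prod_count = cur.fetchone()[0]
conn.close()

mismatch = abs(master_count - prod_count)

# Feed the result into Zabbix via the zabbix_sender CLI; a trigger
# on the server side raises SMS alerts when the mismatch is non-zero.
subprocess.call([
    "zabbix_sender",
    "-z", "zabbix.example.com",     # Zabbix server
    "-s", "etl-host",               # monitored host name
    "-k", "etl.rowcount.mismatch",  # item key (hypothetical)
    "-o", str(mismatch),
])
```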

Confidential

Data Analytics Engineer

Environment: Exact Target (ET) portal, Eclipse

Languages: Python 2.7, MySQL 5.7.1, Pentaho Kettle 4.2.0, Crontab scheduler

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements.
  • Creating data schemas, SQL queries and jobs in the ET portal that fetch table data from ET servers and download it to the SFTP server.
  • Preparing the table schema and starting code generation based on the PRD. This involves MySQL, Python and shell scripting.
  • Creating ETL jobs for all tables in the Pentaho Kettle tool so that the data can be replicated to the slave servers.
  • Once the code and jobs are ready, scheduling them in Crontab.
  • Documenting the entire process/code for future reference.

Confidential

Data Analytics Engineer

Environment: Eclipse, Gitlab

Languages: Python 2.7, MySQL 5.7.1 and XML

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements.
  • Preparing the table schema and starting code generation based on the PRD. This involves Python, shell and PostgreSQL scripting.
  • Involved in unit testing by preparing test cases.
  • For code optimizations, new code is deployed to a production-like environment, run over several instances alongside the existing version, and the results are cross-verified.
  • Documenting the entire process/code for future reference.

Confidential

Data Analytics Engineer

Environment: Eclipse, Gitlab

Languages: Python 2.7, MySQL 5.7.1 and XML

Responsibilities:

  • Gathering requirements from end users to understand their needs; understanding and analyzing the given requirements.
  • Preparing the table schema and starting code generation based on the PRD. This involves Python, shell and MySQL scripting.
  • MySQL scripting involves creating datasets using DDL and DML queries and writing stored procedures that execute a set of queries per the design.
  • Code development is done in Python 2.7.10, which automates MySQL queries using both built-in and user-defined functions, classes and objects.
  • For code optimizations, new code is deployed to a production-like environment, run over several instances alongside the existing version, and the results are cross-verified.
  • Documenting the entire process/code for future reference.
