
Python Developer Resume


New York, NY

SUMMARY:

  • Over 15 years of experience in ETL development, data migration, data architecture, and data manipulation and analysis in a BI/DW context, working with large volumes of data.
  • Primary focus on ETL, data migration, Python, and AWS.
  • 2 years of data engineering experience using AWS, Python, Snowflake, and Redshift.
  • 2 years of experience in cloud ETL architecture, development, data processing, and cost and storage analysis of data-intensive workloads on AWS.
  • Effective written and verbal communicator.

SKILLS:

Databases: Oracle 10g/11g/12c, Snowflake

Languages: SQL, PL/SQL, Python, Shell (ksh, bash), and JavaScript

MPP BI Appliance: Exadata V2, Netezza, Vertica, AWS Redshift

ETL: Talend, GETL (Confidential ETL), Airflow

ER Tools: PowerDesigner

Version Control: CVS, Perforce, Git

Oracle: SQL*Loader, DataPump, VLDB, AWR, RAC, Exadata, SQL, PL/SQL

Python: Python 2.x, 3.x, wxPython 2.8, boto, PySpark, PyCharm

AWS: S3, EC2, Lambda, RDS, Redshift, Airflow, Athena, DynamoDB

Reporting: QlikView, Tableau

EXPERIENCE:

Confidential, NEW YORK, NY

PYTHON DEVELOPER

Responsibilities:

  • Architected the Position, PnL, and FICC data migration process from Sybase IQ, SQL Server, and Oracle into Snowflake and Vertica, and documented it in detail.
  • Implemented a “Big Bang” migration of all position data from Sybase IQ.
  • Coded Snowflake data loaders in Python and reorganized large volumes of data (a minimal loader sketch follows this list).
  • Created and maintained Position and PnL batch pipelines in a parallel production environment.
  • Created Python ETL pipelines for teardown and “trickle” data migration from different backends to Snowflake, SQL Server, and Vertica.
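
A minimal sketch of the kind of Snowflake loader described above, assuming the snowflake-connector-python package; the connection parameters, table name, and file layout are illustrative placeholders, not the actual proprietary loaders:

# Sketch of a Python Snowflake loader: upload a CSV to the table stage
# with PUT, then bulk-load it with COPY INTO. All names are placeholders.
import snowflake.connector

def load_positions(csv_path: str) -> None:
    conn = snowflake.connector.connect(
        account="my_account",      # placeholder credentials
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="POSITIONS_DB",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # PUT compresses and uploads the file into the table stage.
        cur.execute(f"PUT file://{csv_path} @%POSITIONS AUTO_COMPRESS=TRUE")
        # COPY INTO bulk-loads everything staged for the table.
        cur.execute(
            "COPY INTO POSITIONS FROM @%POSITIONS "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()

Staging with PUT followed by a bulk COPY INTO scales much better for large volumes than row-by-row INSERTs, which is why it is the usual pattern for loaders like these.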

Confidential, NJ

PYTHON DEVELOPER

Responsibilities:

  • Migrated and reorganized large volumes of data.
  • Created and maintained micro batch pipelines.
  • Analyzed cost, performance, and storage of data-intensive workloads on AWS.
  • Enhanced data collection procedures.
  • Processed, cleansed, and verified data integrity.
  • Implemented a Data Quality framework using AWS Athena, Snowflake, Airflow, and Python (see the Airflow/Athena sketch after this list).
  • Converted existing nzPL/SQL ETL pipelines to PySpark using its SQL interface (a short PySpark fragment also follows this list).
  • As part of AWS Glue/Kinesis training, created an end-to-end ETL workflow for a Netezza-to-Redshift data migration POC.
  • Worked with business users; collected and documented requirements, BRDs, FRDs, and TSDs.
  • Designed and implemented an IOB OATS report for 5 random order families using Python and PL/SQL.
  • Designed and implemented solutions to submit, review, and correct OATS FORE files and OATS ROE events.
  • Implemented an application for review of historical data submissions, corrections, and orders/executions.
  • Created Control-M schedules for one-time and recurring execution of OATS workflows.
  • Created multiple Python and PL/SQL APIs for reporting dashboards.
  • Participated in modeling sessions, technical design reviews, functional spec reviews, and code reviews.
  • Created, modified, tuned, and optimized complex Python, SQL, and PL/SQL code for parsing and loading vendor XML files.
  • Provided L3 production support of Depot.
  • Created data models for Data Warehousing projects for SMART and GMA reporting platforms. Optimized PL/SQL code, SQL queries, and materialized views for summary and detail MBR and Headcount reports.
  • Created multiple database objects: tables, views, materialized views, indexes, partitions, procedures, packages, sequences, user types.
  • Estimated, analyzed, designed, implemented, and tested database projects of varying scope. Wrote design and user project documentation.
  • Optimized and tested existing solutions and implemented new ones using PL/SQL, SQL, and shell (bash). Parallelized existing PL/SQL ETL routines using PL/SQL and DBMS_JOB.
  • Created functions and packages for XML ingestion and loading using XMLParse, XPath, and XSLT.
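
A minimal sketch of one check in a Data Quality framework of the kind described above, assuming boto3's Athena client inside a plain Airflow PythonOperator; the DAG id, database, table, rule, and S3 output location are hypothetical, not the production framework's:

# Sketch of a daily data-quality DAG: run an Athena query that counts
# rule violations and fail the task if any are found. All names are
# illustrative placeholders.
import time
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

ATHENA_DB = "dq_db"                        # hypothetical database
RESULTS = "s3://dq-query-results/"         # hypothetical output location

def check_null_keys(**_):
    athena = boto3.client("athena")
    qid = athena.start_query_execution(
        QueryString="SELECT COUNT(*) FROM trades WHERE trade_id IS NULL",
        QueryExecutionContext={"Database": ATHENA_DB},
        ResultConfiguration={"OutputLocation": RESULTS},
    )["QueryExecutionId"]
    while True:                            # poll until the query finishes
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} ended in state {state}")
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    bad = int(rows[1]["Data"][0]["VarCharValue"])   # row 0 is the header
    if bad:
        raise ValueError(f"{bad} rows have a NULL trade_id")

with DAG(
    dag_id="dq_trades_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="check_null_keys", python_callable=check_null_keys)

And a minimal PySpark (SQL interface) fragment illustrating the conversion pattern: the set-based SQL at the heart of an nzPL/SQL procedure survives largely intact as a spark.sql() call. Paths and table names are again placeholders:

# Sketch of the nzPL/SQL-to-PySpark conversion pattern: register the
# source as a temp view, keep the aggregation in SQL, write the result.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("positions_etl").getOrCreate()
spark.read.parquet("s3://etl-bucket/trades/").createOrReplaceTempView("trades")
daily = spark.sql("""
    SELECT trade_date, book, SUM(notional) AS notional
    FROM trades
    GROUP BY trade_date, book
""")
daily.write.mode("overwrite").parquet("s3://etl-bucket/daily_positions/")

Keeping the original SQL inside spark.sql() minimizes rewrite risk during migration; only the procedural scaffolding around it needs to be re-expressed in Python.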

Confidential, MOUNTAIN VIEW, CA

DATA (BI) ENGINEER

Responsibilities:

  • Involved in the development and maintenance of the Ads Data BI data warehouse (BI-DWH).
  • Day-to-day duties included development of new ETL pipelines, monitoring of GETL and its internal components running on Confidential cloud computing infrastructure, job scheduling, log processing, and maintenance of development documentation.
  • Created application configuration and deployment packages using Python, Shell, and Perforce in compliance with corporate standards.
  • Performed system maintenance tasks such as space management, archival backup, and performance monitoring. Wrote scripts for transferring large files from external systems (a transfer sketch follows this list).
  • Assisted in operational support and application maintenance efforts. Resolved data validation, job preemption quota, connectivity, and cloud "weather" issues.
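
A minimal sketch of the large-file transfer pattern mentioned above, assuming paramiko for SFTP; the host, user, key, and paths are placeholders (the production scripts were internal):

# Sketch of a large-file SFTP pull with a checksum for integrity checks.
# Host, credentials, and paths are illustrative placeholders.
import hashlib
import os

import paramiko

def fetch_file(remote_path: str, local_path: str) -> str:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("feeds.example.com", username="etl",
                key_filename=os.path.expanduser("~/.ssh/id_rsa"))
    try:
        ssh.open_sftp().get(remote_path, local_path)  # streams in chunks
    finally:
        ssh.close()
    md5 = hashlib.md5()
    with open(local_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            md5.update(chunk)
    return md5.hexdigest()  # compare against the sender's published checksum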

Confidential, STAMFORD, CT

ORACLE PL/SQL DEVELOPER

Responsibilities:

  • As an Oracle PL/SQL Developer and IT FIRC FI DATA team member, was responsible for the design and maintenance of the CDR database (Fixed Income Central Data Repository) and for Oracle development of transactional and reporting applications and custom ETL solutions.
  • Analyzed data requirements for FI clients and designed data feeds, Perl parsers, and PL/SQL APIs for a number of FIRC FI applications, including FIPB (Fixed Income Primary Brokerage), Credit Arbitrage, Excalibur, and MarginFlex.
  • Troubleshot problems and provided workarounds for them. Tuned SQL and PL/SQL to improve database performance.
  • Created database access layer code in C#/.NET for a prototype reporting framework.
  • Created multiple database objects for ETL batch processing and client reporting APIs: tables, sequences, views, materialized views, packages, stored procedures, triggers, synonyms, collections, types, indexes, and Dynamic SQL.
  • Estimated, analyzed, designed, implemented, and tested database projects of varying scope. Wrote design and user project documentation.

Confidential, NEWARK, NJ

ORACLE DEVELOPER

Responsibilities:

  • As Oracle Developer, was responsible for the design and development of an order management application for Confidential and APL (Confidential) using Python and Oracle.
  • Responsibilities included requirement gathering, system architecture design, implementation, testing, and deployment.
  • Major accomplishments included a complete refactoring of the existing PL/SQL code base and creation of a trade order simulator.
  • Designed and implemented a web UI for testing and simulation of APL trade orders using Python and PL/SQL (a sketch of the Python-to-PL/SQL call pattern follows this list).
  • Documented all existing and new code, presentation logic, and business logic.
  • Performed code and database changes in PL/SQL and database design in Oracle.
  • Fixed bugs, applied patches and participated in release planning.
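
A minimal sketch of the Python-to-PL/SQL call pattern behind an order simulator UI like the one above, assuming the cx_Oracle driver; the package, procedure, DSN, and credentials are hypothetical:

# Sketch of invoking a PL/SQL procedure from Python via cx_Oracle.
# Package/procedure names, DSN, and credentials are hypothetical.
import cx_Oracle

def simulate_order(symbol: str, qty: int) -> int:
    conn = cx_Oracle.connect("apl_user", "...", "dbhost/ORCL")
    try:
        cur = conn.cursor()
        order_id = cur.var(cx_Oracle.NUMBER)   # OUT bind for the new order id
        cur.callproc("order_sim_pkg.submit_order", [symbol, qty, order_id])
        conn.commit()
        return int(order_id.getvalue())
    finally:
        conn.close()

Keeping the business logic in PL/SQL and calling it through thin Python wrappers like this lets the web UI exercise the same code paths the production order flow uses.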

Confidential, STAMFORD, CT

ORACLE DEVELOPER

Responsibilities:

  • Oracle Developer responsible for development of a new trading platform for structured credit derivatives.
  • Supported the existing Structured Credit products Scorpius and Orion.
  • Analyzed requirements and contributed to data model design.
  • Created multiple database objects for new application: tables, sequences, views, materialized views, PL/SQL packages, stored procedures, triggers, synonyms, collections, types and indexes.
  • Fixed bugs, applied patches and participated in release planning.
  • Tuned SQL and PL/SQL to improve database performance.
