
Sr Database Developer Resume


NJ

Summary

  • 14 years of professional experience as a Sr. Database Developer/Analyst, Performance Tuning Engineer, and SQL Engineer. Participated in all phases of software development.
  • Sybase ASE, Sybase IQ, MS SQL Server, Oracle, PL-SQL, T-SQL, DB-Artisan, Rapid SQL.
  • Database Tool development (Data archival, Database compare).
  • UNIX Shell & PERL scripting.

Areas of expertise include:

  • Database Development and General Administration
  • Extensive Experience in Performance Tuning (batch processes)
      • SQL, database, and server tuning; index design, restructuring, and optimization; cache partitioning.
  • Extensively developed stored procedures and triggers, created data feeds, and performed volume testing.
  • Automated backend processes through Shell / Perl Scripting.

Technical Skills:

  • Databases: Sybase ASE 15.0.3/12.5.3/12.5.0.3/12.0/11.9.2/11.5.1/10, Sybase IQ 15.2.0/12.6,
    SQL Server 2000, Oracle 10g/8, MS-Access 2000/97
  • Database Tools: DB-Artisan 8.5.5/8.0.1, Rapid SQL 7.4/5.7, Sybase Central 4.1.1, SQL-Advantage 3.0, ADS 10.0, BCP, I-SQL, APT 5.3, DWB 5.2, SQL-EM, DTS, Query Analyzer, Profiler, Optdiag, sp_sysmon

  • Languages: C, C++, PERL 5.0, XML, T-SQL, PL/SQL
  • ETL: Perl, Informatica
  • Operating Systems: Sun Solaris 10/8/7, HP-UX 11.0/10.0, Windows NT
  • Data Modeling Tools: Erwin, Power Designer
  • SCM: ClearCase, CVS, Opsware, SCCS, PVCS Merant 6.8, VSS
  • Batch Scheduling: Autosys, cron, Control-M
  • Others: Mantas Trading Compliance 4.2, Platform Symphony, CBB Grid Environment, Crystal Reports 9.0, Visio 2000

Education:

GNIIT in Systems Management

Certification:
Certification in Sybase Administration

PROFESSIONAL EXPERIENCE

Sr. Database Developer Aug 2011 to Present
Confidential,
Technical: Sybase 15.0.3, Sybase IQ 15.2.0, ADS 10.0, T-SQL, I-SQL, BCP, Solaris 10, Korn shell scripting, Control-M.

Responsibilities:
  • Data Center Migration (MRO to RTP): This project moved the ITIS data warehouse from MRO to RTP. Tested batch jobs and certified the results. The effort enabled annual savings in maintenance costs while increasing capacity and disaster-recovery capability.
  • Added ‘data transmission’ validation check to critical feeds as part of an internal audit requirement.
  • Created a “Last Look Trading” database: incoming orders are filled internally if matched; otherwise the order is re-routed to third-party brokers (see the matching sketch after this list).
  • Created tables and set up feeds (ASE / IQ) for RouteHub, TwoFour and PreClear.
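
A minimal, hypothetical T-SQL sketch of the last-look decision; every table, column, and status value here is illustrative, not taken from the project:

    -- Fill the order internally when the book has an opposite-side match;
    -- otherwise flag it for re-routing to third-party brokers.
    create procedure usp_last_look
        @order_id int
    as
    begin
        if exists (select 1
                   from   incoming_orders o, internal_book b
                   where  o.order_id = @order_id
                   and    b.symbol   = o.symbol
                   and    b.side    <> o.side
                   and    b.qty     >= o.qty
                   and    b.price    = o.price)
            update incoming_orders set status = 'FILLED_INTERNAL'
            where  order_id = @order_id
        else
            update incoming_orders set status = 'ROUTE_EXTERNAL'
            where  order_id = @order_id
    end
    go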

Sr. Database Developer Jul 2009 to Jun 2011
Confidential,
Technical: Sybase 15.0.3/12.5.3, Oracle 10g, SQL Server 2000, DB-Artisan 8.5.5, T-SQL, I-SQL, BCP, stored procedures, Solaris 10/7, Perl, Korn shell scripting, Control-M.

GPT (runs on Sybase) and SPARTA (runs on Oracle) are applications used by product controllers primarily to check the price variance between internal prices and external prices from sources such as Reuters, Bloomberg, and GMI. An automated batch cycle runs daily to process feeds from the Americas, Europe, and Asia-Pacific regions. The batch processes perform data validation and external price selection, calculate the external market value and P&L difference, and upload the changes to the GPT/SPARTA systems.

Responsibilities:
  • Implemented HUGO product code expansion (data type change), an enterprise-wide effort.
  • Worked on database violations generated by objects carrying public roles in production databases, as part of SOX compliance. Analyzed current roles and permissions, wrote UNIX scripts to revoke public access from the Americas, Europe, and APAC region databases, and ensured that all database objects carry appropriate permissions (a revoke-generation sketch follows this list).
  • Optimized the price testing, data upload, and mapping modules for product controllers, achieving a 65% improvement in run time (40 minutes down to 14 minutes).
  • Enhanced stored procedures for fair value accounting.
  • Rebuilt highly fragmented indexes on the development databases, made the case to management for a periodic rebuild of the most actively used indexes, and coordinated the production implementation with the DBA team. The rebuild recovered 35 GB of free space, shortened the batch cycle by 2.5 hours, and cut weekly database maintenance by 2 hours (see the rebuild sketch after this list).
  • Extensively worked on data archival and purging.
  • GPT/SPARTA production support.
  • Architected the development environment for the offshore team and handled critical issues such as data masking of sensitive information (e.g., counterparty data).
  • Implemented the DOL (data-only locking) scheme on high-traffic tables to improve concurrency and minimize deadlocks and blocking (see the locking sketch after this list).
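
A minimal sketch of the revoke generation mentioned above; sysprotects and user_id('public') are standard Sybase catalogs, while replaying the generated statements through isql is an assumption about how the UNIX scripts worked:

    -- Generate one revoke statement per object granted to public;
    -- the output would be captured and replayed via isql in each region.
    select distinct 'revoke all on ' + object_name(id) + ' from public'
    from   sysprotects
    where  uid = user_id('public')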
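
A sketch of the periodic index rebuild, assuming the tables use data-only locking (reorg rebuild requires it on ASE 12.5); the table name is illustrative:

    -- Reclaim space in a fragmented table, then refresh optimizer statistics
    reorg rebuild position_history
    go
    update statistics position_history
    go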
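
The locking change itself is a one-liner, assuming datarows was the target scheme; the table name is illustrative:

    -- Move a high-traffic table from allpages to datarows locking
    alter table price_upload lock datarows
    go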
Tool(s) - Data Archival:
  • Developed a versatile data-archival tool, which archives data to an archival database over the weekend as part of the weekly database maintenance schedule.
  • Added functionality so the tool does not fail on log space: a threshold fires to clear the log, keeping archival/purging running smoothly (see the threshold sketch below).
  • An automatic ‘update statistics’ run ensures that the distributions of key values stay current.
  • A detailed history of the purge process is logged into a log table.
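
A minimal sketch of the log-space safeguard, using Sybase's standard threshold mechanism; the procedure name, database name, and page count are assumptions:

    -- Threshold procedure: ASE fires it when free log space drops below the limit
    create procedure sp_arch_log_threshold
        @dbname      varchar(30),
        @segmentname varchar(30),
        @space_left  int,
        @status      int
    as
        -- Clear the log so the archival/purge batch keeps running
        -- (assumes the archival database is not part of a dump sequence)
        dump transaction @dbname with truncate_only
    go
    -- Register the threshold; 2000 free pages on the log segment is illustrative
    exec sp_addthreshold arch_db, logsegment, 2000, sp_arch_log_threshold
    go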

Sr. Developer / Performance Tuning Engineer Apr 2008 to Jun 2009
Confidential,
Technical: Sybase 12.5.3, Sybase Central 4.1, T-SQL, C++, Solaris, Korn shell, Platform Symphony, CBB Grid Environment, ClearCase.

Sampras (Simulations Across Multiple Paths Recalculations And Sorting) is the principal credit calculation engine (based on Monte Carlo simulations) for the firm’s OTC derivative trading portfolio, covering 95% (2 million+ transactions) of the firm’s total external derivative population in a grid environment. Sampras re-calculates the credit risk reserve requirement for derivative trading: the Credit Valuation Adjustment (CVA) calculation.

This project was intended to increase the performance and scalability of the system. The system had evolved over its 20+ years of existence and had gone through several re-architectures. SLAs were missed mainly because of the Bear Stearns acquisition and continually increasing volume. Optimization of the SQL queries and C++ code, together with DQ improvements, was identified as the key to tackling the situation.

Optimization of the SAMPRAS batch and DQ improvements, part of an overall resiliency effort, made CVA reports available to the business well ahead of the defined SLAs. Exposures were delivered by 5 AM (SLA 6 AM) and P&L explained by 9 AM (SLA 11 AM), enabling the business to make faster and better decisions.

Responsibilities:

  • Identified and optimized SQL statements in stored procedures and reports, yielding a 35% improvement in the critical-path stored procedures and a 53% improvement in the reports.
  • Converted report SQL into stored procedure calls to avoid repeatedly opening connections to the database.
  • Converted clustered indexes on temp tables in stored procedures and report SQL to non-clustered indexes to improve performance.
  • Recommended that the DBA team increase the user log cache (ULC) from 2K to 8K, based on the observation that transactions had started taking up to 14 minutes to commit. The change was implemented and transactions resumed committing at a normal pace (a configuration sketch follows the allocation example below).
  • Optimized stored procedures for Basel Allocation module.
  • Fine-tuned the logic that allocates CPUs to the various tranches, based on the number of paths/deals per tranche, for load balancing as follows:


Ct = CPU * (Pt * power(Dt, 1.5)) / sum(Pt * power(Dt, 1.5)), where the sum runs over all tranches

The input data for this calculation is:
Number of CPUs available = CPU,
Number of paths per tranche = Pt,
Number of deals per tranche = Dt

The output data is: Number of CPUs per tranche = Ct

In the final allocation, the CPUs left over from these calculations (the initial calculation truncates to an integer) are reallocated to the tranches with the most paths.
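
A worked sketch of this allocation in T-SQL; the tranche table, its values, and the CPU count are illustrative, not taken from the project:

    -- Hypothetical tranche inputs: paths (Pt) and deals (Dt) per tranche
    create table #tranches (tranche_id int, paths int, deals int)
    insert #tranches values (1, 1000, 400)
    insert #tranches values (2, 2000, 900)
    insert #tranches values (3,  500, 100)

    declare @cpu int, @wsum float
    select @cpu = 64                            -- number of CPUs available
    select @wsum = sum(paths * power(convert(float, deals), 1.5))
    from   #tranches

    -- floor() truncates to an integer, so some CPUs are left over; those
    -- are then handed to the tranches with the most paths, as described above.
    select tranche_id,
           floor(@cpu * paths * power(convert(float, deals), 1.5) / @wsum) as cpus
    from   #tranches
    order  by paths desc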
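
The ULC recommendation above amounts to a one-line server reconfiguration; this sketch assumes the standard Sybase ASE parameter name, with the size given in bytes:

    -- Raise the per-user log cache from 2K to 8K to speed up commits
    exec sp_configure "user log cache size", 8192
    go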
