Sr. Database Developer Resume
Summary
- 12 years of professional experience in the capacity of Architect, Performance Tuning Specialist, Sr. Database Developer/Analyst, and SQL Engineer. Participated in all phases of development: functional and design specification, coding, testing, bug fixing, code review, and production support.
- Sybase ASE, Sybase IQ, T-SQL, MS SQL Server, Oracle, PL/SQL, DB-Artisan, Rapid SQL.
- UNIX shell scripting (Korn shell) and Perl scripting.
- Relational and Dimensional Data Modeling.
- Database Tool development (Data archival, Intelligent re-org, Database compare).
- Delivery Management.
Areas of expertise include:
- Database Development and General Administration
- Extensive Experience in Database Optimization and Performance Tuning
- SQL, database, and server tuning; index design, restructuring, and optimization; tempdb optimization; cache partitioning
- Capacity planning, handling large databases, data ingestion/cleansing/massaging, self-healing programs, intra-query parallelism.
- Database Design / Modeling / Migration
- Extensive development of stored procedures, triggers, and data feeds to downstream systems.
- Automated backend processes through shell/Perl scripting; extensively optimized batch processes.
- System analysis, data modeling, and volume testing
Technical Skills:
- Database: Sybase ASE 15.0.2/12.5.3/12.5.0.3/12.0/11.9.2/11.5.1/10, Sybase IQ 12.6, SQL Server 2000, Oracle 10g/8, MS-Access 2000/97
- Database Tools: DB-Artisan 8.5.5/8.0.1, Rapid SQL 7.4/5.7, Sybase Central 4.1.1, SQL-Advantage 3.0, Cast SQL Builder 4.3, BCP, I-SQL, APT 5.3, DWB 5.2, SQL-EM, DTS, Query Analyzer, Profiler, Optdiag, sp_sysmon, sp_monitor, sp_object_stats
- Languages: C, C++, Perl 5.0, VB 6.0, XML, T-SQL, PL/SQL, PowerBuilder 7.0
- ETL: Informatica, Perl
- Operating System: Sun Solaris 10/8/7, HP-UX 11.0/10.0, Windows NT/XP/2000/98/95
- Data Modeling Tool: Erwin, Power Designer
- SCM: ClearCase, CVS, Opsware, SCCS, Merant PVCS 6.8, VSS
- Batch Scheduling: Control-M, Autosys, cron
- Others: Mantas Trading Compliance 4.2, Platform Symphony, CBB Grid Environment, Crystal Reports 9.0, Visio 2000, VersionOne.
Education:
Masters in Software Technology and Systems Management (Confidential)
PGDCA with C++, Data Structures, System Analysis and Design, RDBMS
BS from Confidential University
Certification:
Certification in Sybase Administration
Certification in SQL Server
PROFESSIONAL EXPERIENCE
Architect – Sr. Database Developer, Confidential Jul 2009 to Present
Project: GPT/SPARTA
Technical: Sybase 15.0.2/12.5.3, Oracle 10g, DB-Artisan 8.5.5, T-SQL, I-SQL, BCP, Stored procedures, Indexes, Solaris 10/7, Informatica, Perl, Korn shell scripting.
The GPT (runs on Sybase) and SPARTA (runs on Oracle) applications are used by product controllers primarily to check the price variance between internal prices and external prices from sources such as Reuters, Bloomberg, and GMI. An automated batch process runs daily to process the feeds from the Americas, Europe, and Asia Pacific regions. The batch process performs data validation and external price selection, calculates external market value and the P&L difference, uploads the changes to the GPT/SPARTA databases, and sends fair value extracts to downstream systems.
Responsibilities:
- Upgraded Sybase 12.5.3 to Sybase 15.0.2. Responsible for testing batch jobs for the Americas, Europe, and Asia Pacific regions.
- Optimized the batch cycle run time from 7 hours to 4.5 hours.
- Optimized the price testing, data upload, and mapping modules for product controllers, a 65% improvement in run time from 40 minutes to 14 minutes.
- Developed/optimized stored procedures for fair value accounting, a feed to downstream systems.
- Implemented HUGO product code expansion (data type change), a firm wide effort.
- Rebuilt highly fragmented indexes (which fragment quickly due to the nature of the application) on the development databases, convinced the manager that the application needs periodic rebuilds of its most actively used indexes, and coordinated the production implementation with the DBA team. The index rebuilds recovered 35 GB of free space and improved weekly database maintenance by 2 hours.
- Worked on database violations generated by objects carrying public roles in production databases, as part of SOX compliance. Analyzed current roles and permissions, wrote UNIX scripts to revoke public access from the Americas, Europe, and APAC region databases, and ensured that all database objects have appropriate permissions (see the sketch at the end of this list).
- Played a key role in masking highly confidential data.
- Worked as the SCM within the team: collected release files from developers, consolidated them, and prepared the release. Coordinated the release with the DBAs and SAs and ensured that database objects/UNIX scripts were deployed correctly.
- Participated in DR/QMW tests and performed checkouts to ensure that price testing applications are working as expected.
- Created fair value data feeds for the downstream systems using Informatica.
- Extensively worked on data archival and purging.
- Ongoing GPT/SPARTA production support.
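Below is a minimal T-SQL sketch of the kind of public-permission cleanup described above (Sybase ASE system tables); it is illustrative rather than the exact production script, and the generated statements were reviewed before execution:

```sql
-- Illustrative sketch (Sybase ASE): list every object in the current
-- database on which the 'public' group holds any permission and emit a
-- matching REVOKE statement; the output is saved and run as a script.
select distinct
       'revoke all on ' + user_name(o.uid) + '.' + o.name + ' from public'
from   sysprotects p, sysobjects o
where  p.id  = o.id
and    p.uid = user_id('public')
```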
Architect
- Architected the development environment for the offshore team and handled critical issues such as masking of sensitive data (e.g. counterparty information).
- Integrated business rules into the database; drew E-R diagrams and mapped them to tables; performed normalization and de-normalization (for optimization).
- Data partitioning, capacity planning, system architecture, business and systems metrics.
- Implemented the DOL (data-only locking) scheme for high-traffic tables to minimize deadlocks/blocks and improve concurrency.
- Index design, rebuild, tuning, and covering indexes.
Tool(s) - Data Archival
- Developed a versatile data-archival tool, which archives data older than 3 months to an archival database and deletes it from the real-time database over the weekend as part of the weekly database maintenance schedule. Data older than 2 years is deleted from the archival database per the company retention policy (a sketch of the flow follows this list).
- Added functionality to ensure that it does not fail on log space. A threshold is fired to clear the log to ensure that data archival/purging runs smoothly.
- An automatic 'update statistics' runs after the table's data is deleted to ensure that the distributions of key values are up to date.
- A detailed history of the purge process is logged into a log table.
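A simplified T-SQL sketch of that archive/purge flow is shown below; the database, table, and column names (archdb, prices, price_date, purge_history) are hypothetical stand-ins for the real schema, and the actual tool is driven from Korn shell:

```sql
-- Sketch of one table's archive/purge pass (Sybase ASE T-SQL),
-- run from the real-time database; archdb is the archival database.
declare @cutoff datetime, @rows int
select  @cutoff = dateadd(mm, -3, getdate())        -- rows older than 3 months

-- 1. Copy aged rows into the archival database.
insert archdb..prices select * from prices where price_date < @cutoff

-- 2. Delete the archived rows in small batches so the log threshold
--    (which clears the log) can keep up and the delete never fails on
--    log space.
set rowcount 5000
select @rows = 1
while @rows > 0
begin
    delete prices where price_date < @cutoff
    select @rows = @@rowcount
end
set rowcount 0

-- 3. Refresh distribution statistics after the large delete.
update statistics prices

-- 4. Log a detailed record of the purge for history/audit.
insert purge_history (table_name, cutoff_date, run_date)
values ('prices', @cutoff, getdate())
```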
Lead Database Developer/Performance tuning specialist, Confidential Apr 2008 to Jun 2009
Project: Optimization of SAMPRAS (Simulations Across Multiple Paths Recalculations And Sorting) Batch Processing
Technical: Sybase 12.5.3, Sybase Central 4.1, T-SQL, C++, Solaris, Korn shell scripting, ClearCase, Platform Symphony, CBB Grid Environment.
Sampras is the principal credit calculation engine (based on Monte Carlo simulations) for the firm's OTC derivative trading portfolio, covering 95% (2 million+ transactions) of the firm's total external derivative population in a Grid environment. Sampras recalculates the credit risk reserve requirement for derivative trading: the Credit Valuation Adjustment (CVA) calculation. This project was intended to increase the performance and scalability of the system, which has evolved over its 20+ years of existence and has gone through several re-architectures. SLAs were being missed, mainly because of the Bear Stearns acquisition and continually increasing volume. Optimizing the SQL queries and C++ code and restructuring the batch were identified as key to tackling the situation.
Optimization of the SAMPRAS batch processing times and DQ improvements, as part of the overall resiliency effort, made CVA reports available to the business much earlier than the defined SLAs. Exposures were delivered by 5 AM (SLA 6 AM) and P&L Explained by 9 AM (SLA 11 AM), enabling the business to make faster and better decisions.
Responsibilities:
- Identified and optimized pain-point SQL statements in stored procedures and reports, a 35% improvement in the critical-path stored procedures and a 53% improvement in the reports.
- Identified circular indices and deleted redundant indices; restructured indexes with very poor selectivity.
- Converted clustered indexes on temp tables in stored procedure and report SQL to nonclustered indexes.
- Rewrote SARGs so the query optimizer could choose the best index-based query plans.
- Reengineered batch steps to make them run in parallel.
- Recommended to management increasing the number of engines from 16 to 22 and the cache by 20 GB, and played an instrumental role in the implementation.
- Added 'lock table' statements and changed high-traffic tables from APL to DOL locking (identified via sp_object_stats) to tackle deadlock/block issues.
- Recommended that the DBAs increase the user log cache (ULC) from 2 KB to 8 KB, based on the observation that transactions had started taking up to 14 minutes to commit. The change was implemented by the DBAs and transactions committed at a normal pace again.
- Suggested multiple tempdbs as a solution to heavy tempdb usage and frequent blocking on system tables; implemented initially with 2 tempdbs and later expanded to 3.
- Partitioned key tables for performance improvement.
- Optimized stored procedures for Basel Allocation module.
- Created a script to report processes blocked for more than 2 minutes.
- Fine-tuned the logic that allocates CPUs to the various tranches, based on the number of paths and deals per tranche, for load balancing:
Ct = CPU * Pt * power(Dt, 1.5) / sum(Pt * power(Dt, 1.5))
where CPU = number of CPUs available, Pt = number of paths per tranche, Dt = number of deals per tranche, and Ct = number of CPUs allocated to a tranche. In the final allocation, the CPUs left over from the calculation (because the initial calculation uses integer values) are reallocated to the tranches with the most paths.
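A minimal T-SQL illustration of this allocation logic is shown below; the #tranche temp table, its sample values, and the 22-CPU figure are hypothetical, and the production logic lived inside the batch code rather than in a standalone script:

```sql
-- Sketch: proportional CPU allocation across tranches (Sybase ASE T-SQL).
-- Weight per tranche = Pt * power(Dt, 1.5); each tranche first gets
-- floor(CPU * weight / total weight) CPUs, then the integer leftover is
-- handed to the tranche with the most paths.
declare @cpu int, @total_weight float, @leftover int
select  @cpu = 22                                  -- CPUs available (example)

create table #tranche (
    tranche_id int,
    paths      int,              -- Pt
    deals      int,              -- Dt
    weight     float null,
    cpus       int   null)

insert #tranche (tranche_id, paths, deals) values (1, 5000, 1200)
insert #tranche (tranche_id, paths, deals) values (2, 3000, 4000)
insert #tranche (tranche_id, paths, deals) values (3, 8000,  800)

update #tranche set weight = paths * power(convert(float, deals), 1.5)
select @total_weight = sum(weight) from #tranche

-- Initial integer allocation: Ct = floor(CPU * Pt * Dt^1.5 / sum(Pt * Dt^1.5))
update #tranche set cpus = floor(@cpu * weight / @total_weight)

-- Reallocate the CPUs lost to integer truncation to the tranche with
-- the most paths.
select @leftover = @cpu - sum(cpus) from #tranche
update #tranche set cpus = cpus + @leftover
where  paths = (select max(paths) from #tranche)

select tranche_id, paths, deals, cpus from #tranche order by tranche_id
```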
Sr. Developer/Performance tuning specialist, Confidential Feb 2007 to Mar 2008
Project: Optimization of Global Sales System (GSS) Batch Processing
Technical: Sybase 12.5.3, Power Designer 11.0, DB-Artisan 8.0.1, Rapid SQL 7.4, SQL Server 2000, T-SQL, CT-Library, Optdiag, Perl, Informatica, Solaris, Autosys, BCP, DTS, Tibco, CVS, Opsware.
Global Sales Systems (GSS) is the global source for all Institutional Sales and Private Client trade and revenue data. It leverages the inter-relationships between Trade, Customer, Salesperson, and Product and provides the base for multiple applications and portals to reside on. This project was intended to identify flaws in the logical construction of the code, which had been adversely affecting speed and concurrency and causing SLAs to be missed too frequently (other frequent causes were continually increasing feed volumes, deadlocks, and occasional late feeds from upstream systems). Optimizing the code was identified as the key tactical measure, with adoption of Informatica as an ETL tool as the more strategic measure.
Optimization of GSS batch processing improved data availability times for global users. Business benefits range from providing information about clients and profitability efficiently, which allowed better decisions about which clients to focus on, to providing a clearer measure of each salesperson's performance.
Responsibilities:
- Identified pain-point SQL statements in stored procedures and report SQL. To improve concurrency and throughput, altered the SQL statements appropriately without disturbing the integration or the purpose of the module.
- Determined unnecessary indices, dropped/restructured indexes with very poor selectivity.
- Designed and created new indexes (on temporary and permanent tables) as required.
- Redesigned transactions and implemented DOL locking for high-traffic tables to minimize blocks/deadlocks and improve concurrency (see the sketch at the end of this list). A 40% performance gain was observed on the processes involved, and 32% of disk space was recouped.
- Rewrote SARGs so the query optimizer could choose the best index-based query plans.
- Modified specific stored procedures (those producing highly volatile data) with 'with recompile' to avoid reusing a cached procedure plan.
- Recompiled procedures to use the distribution data of indexes in order to use the most optimized query plan.
- Recommended a more efficient ‘update statistics’ & weekend database maintenance plan to management and worked as a liaison with the DBA team to implement it.
- Played an instrumental role in adding 8 more CPUs to the dataserver.
- Evaluated queries' resource usage patterns (CPU-bound or I/O-bound) and implemented intra-query parallelism.
- Created 4KB buffer pools and implemented 4KB logging with sp_logiosize.
- Created data filter criteria and mappings for various transformations like 'Credit Calculation' and 'numeric conversion'.
- Analyzed database growth trends and planned for future disk, memory, and CPU needs; made recommendations to management.
- Developed an intelligent reorg tool.
- Created a batch job to load security ratings into the GSS.
- Developed self-healing scripts to minimize DBA intervention.
- Served as a key production support member and coordinated with the offshore team.
- Coordinated with SAs on a largefile-safe patch for receiving upstream feed files larger than 2 GB.
- Created data filter criteria and mappings for transformations like 'Credit Calculation' and 'numeric conversion' based on positioning, using Informatica.
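The locking and recompile changes called out above come down to statements like the following sketch (Sybase ASE T-SQL; trade_detail, proc_sales_rollup, and the column names are hypothetical):

```sql
-- Switch a high-traffic table from allpages (APL) to datarows (DOL)
-- locking to reduce page-level blocking and deadlocks.
alter table trade_detail lock datarows
go

-- Recreate a procedure that reports on highly volatile data with the
-- 'with recompile' option, so each execution builds a fresh query plan
-- instead of reusing a cached one.
create procedure proc_sales_rollup
    @run_date datetime
with recompile
as
begin
    select salesperson_id, sum(revenue) as total_revenue
    from   trade_detail
    where  trade_date = @run_date
    group  by salesperson_id
end
go
```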
Sr. Sybase ASE/IQ Developer, Confidential Mar 2005 to Dec 2006
Project: Compliance surveillance system
Technical: Sybase ASE 12.5.0.3, Sybase IQ 12.6, Rapid SQL 5.7, Sybase Central 4.1, SQL Advantage 3.0, Mantas 4.2 (Trading Compliance), Hyperion Performance Suite 8.3, Brio, SQR, Aqua studio 4.0, T-SQL, CT-Library, Sun Solaris, C++, Perl, Autosys, BCP, Visio 2003, SCCS.
The Market Surveillance Data Warehouse (MSDW) is designed as a central repository for the Firm’s order, trading, and execution data. This data is used by Compliance primarily to generate surveillance reports and alerts based on industry rules to bring potential issues to the attention of Compliance Analysts.
Responsibilities:
Sybase ASE 12.5.3 to IQ 12.6 Migration
- Involved in performing feasibility analysis and studying the existing system.
- Created tables, Indexes, procedures and views on IQ.
- Worked with DBAs to create users and groups on IQ and grant appropriate permissions.
- Wrote load scripts to migrate data from ASE to IQ and conducted performance analysis of the load (see the sketch at the end of this list).
- Rewrote the application logic and reporting queries from ASE to IQ.
- Wrote Unix Scripts for migration and scheduled jobs in Autosys.
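A representative (simplified) load statement of the kind those migration scripts generated is sketched below; the table, column, and file names are hypothetical, and the exact LOAD TABLE options depend on the IQ 12.6 configuration and the ASE export format:

```sql
-- Sybase IQ bulk load of a pipe-delimited flat file exported from ASE
-- with bcp -c. Options shown are illustrative.
LOAD TABLE trade_execution
     (exec_id, order_id, symbol, exec_qty, exec_price, exec_time)
FROM '/data/extracts/trade_execution.dat'
     ESCAPES OFF
     QUOTES OFF
     DELIMITED BY '|'
     ROW DELIMITED BY '\n'
```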
Mantas Trading Compliance
- Mapped the firm's order/trading/execution data obtained from various order and trade processing systems.
- Created Perl scripts to validate Trade and Market data.
- Configured scenarios in Mantas.
- Wrote scripts to massage data and feed it to the Sybase IQ/Oracle databases for alert generation, and automated them in the batch cycle using Autosys.
MSDW Production Support & Maintenance
- Designed tables and wrote stored procedures to set up the feed (including New, Cancelled, and Corrected orders) for Order Audit Trail System (OATS) regulatory reporting for NASD-traded securities.
- Provided nightly batch support, including monitoring, diagnosing, and resolving issues (e.g. bad feeds, file no-shows/delays, deadlocks) to ensure that the batch cycle finishes on time and SLAs are met.
- Analyzed data for business users and answered specific questions.
- Developed stored procedures for application enhancement.
- Created operational, infrastructure and data architecture worksheet for current environment.
- Participated in DR tests and performed checkouts to ensure that applications are working as expected.
- Developed a UNIX script to delete log/temporary files older than 30 days and scheduled the job through Autosys.
Sr. Developer/Analyst, Confidential Jan 2003 to Feb 2005
Project: FRAME (Financial Reporting and Management Expectations) Globalization
Technical: SQL Server 2000, Procedures, Functions, Triggers, T-SQL, Windows 2000, Power Builder 7.0, Erwin 3.2, BCP, DTS, VBScript, Visio 2000, VSS.
Financial Reporting and Management Expectations (FRAME) is the global solution for reserving at Converium. Actuaries, underwriters, accountants, and senior management use FRAME to calculate reserves on a quarterly basis and as a daily management-reporting tool to review and make adjustments to the reserves. As a Sr. Developer I was responsible for creating stored procedures, triggers, and rules for the property and casualty (P&C) application, writing complex ad hoc SQL queries, and index design/creation/tuning.
Responsibilities:
T-SQL programming
- Developed the user interface using PowerBuilder.
- Extensively developed Procedures, Functions, Triggers, Constraints, Rules and UDFs.
- Wrote triggers for the key database tables to ensure data consistency/integrity.
- Created quick ad hoc SQL queries and turned the results over to users using T-SQL.
- Designed and created indexes, including covering indexes for optimization.
- Created/maintained scripts to automatically drop indexes, perform data loads, and recreate indexes after the load completes, to enable fast bcp for volume-sensitive feeds (see the sketch at the end of this list).
- Used Excel extensively for parsing data.
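Conceptually, the load wrapper for a volume-sensitive feed does what the sketch below shows (T-SQL; the reserve_feed table, its indexes, and the bcp command line are hypothetical placeholders, with the bcp step itself running outside the SQL session):

```sql
-- Before the load: drop nonclustered indexes so bcp can run in fast
-- (minimally logged) mode against the target table.
drop index reserve_feed.idx_reserve_feed_lob
drop index reserve_feed.idx_reserve_feed_period
go

-- ...external step, e.g.:
-- bcp framedb..reserve_feed in reserve_feed.dat -c -t"|" -Uloader -Sserver

-- After the load: recreate the indexes and refresh statistics so the
-- optimizer sees the new data distribution.
create nonclustered index idx_reserve_feed_lob    on reserve_feed (line_of_business)
create nonclustered index idx_reserve_feed_period on reserve_feed (reporting_period)
update statistics reserve_feed
go
```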
Developer, Confidential Mar 2001 to Dec 2002
Project: Systematic Provider Identification and Electronic Repricing (SPIDER)
Technical: Sybase 12.0, Sybase Central, SQL Server 2000, Query analyzer, SQL-EM, Procedures, Functions, Triggers, DTS, T-SQL, Rapid SQL, DB-Library, VBScript, VBA, BCP, Windows 2000, Sun Solaris 7, C++, Power Builder 7.0, Erwin 3.2, Crystal reports, PVCS.
SPIDER is a mission-critical system developed specifically to assist with claims adjudication and payment for defined contribution plans in the field of consumer-driven health care. Finance managers, customer service advocates, enrollment representatives, and claim processors use SPIDER for claims adjudication and payment. As a Developer, I was responsible for developing extensive stored procedures for the HIPAA implementation, DTS packages, data consolidation, performance tuning, index design/creation/tuning, and maintenance and monitoring of mission-critical applications.
Responsibilities:
Programming/Production Support
- Developed and scheduled DTS packages for critical modules like Claim loader, Claim repricer, Positive pay, Claim import and Claim export.
- Developed extensive stored procedures for critical modules like Claim Waterfall, Claim Payment, positive pay and claim adjustment.
- Provided day-to-day database support including monitoring, troubleshooting and resolving deadlock issues.
- Developed Functional/Technical Documentation.
Developer/DBA, Confidential Aug 1999 to Feb 2001
Projects: Integrated Plant System (IPS) / Manufacturing Cost Control (MCC)
Technical: Sybase 11.5.1/Sybase 11, HP-UX 10.0, T-SQL, C, APT 5.2, RPT 5.2, Perl.
Responsibilities:
Programming/Production Support
- Closely worked with business/users/chemists for requirement gathering.
- Developed various modules like monthly production plan, quality control and product dispatch.
- Created databases and users. Migrated the Integrated Plant System (IPS) from Sybase 11.0 to Sybase 11.5.1.
- Created scripts for automated Database Health Check (DHC) using DBCC.
- Designed and created indexes for improving performance.
- Used sp_configure to optimize server parameters that consume substantial memory.
- Developed stored procedures for handling database 'logsegment' threshold events (see the sketch at the end of this list).
- Created ad hoc SQL queries and provided output to the users.
- Wrote scripts extensively for data parsing using sed & awk.
- A key production support member in a fast paced environment.
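Sybase ASE calls a procedure named sp_thresholdaction whenever a segment threshold is crossed; a minimal sketch of a log-segment handler along the lines described above is shown below (the dump device name is hypothetical):

```sql
-- Minimal last-chance threshold handler (Sybase ASE). ASE passes these
-- four parameters whenever a threshold is crossed.
create procedure sp_thresholdaction
    @dbname      varchar(30),
    @segmentname varchar(30),
    @space_left  int,
    @status      int
as
begin
    declare @free varchar(12)
    select  @free = convert(varchar(12), @space_left)

    if @segmentname = 'logsegment'
    begin
        -- Free log space so suspended transactions can resume.
        dump transaction @dbname to logdump_device
    end

    -- Leave an audit trail of every threshold event in the errorlog.
    print 'Threshold fired: db %1!, segment %2!, free pages %3!',
          @dbname, @segmentname, @free
end
```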
Performance Tuning
- Created performance tuning baseline spreadsheet for mission critical stored procedures.
- Invoked sp_sysmon to probe & monitor system performance under normal and load conditions.
- Sampled and analyzed sp_sysmon output at regular intervals and tuned various server parameters to maximize performance.
- Identified tables needing DOL schema with sp_object_stats and implemented schema changes for better concurrency.
- Used a larger packet size (8192 bytes) for large bulk-copy operations and configured parameters such as "max network packet size" and "additional network memory" (see the sketch after this list).
- Minimized tempdb contention.
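Those two network parameters are set through sp_configure, along the lines of the sketch below (the "additional network memory" value is illustrative and would be sized for the actual number of large-packet connections):

```sql
-- Allow sessions (e.g. bcp started with -A 8192) to request packets of
-- up to 8192 bytes.
sp_configure "max network packet size", 8192
go

-- Reserve extra memory for the larger packets; illustrative value.
sp_configure "additional network memory", 1048576
go
```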
T-SQL Tuning
- Flattened subqueries into normal joins (an example follows this list); promoted the 'update in place' design wherever possible.
- Created indexes and dropped/restructured indexes with poor selectivity based on the index usage report.
- Forced indexes, bypassing the query optimizer's choice, in certain specific conditions.
- Modified specific stored procedures (producing highly volatile data) with ‘with recompile’.
- Minimized deadlocks by redesigning transactions, and spinlock contention by creating named caches.
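As an illustration of the subquery-to-join rewrite mentioned above (hypothetical orders/customer tables, with cust_id assumed unique in customer so the join returns the same rows as the IN subquery):

```sql
-- Original form: IN subquery.
select o.order_no, o.order_amt
from   orders o
where  o.cust_id in (select c.cust_id
                     from   customer c
                     where  c.region = 'APAC')

-- Flattened form: the same result expressed as a normal join, which the
-- optimizer can drive from either table using the available indexes.
select o.order_no, o.order_amt
from   orders o, customer c
where  o.cust_id = c.cust_id
and    c.region  = 'APAC'
```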