
Sr. Developer(sybase/unix/perl) Resume


Summary

  • Sr. Sybase/Perl/ETL Developer with 7.5 years of strong experience in architecture, design & development of DB projects and maintenance & enhancement of applications, with strong analytical & development skills, seeking assignments in Software Development & Leadership in the IT sector.
  • Experience in Fixed Income Securities involving Finance & Banking, Credit Rating & Risk Analysis. Experience in Investment Banking Domain.
  • Programming experience in Sybase (ASE), T-SQL & ANSI SQL, Shell Scripting, Perl Scripting.
  • Experienced in creating Stored Procedures, Triggers, Indexes, Views, Tables, and Schemas for large database systems.
  • Extensive experience in DB tuning of high-data-volume applications (Procedures, System Design, Triggers, Replication, Deadlock & Cache Tuning).
  • Experience in Sybase Data Modelling.
  • Programmed extensively in Perl to handle automation of feed loads, data consistency fixes, data extraction & loading, and client reporting, using DBI/DBD modules & writing various subroutines.
  • Conceptual knowledge of object-oriented programming using Perl. Ability to develop OO programs.
  • Experience in Control-M, crontab & Autosys.
  • Experience in performing Data Analysis & Data Migration.
  • Experience in Extract, Transform and Load (ETL) of data using Informatica for various sources: XML, flat files, relational databases. Experience in tuning the transform and load process.
  • Experience in developing transformations and mappings and in creating & scheduling workflows using Informatica. Experience in using Joiner, Update Strategy, Sorter, Router, Lookup & other transformations.
  • Experience with Informatica Administrator Console: Repository & Integration Service creation & user permission/role assignments.
  • Experience in SQL Server development and unit testing.
  • Basic knowledge of Oracle Database Server environment.
  • Quick learner and a good team player with the ability to prioritize tasks and meet deadlines.
  • Flexibility, strong analytical skills, hard work & fast learning are key strengths.

Technical Skills

Programming Languages : T-SQL, ANSI-SQL, UNIX Shell Script (K shell), C, Perl Scripts
Database Servers : Sybase 12.x/15.x, MS SQL Server 2000/2005, MS Access, Informatica 8.6.1 (ETL), Oracle
Processes/Job Technology : DTS, Control M, XML, HTML, FTP, SFTP, FTPS, SSH
Operating Systems : UNIX (IBM AIX 5.1, 6.1), Windows XP/2000/VISTA.
Office Applications & Tools : MS Office, Visio, crontab, vi editor, Core FTP, SSH, BCP, DB Artisan, Rapid SQL, PuTTY, Exceed, CA ERwin, Microsoft Source Integrity, Hummingbird FTP, SQL Developer

Project Details

1. Confidential (Chicago -USA)

Domain: Investment Banking

Client: Confidential

Role: Sr. Developer (Sybase/Unix/Perl)

Duration: Jun 2012 – Dec 2012

Project Description:

SPYDA - Dodd-Frank Jun 2012 – Dec 2012
The Spyda application generates valuation trading statements & sends them to various clients. The Dodd-Frank rule had to be implemented, and due to a significant increase in statement volumes the system needed to be made scalable to handle those volumes. Data is fed to the Spyda system through internal UBS feeds, loaded into the database by Unix/Perl scripts, and transformed using DB procedures.

Responsibilities:

  • Studied the business requirements in detail for the Dodd-Frank initiative and implemented them.
  • Understood the flow of the Spyda application with the help of the existing development team.
  • Created the high-level design and development document to implement the change.
  • Implemented the change by creating various table schemas, constraints, indexes, and keys, and developed & modified stored procedures and triggers, working with the Dev team. Performed code reviews as a senior member of the team.
  • Made the Spyda system scalable to handle 500-600% more volume in less time by tuning the required areas. This involved rewriting stored procedure code, design changes, server cache tuning, and RAM addition.
  • Used UBS internal DB tools such as DB Watch, DPRS DB etc.
  • Involved in data analysis & creation of Perl scripts for timely purging of data.
  • Created DB object deployment scripts using Perl scripting.
  • Created ad hoc jobs for update statistics and reorg rebuild using Perl scripts & various Perl modules like DBI/DBD.
  • Validated test results after repetitive feed loads by writing automated Perl scripts & subroutines.
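The ad hoc update statistics / reorg rebuild jobs described above could be sketched as a small shell wrapper that generates the maintenance SQL for a list of tables. This is a minimal sketch, not the actual project code: the table names and the commented-out isql invocation are assumptions.

```shell
#!/bin/ksh
# Sketch: generate nightly Sybase maintenance SQL for a set of tables.
# TABLES and the isql call below are hypothetical placeholders.

TABLES="trade_stmt client_feed stmt_volume"

gen_maint_sql() {
    # Emit an "update statistics" and a "reorg rebuild" batch per table.
    for t in $1; do
        printf 'update statistics %s\ngo\n' "$t"
        printf 'reorg rebuild %s\ngo\n' "$t"
    done
}

# In the real job the SQL would be piped to isql, e.g.:
#   gen_maint_sql "$TABLES" | isql -S"$SERVER" -U"$USER" -P"$PASS"
gen_maint_sql "$TABLES" > maint.sql
```

In the actual jobs the generated SQL would be issued through isql or Perl DBI under a Control-M or cron schedule.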

Environment : Sybase 15.0, Unix, Perl Scripting, DB Artisan, Beyond Compare, puttytel, DB watch, DPRS

2. Confidential (New York-USA)

Domain: Fixed Income Securities, Credit Rating & Risk Analysis

Client: Confidential

Role: Sr. Developer (Sybase/Unix/Perl/ETL)

Duration: July 2005 – May 2012

Project Description:

Confidential Investors Service is among the world’s most respected, widely utilized sources for credit ratings, research and risk analysis for Fixed Income Securities. The firm publishes market-leading credit opinions, deal research (ABS/RMBS deals), performance data and commentary that reach more than 5,000 institutions and 30,000 subscribers around the globe. It deals with Public, Corporate and Structured Securities.

Rating/Performance Delivery Services (PDS) Sept 2010 – May 2012
The PDS application was developed to provide business functionality for displaying performance data for structured finance fixed income securities & sending rating & data alerts. Performance data is loaded into the database using ETL technology, applying various business validations in the transformations. The source for the data was relational databases - the upstream structured finance system.

Responsibilities:

  • Enhanced, developed & supported the PDS ETL load system and UI DB components and Perl jobs.
  • Built the data model for a few parts of the application used on the UI side as part of the enhancements project.
  • Developed views, triggers, stored procedures, tables, rules, defaults, constraints, keys.
  • Developed lookup table mechanisms to store field mappings at pool, tranche & deal level.
  • Worked with Update Strategy, Joiner, Lookup, Router, Sorter, Aggregator, Expression, Stored Procedure, Filter and other types of transformations to create mappings for transforming PDS data.
  • Created workflows and scheduled them.
  • Worked with Administrator Console to create Integration Service & Repositories for Development & Test servers. Assigned Permissions & roles to users.
  • Developed/modified many procedures to populate the SQL server from Sybase as per business needs.
  • Developed performance data alert & rating alert mechanism using stored procs and control M jobs.
  • Worked on dynamic SQL procedures for the advanced search mechanism for PDS deals.
  • Created and modified replication definitions for replication server.
  • Analysed production issues and implemented fixes using various methods.
  • Wrote Perl, Shell scripts for generating various reports & adhoc data fix jobs.
  • Worked on improving replication performance & trigger performance. Correcting rep definitions.
  • Created DB reindexing and data purging jobs using Perl scripts, taking care of replication issues.
  • Performed index analysis using MDA tables and removed unused indexes where required.
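The dynamic SQL search procedures mentioned above build a WHERE clause only from the criteria a user actually supplies. That clause-composition idea can be sketched in shell; the table and column names here are illustrative assumptions, not the actual PDS schema, and the real procedures would compose and run the SQL in T-SQL via exec().

```shell
#!/bin/ksh
# Sketch: compose a dynamic search query from optional filters.
# The deals table and its columns are hypothetical examples.

build_search_sql() {
    # $1=deal name pattern, $2=region, $3=product type (any may be empty)
    where="1=1"
    [ -n "$1" ] && where="$where and deal_name like '%$1%'"
    [ -n "$2" ] && where="$where and region = '$2'"
    [ -n "$3" ] && where="$where and product_type = '$3'"
    echo "select deal_id, deal_name from deals where $where"
}

# Only the supplied criteria appear in the generated query.
build_search_sql "CDO" "EMEA" ""
```

A production version would also guard against SQL injection (e.g. by parameterizing through DBI placeholders) rather than interpolating raw user input.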

Confidential: Jan 2010 – Aug 2010
Moody’s had to implement certain compliance and regulatory indicators in the corporate finance applications. The project was to introduce these indicators into the current application.

Responsibilities:

  • Introduced the indicators into the system by changing the existing table schemas and developing stored procedures, triggers, and lookup tables to store indicator information.
  • Created many stored procedures, tables, triggers.
  • Developed the roll-up indicator logic for roll-up ratings at the fixed income rating class level.
  • Developed various business validation logic based on the requirements for the indicators to be disseminated to downstream applications.
  • Replicated the table on other server by checking dependencies & downstream impacts.
  • Developed the toggle OFF/toggle ON mechanism to make all the indicators on and off based on the business decision to publish them.
  • Handled bulk data uploads of millions of records for indicator data.
  • Wrote data verification scripts using Perl to ensure correct data upload.
  • Tested and debugged the code. Created test request documents for QA.

Confidential Apr 2009 – Dec 2009 
Confidential Advanced Search is the search functionality for Collateralized Debt Obligation deals based on deal name, region, product type, obligor, guarantor, issuer and many more such criteria, including performance field values.

Responsibilities:

  • Built the logical search design to provide optimized performance for the user, since the significant volume of underlying data was the major issue.
  • Analyzed the existing system design and came up with new design model which could deliver better performance.
  • Designed the new data model for search application based on the logical design.
  • Created work tables, views, stored procedures, triggers to meet business & system requirements.
  • Performed unit testing & created test documents and test scripts/procedures for carrying out tests for the testing teams at various stages in the SDLC.

Confidential: Jul 2008 – Mar 2009
Confidential is the data feed application used to load data into the database by accessing an external website and pulling the data in XML format. This XML data is populated into the databases through Informatica transformations and mappings.

Responsibilities:

  • Gathered user/business requirements.
  • Created many tables and stored procedures. Modified a few stored procs based on user requirements.
  • Created/modified various ETL transformations and mappings to load the xml data into work tables and then apply business logic to load it into the core tables.
  • Worked with Update Strategy, Joiner, Lookup, Router, Sorter, Aggregator, Expression, Stored Procedure, Filter and other types of transformations to create mappings.
  • Created various workflows and scheduled them.
  • Worked with Administrator Console.
  • Created Perl scripts to load some ad hoc files into the database.
  • Created test documents for the testing teams. Maintained and reviewed process documents involved in the change requests.
  • Identified the bottlenecks of the application and fixed those areas.

Data Module Monitoring System & Automation of Datamodule Data Processing: Jan 2008 – Jun 2008
Datamodule is a Unix/C++ application which transforms data from the source databases to the target by applying various business logic. DDMS is the application built for the support team to monitor the datamodule.

Responsibilities:

  • Created high level design document for the application.
  • Wrote various Perl scripts and subroutines using the DBI/DBD module to connect to the DB server and the datamodule application to get thread status, sync batch status, and various other things required for monitoring.
  • Wrote Unix shell scripts using awk, sed, cut and many other commands to format data.
  • Created automation scripts using Perl to take user input for batch processing of data through the datamodule. Wrote various Perl subroutines for automation of sync batch data.
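The awk/sed/cut data formatting mentioned above might look like this minimal sketch; the pipe-delimited thread-status record layout is an assumption for illustration only.

```shell
#!/bin/ksh
# Sketch: reformat a pipe-delimited status feed for a monitoring report.
# The field layout (thread|status|rows) is hypothetical.

format_report() {
    # Trim spaces before delimiters, keep the first two fields,
    # and print them as aligned columns.
    sed 's/ *|/|/g' | cut -d'|' -f1,2 | awk -F'|' '{ printf "%-10s %s\n", $1, $2 }'
}

printf 'loader01 |RUNNING|120\nloader02 |DONE|45\n' | format_report
```

Chaining small filters like this keeps each step testable on its own, which is why it suits ad hoc monitoring scripts.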

Bridge - Erebridge: July 2005 – Dec 2008
The Bridge and Erebridge applications are the data marts for many Moody's downstream applications, where business-level logic is derived and data is stored in derived format. They are a source for moodys.com.

Responsibilities:

  • Worked on Enhancements & maintenance of project. Analyzed the production issues and implemented the fixes.
  • Created and modified stored procedures, triggers, tables, and views for maintenance and enhancement of the application based on user requirements.
  • Wrote various UNIX shell scripts for generating various reports. Wrote various Perl scripts & converted them to Control-M jobs to update specific kinds of data that could not be changed through the DB system. Wrote various data transformation Unix/Perl jobs using various Perl modules and subroutines.
  • Maintained and reviewed process documents involved in the change requests.
  • Identified the bottlenecks of the application, tuned those areas, and improved application data concurrency by avoiding deadlock situations.
  • Performed performance tuning of the application and DR tests for the application.

Environment: Sybase 12.0, 12.5, Perl, Perl Scripting, Unix K shell scripting, SQL Server 2000, ERwin, Rapid SQL, Microsoft Source Integrity, BMC Remedy (OPAS), Beyond Compare, puttytel, Hummingbird FTP, DB Artisan, telnet, vi editor, Data Transformation Services (DTS), Informatica PowerCenter 8.6.1, Apache Tomcat.

Educational Qualification:

  • Bachelor of Engineering (Electronics)
