- Performed analysis, design, development, implementation and testing of business application systems in Investment Banking and Manufacturing domains
- 16 years of IT experience in requirement analysis, design and development of database programs on UNIX (HP-UX and Solaris) and Windows platforms, using Waterfall and Agile methodologies
- 16 years of strong development experience in Oracle PL/SQL - stored procedures, functions, triggers, cursors, packages, collections and exception handling - in OLTP and data warehouse environments
- 9.5 years of scripting experience using KornShell (ksh) and Perl.
- Experienced with shell utilities such as sed, awk, grep, egrep and fgrep; good understanding of public-key cryptography.
- Hands-on experience writing regular expressions (regex), connecting to Sybase and Oracle databases via the Perl DBI module, and structuring code with subroutines/functions.
- 16 years of experience with SQL Navigator, TOAD, SQL Developer, Rapid SQL, SQL*Plus, SQL*Loader, vi, PuTTY, sed, awk and cURL.
- 3 years of hands-on experience developing Informatica workflows and transformations.
- Basic understanding of IBM WebSphere MQ and MQ objects on the Solaris SPARC platform.
- Created project documents such as software requirement specifications, low-level designs, unit and system test plans, coding and review checklists, and coding standards.
- 1.5 years of experience with SmartStream TLM and RIMME for FOBO T0/T+1 reconciliation, with extensive knowledge of the backend static and dynamic tables pertaining to TLM.
- Performed feed loads into TLM and matching initiation tasks.
- Fulfilled user queries on open-break investigation, user audit trails, reconciled data extraction and filter-in/filter-out investigation.
Languages: Oracle PL/SQL, Oracle SQL and Sybase SQL
Scripting Languages: Shell (ksh, bash) and Perl
Domain: Finance (Investment Banking, Retirement Solutions) and Manufacturing
Web Technologies: HTML
Operating Systems: Red Hat Enterprise Linux (RHEL), Solaris SPARC, HP-UX, Windows
Databases: Oracle 12c, 11g, 10g and 9i, Sybase ASE 15, Sybase IQ
Protocols: SMTP, SFTP, FTP, FTPS, NDM
Database Tools: SQL*Plus, SQL Navigator, SQL Developer, TOAD, SQL*Loader, LSNRCTL, Rapid SQL, Interactive SQL
UNIX/LINUX: awk, sed, cURL
Job Scheduling: Autosys and cron
Event Monitoring: RightIT/RiverMuse and Geneos Netprobe
Reconciliation: SmartStream TLM and RIMME (Rapid In-Memory Matching Engine)
Deployment: CA Technologies Automic
ETL: Informatica Power Center, Talend
Agile: Broadcom Rally; basic understanding of web services
Standards: Public-Key Encryption
Confidential, Washington, DC
Database Developer (Consultant)
- Performing system analysis, data mapping, data analysis and programming using a variety of tools and procedures.
- Analyzing and mining business data to identify patterns and correlations among the various data points.
- Troubleshooting ETL load failures in production database by analyzing the data between the source and the target systems.
- Creating and maintaining materialized views, packages and stand-alone procedures/functions in order to process data between OLTP and OLAP databases.
- Developing PL/SQL procedures to load data into data warehouse fact tables and writing ad hoc SQL queries.
- Peer-testing Talend ETL workflows to document data load and transformation behavior into the data warehouse dimension tables (both Type 1 and Type 2 Slowly Changing Dimensions).
- Understanding and translating business needs into data models that support long-term solutions.
- Mapping and tracing data from system to system in order to solve a given business or system problem.
- Implementing corporate data warehousing activities: programming and configuring warehouses of database information and providing support to warehouse users.
Environment: Oracle 12c, Talend 7, SQL*Plus, SQL Developer, Broadcom Rally
Confidential, New York City, NY
Operations IT Consultant
- Providing technical support for two customized reconciliation platforms - TLM and RIMME. Extensively worked on recalling and reloading feeds in TLM when an erroneous feed was supplied by the source system. Performed manual matching initiation upon user request to correct reconciliations in Cash, Trade and Position instances for a given statement date.
- Extensive technical support to Operations users in India and the US on queries pertaining to reconciled data extraction requests, investigation of false breaks, finding historic matched data and investigating missing statement/ledger-side entries. Proficient in querying backend tables such as bank, item, message header, message feed, agnt, fils, vque, workflow queue and workflow queue log. Proficient with the home-grown "gg" utility, which displays feed and rec details at the Linux command prompt.
- Maintained the underlying shell and Perl scripts used in the pre-process, load and post-process jobs as part of the feed load.
- Handling Autosys batch jobs for post-reconciliation reporting on trades, positions and cash. Investigating job failures by sifting through the corresponding shell script and its log file, and ETL failures by checking the workflow and session logs.
- Upon identifying the root cause, provided a resolution or escalated the issue to the level 3 team along with the initial technical analysis.
- Providing technical analysis and resolution of business issues raised by middle-office users of the Global Middle Office system (SMO), including trade flow, incorrect dirty prices on bonds, static data (books and accounts), trade exception handling and trade ack/nack issues.
- Working on user queries pertaining to message flow issues between the SMO and GLOSS in-house settlement systems.
- Working extensively with vendor Broadridge on issues and requests pertaining to both Equities and Fixed Income products. Extensive knowledge of the real-time flow between Broadridge and Confidential for Equities. Maintained the Transaction Service Bus (TSB) interface and NDM interface on the Confidential side for outgoing transactions to Broadridge.
- Working on user queries pertaining to the BPS/BPSA systems in Broadridge. Extensive knowledge of BPSA real-time and batch tables to support user queries, including data extraction and trade flow issues between Broadridge and Confidential. Extracting archived BPSA data by querying the in-house data warehouse system called CODD.
- Investigating issues reported by downstream systems in case of missing trade/position data by checking ETL workflow mappings in the staging, ITF and DWH layers of the CODD system.
- Working extensively with Broadridge on the IMPACT platform for Fixed Income products. Handling Cognos reporting after batch completion and manually toggling Cognos reports on/off based on business events (Factor's Night batch delays, triparty repo delays).
- Handling incident management and problem management processes to support production issues (Severity 1-4 incidents). Working closely with vendor Broadridge on root cause analysis and incident resolution for both Equities and Fixed Income.
- Handling code deployment via the CA Technologies Automic tool, using sail, promote and anchor for deployments pertaining to SMO and GLOSS.
- Extensive knowledge of setting up monitoring events via the RightIT/RMS monitoring tool. Basic experience using Splunk to retrieve system metrics.
- Working closely with teams in the UK and Japan, taking part in daily hand-over calls to take over/hand off ongoing incidents and urgent user queries as part of the "Follow the Sun" model.
Environment: Oracle 12c, Sybase ASE 15 and Sybase IQ 15, Bash and ksh on RHEL 6, Citrix, Solaris SPARC 10, Informatica 10
Confidential, Jersey City, NJ
Programmer Analyst (Oracle PL/SQL Lead)
- Develop various parser programs in shell and Perl to parse transaction and position reports and load them into the database via SQL*Loader.
- Perform technical review of PL/SQL objects and shell wrapper scripts.
- Develop/maintain workflow and transformation using Informatica PowerCenter to load US equity data from Broadridge to the internal data warehouse.
- Develop shell scripts that invoke Oracle stored procedures, functions, packages and complex SELECT statements via the SQL*Plus utility to extract US equity data from Broadridge's BPSA tables.
- Develop and maintain UNIX shell programs for OATS (Order Audit Trail System) reporting to FINRA.
- Develop and maintain Perl programs that parse various file formats (TRAC reports from Broadridge, SIAC files) using Perl regex and load them into Oracle tables via the DBI module.
- Establish connectivity between systems using public-key encryption (SSH and SFTP) and develop shell scripts for automated file transfer.
- Perform SQL query tuning: assess slow-running SQL by reviewing AWR reports and recommend SQL profiles and plan baselines to help the cost-based optimizer (CBO) pick a cost-effective plan.
- Use SQL Trace along with EXPLAIN PLAN to troubleshoot slow-running SQL.
- Provide support for analysis and root cause investigation during production incidents.
- Provide release instructions and work with production support staff during deployment.
Environment: Oracle 188.8.131.52, PL/SQL, KornShell and Perl on Solaris SPARC 10
Oracle PL/SQL Lead
- Develop PL/SQL programs to capture fails and settlement data as part of SEC Rule 15c3-5 reporting.
- Analyze, design and migrate all FTP-based shell and Perl programs to SFTP-based connections.
- Develop shell scripts to process text files as part of Dodd-Frank regulatory program.
- Develop regular-expression-based pattern matching to parse mainframe-based TRAC reports from BPS (Broadridge).
- Develop reports to display positions, dividends, book-keeping, journals, account history and transaction data based on Broadridge’s BPSA tables.
- Carry out code reviews of shell and Perl scripts, PL/SQL programs, and SQL joins and subqueries.
- Develop a Perl program for the OATS process - file submission and parsing of status files returned from FINRA via Perl regex.
- Provide level 3 support for resolving incidents involving PL/SQL programs, UNIX shell and Perl scripts.
Environment: Oracle 184.108.40.206, PL/SQL, KornShell and Perl on Solaris SPARC 10
Senior Oracle PL/SQL Developer
- Enhance PL/SQL stored procedures, functions and packages to customize product and sales order features.
- Build SQL queries using joins and sub-queries to assist business users in ad-hoc reporting of the product store.
- Analyze and fix data issues with downstream systems.
- Responsible for requirement gathering for the back-end enhancements.
- Support production issues by analyzing and fixing SQL query issues.
Environment: Oracle 10g on HP-UX, PL/SQL
Oracle PL/SQL Developer
- Responsible for converting procedures and functions from InterBase to Oracle 9i as part of the Oracle migration.
- Responsible for mapping datatype compatibility between InterBase and Oracle.
- Perform unit testing of migrated database objects.
- Introduce automation by creating shell scripts to handle daily, weekly and monthly reporting requirements for the worksite manager.
- Responsible for requirement gathering for database enhancements following the migration to Oracle.
- Provide post-production support to remediate SQL query performance issues.
Environment: Oracle 9i, PL/SQL, KornShell on Solaris