Senior Java Developer / Senior Consultant / Backend Developer Resume
Newport Beach, CA
SUMMARY:
- Over 20 years of extensive experience in the design, development, and implementation of financial, telecommunications, retail, and scientific systems in UNIX, C/C++, Perl, Java, and web technologies.
- Developed applications against relational/SQL database systems such as Sybase, Informix, Oracle, and DB2, as well as proprietary financial historical end-of-day time series and intraday real-time tick databases; recently built C/C++ APIs to access in-memory KDB/Q databases.
- Proficient in UNIX/OS internals; have worked with local area networks and distributed processing: TCP/UDP/IP socket programming, IPC (inter-process communication), shared memory, message queues, and semaphores.
- Implemented temporal extensions to the SQL language, adding a time dimension to the rows/columns of the relational model so that time-dependent data can be queried, using compiler construction tools such as YACC/LEX/FLEX/BISON.
- Familiar with vector/functional programming languages, namely Scala and KDB/Q, for developing applications.
- Worked on real-time market data applications involving distributed processing; built multi-threaded APIs to collect data from real-time feeds derived from source feeds such as PUB/SUB tick networks, NYSE/TAQ, and other tick data sources.
- Worked with high-frequency (intraday real-time) data at low latency (milliseconds and microseconds), achieved by providing interfaces via Sybase, Informix (TimeSeries DataBlade), LIM, KDB/Q, FAME, and a proprietary time series database for retrieving market data from a customized ticker plant, as well as with low-frequency (end-of-day) implementations.
- Most recently involved in software configuration management, change management, and POS release management using Subversion/TeamForge/AnthillPro as well as HP ALM (HP Application Lifecycle Management).
- Used configuration management tools such as RTC (Rational Team Concert), ClearCase, ClearQuest, Perforce, and CVS.
- Used various design patterns, namely MVC, Observer (PUB/SUB), Client/Server, Producer/Consumer, and Master/Slave, and implemented a multi-threaded server to distribute pricing data to business clients.
- Worked on projects involving Regulatory and Compliance procedures in a financial environment.
- Worked on IBM HTTP Server 8.5 (Apache-based webserver) and developed an Apache Module to parse Object Configuration Files and invoke the Application Callback Service routines using rules-based patterns such as URL pattern configurations specified in the Object Configuration file.
- Integrated HTTP Server into IBM WebSphere 7.0/8.5 to process the Servlet/JSP associated with Single Sign On applications.
- Integrated OpenSSL package to establish SSL-based encryption/decryption between web clients and a web server.
- Implemented proprietary systems to create and distribute several terabytes of pricing data comprising daily and real-time feeds: end-of-day pricing time series for equity markets and equity derivatives, and intraday time series for exchange real-time feeds such as RDMS, NYSE/TAQ, CBOT options/futures, OPRA, and LSE.
- Provided C/C++/Java APIs for pricing analytics: TWAP and VWAP computation, standard deviation, moving averages, and volatility calculations (a minimal VWAP sketch appears at the end of this summary).
- Experienced with large database management systems, namely Sybase, Informix, and Oracle, and with grid computing (MapReduce, etc.).
- Familiar with Hadoop distributed data technology, which uses MapReduce functionality to distribute workload across a clustered node environment.
- Varied experience writing scripts and build tools in UNIX/Linux to build and deploy software packages, namely sh/ksh/csh/bash, Make/gmake, sed/awk, and Perl, as well as more modern tools such as Ant, AnthillPro, and Maven.
- Used SOAP- and REST-based services with XML/SOAP and Apache HTTP/REST API frameworks.
- Used IBM MQ Series to establish message communication in Oracle Tuxedo distributed transaction environments.
- Used Python scripting; have a good grasp of data structures such as sets, lists, and dictionaries, of threading, and of list comprehensions and generator expressions.
- Also focused on NoSQL databases and Python bindings to them, namely MongoDB (collection hierarchies) and Cassandra (column-based framework).
- Used the Spring Framework with Java 1.8, including Spring Transaction Management, Hibernate, and JPA.
- Handled application configuration and control data associated with name and data nodes in a Hadoop cluster environment, maintaining and controlling the configuration in an Oracle database.
- Also worked with JUnit frameworks, including mocking objects to test component interfaces in Java.
- Monitored Netcool/OMNIbus alerts related to the applications to keep them running healthily.
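The VWAP and moving-average analytics mentioned above reduce to simple aggregations over trade ticks. A minimal Java sketch of the idea (the Trade type and method names are illustrative stand-ins, not the original C/C++/Java APIs):

```java
import java.util.List;

/** Minimal pricing-analytics sketch (illustrative only): VWAP and a simple moving average. */
public final class PricingAnalytics {

    /** One trade tick: price and traded size. */
    public static final class Trade {
        final double price;
        final long size;
        public Trade(double price, long size) { this.price = price; this.size = size; }
    }

    /** Volume-weighted average price: sum(price * size) / sum(size). */
    public static double vwap(List<Trade> trades) {
        double notional = 0.0;
        long volume = 0L;
        for (Trade t : trades) {
            notional += t.price * t.size;
            volume += t.size;
        }
        return volume == 0 ? Double.NaN : notional / volume;
    }

    /** Simple moving average over the last `window` trade prices. */
    public static double movingAverage(List<Trade> trades, int window) {
        int from = Math.max(0, trades.size() - window);
        if (from == trades.size()) return Double.NaN;
        double sum = 0.0;
        for (Trade t : trades.subList(from, trades.size())) {
            sum += t.price;
        }
        return sum / (trades.size() - from);
    }
}
```

TWAP follows the same shape, weighting each price by its time in force rather than by traded size.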
TECHNICAL SKILLS:
Languages: Java, C/C++, PYTHON, SQL, JSON, XML, Visual Age, OS 4690.
Technologies/Tools: UDP/TCP/IP, Multicasting, Sockets Programming, POSIX Threads, Solaris Threads, Ethernet, FTP/ NFS, TELNET, SH/CSH/KSH, PYTHON, Perl, SED/AWK, PERFORCE, CVS, HPALM, YACC/LEX/BISON/FLEX, JSP, JDBC, Servlets, XML (DTDS), XPATH, XPOINTER, XSLT, XML, ANT, ECLIPSE, Message Queues, SEMAPHORES, Shared Memory, HTML, JavaScript, HPQC, ROGUEWAVE/TOOLS++, IBM HTTP Web Server, IBM WebSphere App Server, Rest API, HTTP Protocol, Hibernate, Spring, SCALA, NodeJS, AngularJS, React, Redis, Swagger, GitHub, Docker, YAML, RAML, Knex
Database APIs: PostgreSQL, MySQL, Informix TimeSeries Real-Time Loader, DB2, KDB/Q, OneTick, Oracle, Oracle Call Interface, Sybase DB-Library/CT-Library, Sybase Open Server Library, database modelling based on E/R concepts and techniques.
Operating Systems: SunOS, Solaris, UNIX System V, Linux (Red Hat, openSUSE), Windows OS, OS 4690.
PROFESSIONAL EXPERIENCE:
Confidential, Newport Beach, CA
Senior Java Developer/Senior Consultant/BackEnd Developer
Responsibilities:- Worked on fixed income fund transfers using the Python Flask module, which provides microservices capabilities in a Flask RESTful framework. The major functionality transfers fixed income fund complexes from source accounts to destination accounts in a highly scalable, high-performance architecture, with an Oracle database providing the transaction environment, on virtual computing infrastructure running the Red Hat Linux operating system.
- Designed and developed web API based software modules in RESTful frameworks to communicate with various banking applications.
- Used configuration management tools such as SVN, artifact-based change management (TeamForge), Eclipse IDE (Neon), Maven 3.3.9, Ant, Gradle, and remote debugging on Linux to perform complete SDLC operations, meeting requirements and fulfilling day-to-day needs.
- Worked on Straight-Through Processing (STP) to eliminate human intervention and minimize the settlement time of financial security transactions such as fund allocation/distribution of fixed income securities, and pushed the data into Oracle 11g databases using Java 8, JDBC templates, and XML configuration files.
- Migrated the Sybase table/Transact-SQL implementations of STP to the counterparty's Oracle/PL-SQL 11g infrastructure using Java 8, Spring Framework 4.1.9, and multithreading/multiprocessing concurrency features. Heavily used the Spring Framework, defining and using Spring transaction management, Spring beans, and JdbcTemplates to persist the data in the Oracle database (a minimal JdbcTemplate sketch follows this list).
- Worked on a Trade Flow daemon implemented in Java 8. The Trade Flow daemon is an ETL process that extracts, transforms, and loads Bloomberg (BBG) feed data: it connects to the Bloomberg API feeds, collects information from TCP/UDP-based networks, creates XML documents from the incoming ASCII text records, and persists them into the Oracle database for further stream processing by the STP application; this involved implementing Java ETL processes to move data to and from Oracle XML inbound tables.
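A minimal sketch of the Spring JdbcTemplate persistence described above; the DAO, table, and column names are hypothetical stand-ins, not the actual STP schema:

```java
import javax.sql.DataSource;
import java.math.BigDecimal;
import org.springframework.jdbc.core.JdbcTemplate;

/** Illustrative DAO showing how an STP allocation row might be persisted via Spring JdbcTemplate. */
public class AllocationDao {

    private final JdbcTemplate jdbcTemplate;

    public AllocationDao(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    /** Inserts one fixed-income allocation record (hypothetical table FI_ALLOCATION). */
    public int saveAllocation(String tradeId, String fundCode, BigDecimal amount) {
        String sql = "INSERT INTO FI_ALLOCATION (TRADE_ID, FUND_CODE, AMOUNT) VALUES (?, ?, ?)";
        return jdbcTemplate.update(sql, tradeId, fundCode, amount);
    }

    /** Reads back the allocated amount for a trade (throws if no matching row exists). */
    public BigDecimal findAmount(String tradeId) {
        return jdbcTemplate.queryForObject(
                "SELECT AMOUNT FROM FI_ALLOCATION WHERE TRADE_ID = ?",
                BigDecimal.class, tradeId);
    }
}
```

In practice the DAO would be wired to an Oracle DataSource and wrapped in Spring transaction management so that allocation and settlement updates commit or roll back together.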
Confidential, NJ
Java Application Developer / Senior Consultant
Responsibilities:- Worked on the parser module to parse collected syslog data, validating it against a rule-based framework (a minimal rule-based parsing sketch follows this list).
- Worked on the Installer/Deployment and Scalable Mobility Logging System projects, which run on the Red Hat Linux platform.
- Responsible for automating software deployment in a Hadoop 2.2 environment, involving Controller, Collector, and CLI (Command Line Interpreter) applications that collect syslogs from various network nodes, push them into a Hadoop environment comprising a distributed data warehouse and Oracle/Vertica databases, and generate analytic reports for the collected syslogs in real time.
- Participated in Build/Release management and used various tools, namely Jenkins/Maven, to accommodate software configuration/build and change management of Scalable Mobility Logging system; it is a multi-threading environment where the software packages are built, tested, and deployed in real-time to local and remote nodes, as required.
- Wrote shell/PERL scripts to monitor the health of the Scalable Mobility Logging System components.
- Developed test scripts to check the correctness of various modules so that module data points remain the same before and after the SMLS component changes associated with syslogs.
- Worked with JavaScript/HTML screens related to GUI applications to provide a syslog application for Confidential &T.
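A minimal sketch of the rule-based syslog parsing mentioned at the start of this section; the single regex rule and field names are illustrative, not the production rule set:

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Illustrative rule-based parser: each rule is a regex that both validates and extracts fields. */
public final class SyslogRuleParser {

    /** Simplified RFC 3164-style rule: "<PRI>MMM dd HH:mm:ss host tag: message". */
    private static final Pattern RULE = Pattern.compile(
            "^<(\\d{1,3})>(\\w{3} [ \\d]\\d \\d{2}:\\d{2}:\\d{2}) (\\S+) ([^:]+): (.*)$");

    public static final class Record {
        public final int priority;
        public final String timestamp, host, tag, message;
        Record(int priority, String timestamp, String host, String tag, String message) {
            this.priority = priority; this.timestamp = timestamp;
            this.host = host; this.tag = tag; this.message = message;
        }
    }

    /** Returns the parsed record if the line matches the rule, otherwise empty (invalid line). */
    public static Optional<Record> parse(String line) {
        Matcher m = RULE.matcher(line);
        if (!m.matches()) {
            return Optional.empty();
        }
        return Optional.of(new Record(Integer.parseInt(m.group(1)),
                m.group(2), m.group(3), m.group(4), m.group(5)));
    }
}
```

Lines that fail every rule can be diverted to an error stream instead of being pushed into the Hadoop environment.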
Confidential, NJ
Senior Application Developer / Senior Consultant
Responsibilities:- Worked on a migration project moving a Netscape iPlanet web server application to a Linux-based IBM HTTP Server in order to use IBM WebSphere Application Server.
- The original application used NSAPI (the Netscape Server Application Programming Interface library) and was implemented in C++ with an object-oriented approach using Solaris multiprocessing threads on the Solaris platform; the main goal was to rewrite the application on the Apache Portable Runtime (APR) API, which required the following:
- Decomposed the application and eliminated the NSAPI Library interface calls.
- Re-implemented the NSAPI functionality using newly designed and developed C++ objects to meet the application requirements.
- Designed and developed a C++ based lexical analyzer and parser to parse the NSAPI object configuration files and create a parse tree that is walked to reconstruct the NSAPI rule-based engine, integrated into the Apache APR environment using POSIX threads.
- Involved with REST API technology and HTTP protocols; worked on IBM HTTP Server 8.5 (Apache-based web server) and developed an Apache module to parse object configuration files and invoke application callback service routines based on rule patterns such as the URL pattern configurations specified in the object configuration file (a minimal URL-pattern dispatch sketch follows this list).
- The Apache APR environment/Apache web server was integrated into WebSphere Application Server 7.0/8.5.5 running a J2EE environment to handle Java/JSP/servlet requests embedded in JavaScript pages received from the web browser, improving overall turnaround time, and to send responses from WebSphere and the Oracle backend server back to the browser for display as HTML pages.
- Worked with JavaScript/Node.js/AngularJS on a Single Sign-On application.
- Integrated the OpenSSL package to establish SSL-based encryption/decryption between web clients and the web server.
- Designed and developed software configuration management tools using Make and Bourne shell scripting, and integrated the application into a cross-platform cloud development environment using the IBM ClearCase based change management methodology.
- Involved in market data development and provided API interfaces to fetch data for web/servlet containers via the application server using Java/JDBC/SQL/Q APIs from Oracle/OneTick/KDB database containers.
- Created test scripts to test the application and integrated them into the QA environment.
- Worked with Fixed Income/Equities/Futures/Options applications and associated derivative products.
- Developed scripts using PERL to monitor health of day-to-day downstream and upstream processes and their completion statuses.
- Developed build tools, namely Make/Ant/Bash/PERL in UNIX/LINUX environments to build C/C++/Java applications.
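The rule-based dispatch described above boils down to matching a request path against configured URL patterns and invoking the associated callback. The actual module was written in C against the Apache APR; the following Java sketch only illustrates the dispatch logic, with hypothetical patterns and handlers:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;
import java.util.regex.Pattern;

/** Illustrative rule table: URL patterns mapped to callback services, checked in insertion order. */
public final class UrlRuleDispatcher {

    private final Map<Pattern, Function<String, String>> rules = new LinkedHashMap<>();

    /** Registers a callback for requests whose path matches the given pattern. */
    public void addRule(String urlPattern, Function<String, String> callback) {
        rules.put(Pattern.compile(urlPattern), callback);
    }

    /** Returns the first matching callback's response, or a default "not handled" response. */
    public String dispatch(String requestPath) {
        for (Map.Entry<Pattern, Function<String, String>> rule : rules.entrySet()) {
            if (rule.getKey().matcher(requestPath).matches()) {
                return rule.getValue().apply(requestPath);
            }
        }
        return "404: no rule for " + requestPath;
    }

    public static void main(String[] args) {
        UrlRuleDispatcher dispatcher = new UrlRuleDispatcher();
        dispatcher.addRule("/sso/.*", path -> "forward to Single Sign-On servlet: " + path);
        dispatcher.addRule("/static/.*\\.html", path -> "serve file: " + path);
        System.out.println(dispatcher.dispatch("/sso/login"));
    }
}
```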
Confidential, DE
Senior Application Developer / Senior Consultant
Responsibilities:- Worked in the Claims Technology department, designing and developing fraud detection and claims processing applications using the distributed transaction framework provided by Oracle Tuxedo, along with Informix IDS 11.50.
- Designed and developed stored procedures, triggers, and SQL code for the Claims application using Informix data store.
- Primarily, the Claims application was developed in C++ on the HP-UX platform using distributed transactions provided by Oracle Tuxedo, as well as Rogue Wave Tools++ to develop the client API interfacing with the Informix IDS server using advanced SQL programming; also used the Rogue Wave Tools++ multithreaded package to improve the latency of the client application.
- Improved performance by optimizing C++ algorithms and rewriting Informix stored procedures to speed up database access for Claims Technology.
- Involved in release management using Rational Team Concert (an IBM product for SCM and release management).
- Developed automated build management tools using the hierarchical tree structure for the source directories employing Make tools customized for the Claims Technology application.
- Involved in the design of defect management for the Claims Technology application: creating, assigning, and tracking defects associated with developers and resolving them.
- Led the Claims Technology project development teams in defining release requirements, design, implementation, and deployment in a timely manner.
- Worked in the Regulatory and Compliance department to meet regulatory requirements, and enhanced the credit claim processing application to comply with Government, Confidential, Confidential, and Confidential regulations; implemented procedures for preserving exhibit and compliance forms and storing them in the IBM FileNet service in a distributed transaction environment for fraud claims processing.
- Augmented several risk management procedures to reduce the fraud risks associated with bank-issued credit cards.
- Implemented core Java modules that use the Oracle Tuxedo JOLT APIs so that Java-enabled clients can communicate with Oracle Tuxedo distributed transaction services implemented in C/C++ on UNIX, sending data back and forth across business clients in self-describing formats such as XML and tag/value-pair text files.
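A minimal sketch of the self-describing tag/value-pair text format mentioned in the last bullet; the delimiter and encoding rules here are illustrative, not the actual wire format exchanged with the Tuxedo services:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Illustrative tag=value message codec for exchanging self-describing records between clients. */
public final class TagValueCodec {

    private static final String FIELD_SEPARATOR = "|";

    /** Encodes a record as "tag=value|tag=value|..." preserving field order. */
    public static String encode(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> field : fields.entrySet()) {
            if (sb.length() > 0) {
                sb.append(FIELD_SEPARATOR);
            }
            sb.append(field.getKey()).append('=').append(field.getValue());
        }
        return sb.toString();
    }

    /** Decodes "tag=value|tag=value|..." back into an ordered map, skipping malformed pairs. */
    public static Map<String, String> decode(String message) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String pair : message.split("\\|")) {
            int eq = pair.indexOf('=');
            if (eq > 0) {
                fields.put(pair.substring(0, eq), pair.substring(eq + 1));
            }
        }
        return fields;
    }
}
```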
Confidential
Senior Programmer Analyst
Responsibilities:- Provided system level services and interfaces to a Point of Sale (POS) application in the ISD Department where the environment was UNIX/C/C++/Java, Linux (Red Hat/openSUSE), Cygwin, MS/Windows, IBM Visual Age C/C++, ClearCase/ClearQuest, Collabnet/Team Forge/Subversion.
- Provided API interfaces using Object Oriented Programming, and assisted in re-architecting the current POS system to improve performance by means of employing multi-threading/multi-tasking concepts.
- Worked on client server architecture to distribute retail market data efficiently across Walmart sites geographically spread both within the U.S. as well as internationally throughout Europe (Great Britain, Germany), Asia (Japan, China, India), South America, Africa, and other countries.
- Ported several C-BASIC modules into equivalent C/C++ modules to run on 4690 OS systems.
- Implemented a data dictionary component and associated APIs to serialize/de-serialize the POS application objects to communicate data among the clients and servers of POS application.
- Implemented Auto Test Daemon to mechanize and test transaction log objects.
- Provided a test environment to improve the software quality, which minimized the defects in the application and thereby improved the quality of the software tremendously.
- Ported the retail application, spanning several million lines of code, to openSUSE / Linux environments; wrote application makefiles, built the shared libraries, and packaged the applications ready to be deployed.
- Built a customized JSON parser using the BISON and FLEX tools.
- Improved the performance of QA for the POS system.
- Worked on the design of Build Release Management and Change Management using Team Forge/Subversion/AnthillPro for managing source code and release deployment of POS retail systems.
- Worked with HP ALM (Application Lifecycle Management), HP QC (Quality Center) to track and integrate POS Build Release Management and Change Management.
- Worked with core Java modules to interface with sales terminal applications, collecting and reading customer transaction data from sales terminals into a centralized Informix database using JDBC.
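A minimal sketch of the JDBC path from a sales terminal record into the centralized Informix database; the JDBC URL, credentials, table, and columns are hypothetical:

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;

/** Illustrative JDBC writer that stores one POS transaction in a central Informix database. */
public class PosTransactionWriter {

    // Hypothetical connection details; the real ones came from the POS configuration.
    private static final String JDBC_URL =
            "jdbc:informix-sqli://dbhost:9088/posdb:INFORMIXSERVER=central";

    public void storeTransaction(String terminalId, BigDecimal amount, Timestamp when)
            throws SQLException {
        String sql = "INSERT INTO pos_txn (terminal_id, amount, txn_time) VALUES (?, ?, ?)";
        try (Connection conn = DriverManager.getConnection(JDBC_URL, "posuser", "secret");
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, terminalId);
            ps.setBigDecimal(2, amount);
            ps.setTimestamp(3, when);
            ps.executeUpdate();
        }
    }
}
```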
Confidential, NJ
Senior C++ Consultant
Responsibilities:- Worked on real-time analytics reporting using UNIX/C system-level programming on Sun/Solaris and Linux platforms running Red Hat Linux for Mobility (wireless voice database), SMS (wireless short messaging/text database), AWSD (wireless/VoIP database), landline (phone/VoIP database), International Call Detail Reporting, and various other telephony databases for Confidential &T clients.
- Worked on data manipulation functionality, merging raw data coming from electronic switches supplied by different vendors, namely Nortel, Ericsson, Nokia, and Lucent.
- Used Cymbal, a 4GL tool, to build and read telephonic databases with built-in indexes and highly partitioned data horizontally divided according to regions/customer segments/carrier subscribers based on granularity of hours, minutes, and seconds.
- Developed and implemented Window Arbitration algorithms to retrieve the data of several million phone call detail records in real time.
- Worked on parallelization techniques employing multitasking methods to fetch data from multiple hosts with nationwide network connectivity.
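A minimal Java sketch of the multi-host parallel fetch idea in the last bullet; the original work was UNIX/C and Cymbal, so the host list and per-host fetch below are placeholders:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/** Illustrative parallel fetch: one task per regional host, results gathered as they complete. */
public class ParallelCdrFetcher {

    /** Placeholder for the real per-host query of call detail records. */
    private List<String> fetchFromHost(String host) {
        return Arrays.asList(host + ":record-1", host + ":record-2");
    }

    public List<String> fetchAll(List<String> hosts) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(hosts.size());
        try {
            List<Future<List<String>>> futures = new ArrayList<>();
            for (String host : hosts) {
                futures.add(pool.submit(() -> fetchFromHost(host)));
            }
            List<String> all = new ArrayList<>();
            for (Future<List<String>> f : futures) {
                all.addAll(f.get());   // blocks until that host's fetch completes
            }
            return all;
        } finally {
            pool.shutdown();
        }
    }
}
```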
Confidential, NY
Senior Associate
Responsibilities:- Worked in the Enterprise Infrastructure Data Group, which manages the market data activities related to historical end-of-day (low frequency) and historical real-time market tick data (high frequency).
- Real-time market tick data involved collecting data from derived real-time feeds from sources such as the FILTER network (e.g., TIBCO RV) as well as NYSE/TAQ.
- Designed and developed a Sybase/Open server API to access data from an In Memory KDB/Q database.
- Developed a C/C++ API for users to access tick as well end of day time series data.
- Worked as a backend server side developer to retrieve and distribute data to the clients of business units.
- Implemented a historical end-of-day data server using Sybase Open Server technology and a legacy backend server for getting time series data from a customized data store for analytical purposes.
- This system supported all the business units in providing historical data used to develop their applications.
- Provided on-call support to resolve ambiguities in trade prices raised by trading system applications querying pricing data against the market data servers.
- Implemented Foreign Exchange/Currency Conversion routines to translate ticker prices from source currency to target currency using currency rate tables stored in a memory cache to improve the latency of trading transaction queries.
- Implemented various statistical algorithms to characterize trade prices over a moving window of a time series, such as moving volatility, averages, standard deviation, low, high, and median pricing, as well as VWAP calculations, and applied foreign exchange rate conversion algorithms to convert trade prices between native and target currencies.
- Enhanced functionality and improved latency over time as technology changed, using higher-capacity stores to handle larger volumes of data and its retrieval.
- Implemented the architecture to handle data retrieval as efficiently as possible by employing multitasking and multi-threading techniques to improve the latency of the server much further.
- Migrated the existing back end server to read KDB data from KDB low-frequency and high frequency data stores.
- Converted the existing time series DB schema using E/R data modelling concepts and translated it to fit into a KDB/Q container by implementing a data dictionary map, so the user interface remains unchanged while, at the same time, the new container (KDB/Q) comes into existence with equivalent DB schema tables, thus replacing the old database container.
- E/R data modelling techniques enabled existing client requests to be translated automatically into KDB/Q-SQL container requests ready to be executed against the KDB database.
- Developed C/C++ API’s based on known design patterns, namely Master/Slave, Producer/Consumer, Client/Server, and Publish/Subscribe scenarios that used heavy Multithreading/Concurrency (Pthreads/Solaris threads) techniques.
- Responsible for client server computing for end of day/real-time tick market data by using TCP/IP socket interface programming as a part of a UNIX/C/C++ API in distributing the data over the network to business units.
- Worked on data distribution that involved several multithreading/concurrency and parallelization techniques in fetching data for multiple ticker symbols simultaneously.
- Responsible for parallelization by splitting the distribution of tickers, grouping them into several batches over a dedicated bidirectional socket channel assigned to each of the task processes from a pool of created TCP/IP socket channels.
- Improved the latency within each task process by multithreading, employed at the ticker level to achieve maximum concurrency by fetching data for multiple tickers at the same time.
- Improved overall data access times for a set of batch symbols, and thus the system could achieve maximum desired low latency in providing the market data to the end user; automated buffering was employed to manage data for the desired records and fields of the ticker symbols.
- Built ETL tools to collect real-time data from real-time feeds to build customized tick databases.
- Designed and developed a Real Time Loader tool using Informix data blade RealTimeLoader API to store the trades and quotes ticks into the Informix time series database container.
- Ported and enhanced real-time client/server programs from 32-bit to 64-bit host machines.
- Designed a prototype of a time series database using DB2 to study whether the DB2 environment was feasible.
- Designed and developed real time data analysis tool to eliminate unused and erroneous processing of ticker symbols.
- Designed and developed an interface to a database table index organization, which improved access times tremendously in retrieving the BTREE database tables.
- Designed and developed TQL Optimizer for MSDB historical time series and relational databases.
- Designed and developed the parallelization component to access time series data in a multi-threaded Solaris environment with a MASTER/SLAVE paradigm and integrated it into Sybase Open Server.
- Designed and developed a Cache Manager to manage the database pages (blobs) into dynamic and static caches to access historical quotes database via Sybase Open Server.
- The dynamic cache was implemented using the LRU principle, and the static cache using an update-in-place strategy (a minimal LRU sketch appears at the end of this section).
- The Cache Manager implementation resulted in a drastic performance improvement.
- Developed PERL scripting modules for building software modules in conjunction with makefiles.
- Used PERL scripting to parse Log files and generate reports of application module runtime completion times.
- Also used PERL scripting to build software modules on different platforms and to generate the source file dependencies as well as compilation and linking, building and installing the releases for business clients.
- Designed and developed ETL utilities that load the data from historical time series data feeds in a UNIX batch environment.
- Worked in a DATALINK project and wrote JAVA/XML API library interface tools for accessing historical and real-time time series data via MSDB SYBASE/Open Server.
- This involved design and development of core Java classes providing the client interface to send requests to and collect responses from the MSDB Sybase Open Server, which captures requests from the Java client and sends binary compiled data back to the Java client in an optimized manner.
- The client parsed the response and generated reports for the business clients.
- The JAVA clients basically retrieve time series information from proprietary databases, namely Capital Markets/Equities, Fixed Income, Bonds, and Equity Derivatives (Futures/Options/Capital Markets).
- Used the Memcached tool for speedy access of pricing data in the applications.
- Designed and developed ETL utilities that load historical data from self-describing input data files.
- Designed and developed a YACC/LEX based parser API that parsed the extended SQL (XSQL) into data structures that enabled building of several layers of APIs.
- Implemented a SQL Language compiler with added time dimension to the columns list/SQL expressions of SELECT/UPDATE/DELETE/INSERT to handle the time series queries based on the financial analytics API, which was developed completely based on the Abstract SQL syntax tree and attribute based expression grammar for parsing SQL expressions.
- Developed APIs and user statistics tools for accessing the commercial historical time series databases, namely LIM, FAME, Informix REALTIME Tick DB.
- Implemented the Quality Assurance mechanism by developing data consistency check utilities, and building regression test suites.
- Involved throughout the complete software development life cycle (SDLC) of market data software design, development, testing, and deployment for historical end-of-day and intraday tick time series databases.
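The dynamic cache described earlier in this section (the Cache Manager's LRU-evicted page cache in front of the historical quotes store) was built in C/C++ behind Sybase Open Server; a minimal Java sketch of the same LRU idea, with illustrative key/value types, is:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Illustrative LRU page cache: least-recently-used entries are evicted once capacity is reached. */
public class LruPageCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LruPageCache(int capacity) {
        // accessOrder = true makes get() move an entry to the "most recently used" end.
        super(16, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }

    public static void main(String[] args) {
        LruPageCache<String, byte[]> cache = new LruPageCache<>(2);
        cache.put("quotes:IBM:2001-01-02", new byte[]{1});
        cache.put("quotes:MSFT:2001-01-02", new byte[]{2});
        cache.get("quotes:IBM:2001-01-02");               // touch IBM so MSFT becomes eldest
        cache.put("quotes:GE:2001-01-02", new byte[]{3}); // evicts the MSFT page
        System.out.println(cache.keySet());
    }
}
```

The static cache mentioned above would instead keep fixed entries and overwrite them in place, with no eviction policy.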