Senior Java Developer/Senior Consultant/Backend Developer Resume
Newport Beach, CA
SUMMARY:
- Confidential: Java 1.8 / Multithreading / Spring Framework / Spring Boot / IBM MQ (Message Queues) / Oracle Database / Eclipse IDE / NetBeans IDE / Maven / Gradle / SOA / Microservices / Python / XML
- Worked on the two modules that use the above technologies: STP (Straight Through Processing) and the Trade Flow Daemon. Fixed outstanding issues and enhanced both projects to add an Oracle Database interface alongside the existing Sybase Database interfaces, as the company is gradually migrating to Oracle technology.
- AT&T: Java 1.8/Java 1.7 / Multithreading / Hadoop 2.2 environment / Oracle Database / Vertica data warehousing / Maven / Gradle / Jenkins and other build tools
- This project automates installable deliverables to be deployed in the PROD/DR/TEST/DEV environments, comprising several hundred Hadoop name and data nodes in the AT&T network environment. Enhanced the build automation technology to install the deliverables in real time and update their status during deployment, covering various staging scenarios in the Prod/DR/Test environments.
- Worked with JavaScript/HTML screens related to GUI applications to provide the SysLog application for AT&T.
- Citi: Java 1.8/1.7 / JavaScript / WebSphere Application Server / IBM MQ / Servlets / IBM HTTP/Apache 2.0 Web Server / Tomcat Servlet Engine / Linux / Solaris / C++ API
- This is a migration project to replace the Netscape Server API (NSAPI) with, and integrate it into, the IBM HTTP/Apache web server / WebSphere Application Server / IBM MQ / JavaScript application modules. The work involved code enhancements to C++/Java application modules to migrate from the Solaris environment to the Red Hat Linux environment.
- Worked with JavaScript/NodeJS/AngularJS on a Single Sign-On application to provide the screen interface.
- Bank of America: Java 1.6 / IBM MQ / Oracle Tuxedo environment / Informix Database / C++ and Core Java.
- This project is a claim processing and credit card compliance technology and involved implementing the clients and servers using Oracle Tuxedo and related technology. Heavily involved in enhancing the product using IBM MQ (Message Queues) to read account data from mainframes to UNIX boxes while processing the fraud claim requests received from card account owners.
- Over 20 years of extensive experience in the design, development, and implementation of financial, telecommunication, retail, and scientific systems in UNIX, C/C++, Perl, Java, and the World Wide Web.
- Developed applications in relational/SQL database systems such as Sybase, Informix, Oracle, and DB2; proprietary financial historical end-of-day time series and intraday real-time tick databases; recently built C/C++ APIs to access in-memory KDB/Q databases.
- Proficient in UNIX/OS internals, and have worked with local area networks and distributed processing: TCP/UDP/IP sockets programming, IPC (Inter Process Communication), shared memory, message queues, and semaphores.
- Implemented temporal extensions to the SQL language, adding a time dimension to the rows/columns of the relational model so that time-dependent data can be queried, using compiler construction tools such as YACC/LEX/FLEX/BISON.
- Familiar with vector/functional programming tools, namely the Scala and KDB/Q programming languages, to develop applications.
- Worked with real-time market data applications that involved distributed processing; built APIs to collect data from real-time feeds derived from source data feeds such as the PUB/SUB tick network, NYSE/TAQ, and other tick sources, employing multithreading techniques throughout.
- Worked with high-frequency data (intraday real time) at low latency (milli- and microseconds), achieved by providing interfaces via Sybase, Informix (time series DataBlade), LIM, KDB/Q, FAME, and a proprietary time series database implementation to retrieve market data from a customized ticker plant, as well as low-frequency (end-of-day) implementations.
- Most recently involved in Software Configuration Management, Change Management, and POS Release Management using Subversion/TeamForge/AnthillPro technologies as well as HP ALM (HP Application Lifecycle Management).
- Used configuration management tools such as RTC (Rational Team Concert), ClearCase, ClearQuest, Perforce, and CVS.
- Used various design patterns, namely MVC, Observer (PUB/SUB), Client/Server, Producer/Consumer, and Master/Slave, and implemented a multithreaded server to distribute pricing data to business clients (a minimal producer/consumer sketch follows this list).
- Worked on projects involving Regulatory and Compliance procedures in a financial environment.
- Worked on IBM HTTP Server 8.5 (Apache-based web server) and developed an Apache module to parse object configuration files and invoke the application callback service routines using rule-based patterns, such as the URL pattern configurations specified in the object configuration file.
- Integrated the HTTP Server into IBM WebSphere 7.0/8.5 to process the Servlets/JSPs associated with Single Sign-On applications.
- Integrated OpenSSL package to establish SSL-based encryption/decryption between web clients and a web server.
- Implemented proprietary systems to create and distribute several terabytes (Big Data) of pricing data comprising daily and real-time feeds, namely: end-of-day pricing time series for Equity Markets and Equity Derivatives, and intraday time series for various exchange real-time feeds such as RDMS, NYSE/TAQ, CBOT Options/Futures, OPRA, and LSE feeds.
- Provided C/C++/Java APIs for pricing analytics: TWAP/VWAP computation, standard deviation, moving averages, and volatility calculations.
- Experienced with large database management systems and grid computing (MapReduce, etc.), namely Sybase, Informix, and Oracle.
- Familiar with Hadoop distributed database technology, which uses MapReduce functionality and distributes workload across a clustered node environment.
- Varied experience in writing scripts and build tools in UNIX/Linux to build and deploy software packages, namely SH/KSH/CSH/BASH/Make/Gmake/SED/AWK/PERL, and more modern tools such as Ant, AnthillPro, and Maven.
- Used SOAP- and REST-based services with XML/SOAP and Apache/HTTP/REST API-based frameworks.
- Used IBM MQ Series to establish message communication in Oracle Tuxedo distributed transaction environments.
- Used the Python scripting language, with a good grasp of data structures such as sets, lists, and dictionaries, threading, and an understanding of list comprehensions and generator expressions.
- Also focused on NoSQL databases, with knowledge of Python bindings to NoSQL databases, namely MongoDB (collection hierarchies) and Cassandra (column-based framework).
- Used the Spring Framework with Java 1.8, including the Spring Transaction Management/Hibernate/JPA tools.
- Involved in handling the application configuration and control data associated with name and data nodes in the Hadoop cluster environment, manipulating and controlling the configuration in an Oracle Database.
- Also worked with JUnit frameworks, as well as mocking objects to test component interfaces in Java.
- Monitored the Netcool/OMNIbus-generated alerts related to the applications to keep them running healthily.
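A minimal Java sketch of the producer/consumer pattern referenced above, with consumer threads draining price ticks from a bounded queue. The PriceTick class, queue capacity, and thread count are illustrative assumptions, not taken from any project codebase:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PriceFanOut {
    // Immutable tick carrying a symbol and a price; illustrative only.
    static final class PriceTick {
        final String symbol;
        final double price;
        PriceTick(String symbol, double price) { this.symbol = symbol; this.price = price; }
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<PriceTick> queue = new ArrayBlockingQueue<>(1024);
        ExecutorService consumers = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 4; i++) {
            consumers.submit(() -> {
                try {
                    while (true) {
                        PriceTick tick = queue.take();  // blocks until a tick arrives
                        System.out.println(Thread.currentThread().getName()
                                + " -> " + tick.symbol + " @ " + tick.price);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // exit cleanly on shutdown
                }
            });
        }

        // Producer side: in a real server this would be fed by a market data feed.
        String[] symbols = {"IBM", "ORCL", "MSFT"};
        for (int i = 0; i < 12; i++) {
            queue.put(new PriceTick(symbols[i % 3], 100.0 + i));
        }

        TimeUnit.SECONDS.sleep(1);  // let the consumers drain the queue
        consumers.shutdownNow();    // interrupts the blocking take() calls
    }
}
```

The bounded queue applies back-pressure: a producer outrunning its consumers blocks on put() rather than exhausting memory.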
TECHNICAL SKILLS:
Languages: Java, C/C++, Python, SQL, JSON, XML, VisualAge, OS 4690.
Technologies/Tools: UDP/TCP/IP, Multicasting, Sockets Programming, POSIX Threads, Solaris Threads, Ethernet, FTP/NFS, TELNET, SH/CSH/KSH, Python, Perl, SED/AWK, Perforce, CVS, HP ALM, YACC/LEX/BISON/FLEX, JSP, JDBC, Servlets, XML (DTDs), XPATH, XPOINTER, XSLT, ANT, Eclipse, Message Queues, Semaphores, Shared Memory, HTML, JavaScript, HP QC, RogueWave Tools++, IBM HTTP Web Server, IBM WebSphere App Server, REST API, HTTP Protocol, Hibernate, Spring, Scala, NodeJS, AngularJS, React, Redis, Swagger, GitHub, Docker, YAML, RAML, Knex
Database APIs: PostgreSQL, MySQL, Informix TimeSeries Real-Time Loader, DB2, KDB/Q, OneTick, Oracle, Oracle Call Interface
PROFESSIONAL EXPERIENCE:
Confidential, Newport Beach, CA
Senior Java Developer/Senior Consultant/BackEnd Developer
Responsibilities:
- Worked on the Fixed Income Fund Transfers using the Python Flask module, which provides microservices capabilities in a Flask RESTful framework. The major functionality is to transfer fixed income fund complexes from source accounts to destination accounts in a highly scalable HPC architecture, within a transactional environment such as Oracle Database, on a virtualized computing environment running the Red Hat Linux operating system.
- Designed and developed Web API-based software modules in RESTful frameworks to communicate with various banking applications.
- Used configuration management tools such as SVN, artifact-based change management (TeamForge), Eclipse IDE (Neon), Maven 3.39, ANT, Gradle, and remote debugging on Linux to perform the complete SDLC operations, meeting the requirements and fulfilling day-to-day needs.
- Worked on Straight Through Processing (STP) to eliminate human intervention and minimize the settlement time of financial security transactions, such as fund allocation/distribution of Fixed Income securities, and pushed the data into Oracle 11g databases using Java 8/JDBC Templates and XML configuration files.
- Performed migration of the Sybase tables/Transact-SQL implementations of STP to the counterpart Oracle/PLSQL 11g technology infrastructure using Java 8, Spring Framework 4.1.9, and multithreading and multiprocessing concurrency features. Heavily used the Spring Framework, defining and using Spring Transaction Management/Spring Beans/JDBC Templates to persist the data in the Oracle Database (a minimal sketch follows this list).
- Worked on the Trade Flow Daemon, implemented in Java 8. The Trade Flow Daemon is an ETL process that extracts, transforms, and loads BBG feed data by connecting to the Bloomberg API feeds, which collect information from TCP/UDP-based networks; it creates XML documents by decomposing the ASCII text-based records coming from the Bloomberg feeds and persists them into the Oracle database for further stream processing via the STP application, which involved implementing Java ETL processes to move the data to/from the Oracle XML inbound tables.
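A minimal sketch of the JdbcTemplate-inside-a-transaction pattern mentioned above. The trade_allocation table, its columns, and the DAO name are hypothetical, chosen only to illustrate the Spring Transaction Management/JDBC Templates combination:

```java
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.annotation.Transactional;

public class AllocationDao {
    private final JdbcTemplate jdbc;

    public AllocationDao(DataSource dataSource) {
        // JdbcTemplate handles connection acquisition/release and
        // translates SQLExceptions into Spring's DataAccessException.
        this.jdbc = new JdbcTemplate(dataSource);
    }

    @Transactional  // commit/rollback is managed by Spring Transaction Management
    public void saveAllocation(String fundId, String account, double amount) {
        // Hypothetical table and columns, for illustration only.
        jdbc.update(
            "INSERT INTO trade_allocation (fund_id, account, amount) VALUES (?, ?, ?)",
            fundId, account, amount);
    }
}
```

Any exception thrown inside the @Transactional method rolls the insert back, which is what makes this pattern suitable for fund-transfer style work.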
Confidential, NJ
Java Application Developer / Senior Consultant
Responsibilities:
- Worked on the parser module to parse collected syslog data, validating it against a rule-based framework (see the sketch after this list).
- Worked on the Installer/Deployment and Scalable Mobility Logging System projects, which run on the Red Hat Linux platform.
- Responsible for automating software deployment in a Hadoop 2.2 environment, involving Controller, Collector, and CLI (Command Line Interpreter) applications that collect syslogs from various network nodes, push them into a Hadoop environment comprising a distributed data warehouse and Oracle/Vertica databases, and generate analytic reports on the collected syslogs in real time.
- Participated in build/release management and used various tools, namely Jenkins/Maven, to accommodate software configuration/build and change management of the Scalable Mobility Logging System; it is a multithreaded environment where software packages are built, tested, and deployed in real time to local and remote nodes, as required.
- Wrote shell/PERL scripts to monitor the health of the Scalable Mobility Logging System components.
- Developed test scripts to check the correctness of various modules, so that the modules' data points remain the same before and after the SMLS component changes associated with syslogs.
- Worked with JavaScript/HTML screens related to GUI applications to provide the SysLog application for AT&T.
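A minimal Java sketch of rule-based syslog validation: each rule is a regex, and a record is accepted only if some rule matches. The two patterns are illustrative stand-ins, not the actual rule set:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SyslogRuleParser {
    // Each rule describes one accepted record format (illustrative patterns).
    private static final Pattern[] RULES = {
        Pattern.compile("^<(\\d+)>(\\w{3} +\\d+ [\\d:]{8}) (\\S+) (\\S+): (.*)$"), // RFC 3164-style line
        Pattern.compile("^(\\S+) (\\S+) \\[(ERROR|WARN|INFO)\\] (.*)$")            // app-log style line
    };

    public static boolean validate(String line) {
        for (Pattern rule : RULES) {
            Matcher m = rule.matcher(line);
            if (m.matches()) {
                return true;   // matched a known format; fields are in m.group(i)
            }
        }
        return false;          // no rule matched: route the record to a reject path
    }

    public static void main(String[] args) {
        System.out.println(validate("<34>Oct 11 22:14:15 host1 sshd: session opened"));
    }
}
```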
Confidential, NJ
Senior Application Developer / Senior Consultant
Responsibilities:
- Worked on a migration project from the Netscape iPlanet web server application to the Linux-based IBM HTTP Server, in order to use the IBM WebSphere Application Server.
- The original application uses NSAPI (the Netscape Application Programming Interface library), implemented with Solaris multiprocessing threads in C++ using an object-oriented approach on the Solaris platform; the main goal was to rewrite the application on the Apache Portable Runtime API, which required the following:
- Decomposed the application and eliminated the NSAPI library interface calls.
- Re-implemented the NSAPI functionality using newly designed and developed C++ objects to meet the application requirements.
- Designed and developed a C++-based lexical analyzer and parser to parse the NSAPI object configuration files and create a parse tree to walk through, reconstructing the NSAPI rule-based engine for integration into Apache APR environments using POSIX multiprocessing threads.
- Involved with REST API technology and HTTP protocols; worked on IBM HTTP Server 8.5 (Apache-based web server) and developed an Apache module to parse object configuration files and invoke the application callback service routines based on rule patterns, such as the URL pattern configurations specified in the object configuration file (a conceptual sketch follows this list).
- The Apache APR environment/Apache web server is integrated into WebSphere Application Server 7.0/8.5.5 running a J2EE environment to handle Java/JSP/Servlet requests embedded in JavaScript pages received from the web browser, improving overall turnaround time, and to send responses from the WebSphere and Oracle back-end servers back to the web browser to display HTML pages.
- Worked with JavaScript/NodeJS/AngularJS on a Single Sign-On application.
- Integrated the OpenSSL package to establish SSL-based encryption/decryption between web clients and the web server.
- Designed and developed software configuration management tools using Make/Bourne shell scripting and integrated the application into a cross-platform cloud development environment that uses the IBM ClearCase-based change management methodology.
- Involved in market data development and provided API interfaces to fetch data from Web/Servlet containers via the application server, using Java/JDBC/SQL/Q APIs against Oracle/OneTick/KDB database containers.
- Created test scripts to test the application and integrated them into the QA environment.
- Worked with Fixed Income/Equities/Futures/Options applications and associated derivative products.
- Developed scripts using PERL to monitor health of day-to-day downstream and upstream processes and their completion statuses.
- Developed build tools, namely Make/Ant/Bash/PERL in UNIX/LINUX environments to build C/C++/Java applications.
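The actual dispatch engine above was a C++ Apache module; the following is only a conceptual Java sketch of the same idea, where URL patterns from an object configuration map to callback service routines and the first matching rule handles the request. Patterns and handlers here are invented for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

public class RuleDispatcher {
    // Insertion-ordered map so rules are tried in configuration order.
    private final Map<String, Function<String, String>> rules = new LinkedHashMap<>();

    public void addRule(String urlGlob, Function<String, String> handler) {
        // Translate a simple glob ("/sso/*") into a regex for matching.
        rules.put(urlGlob.replace("*", ".*"), handler);
    }

    public String dispatch(String url) {
        for (Map.Entry<String, Function<String, String>> e : rules.entrySet()) {
            if (url.matches(e.getKey())) {
                return e.getValue().apply(url);  // invoke the callback service routine
            }
        }
        return "404";                            // no rule matched
    }

    public static void main(String[] args) {
        RuleDispatcher d = new RuleDispatcher();
        d.addRule("/sso/*", url -> "single sign-on handler for " + url);
        d.addRule("/static/*", url -> "file handler for " + url);
        System.out.println(d.dispatch("/sso/login"));
    }
}
```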
Confidential
Senior Application Developer / Senior Consultant
Responsibilities:
- Worked in the Claims Technology department, designing and developing fraud detection and claims processing applications using the distributed transaction framework provided by Oracle Tuxedo, with Informix IDS 11.50.
- Designed and developed stored procedures, triggers, and SQL code for the Claims application using the Informix data store.
- Primarily, the Claims application was developed in C++ on the HP-UX platform using distributed transactions provided by Oracle Tuxedo, as well as Rogue Wave Tools++ to develop the client API interfacing with the Informix IDS server using advanced SQL programming; also used the Rogue Wave Tools++ multithreaded package to improve the latency of the client application.
- Involved in improving performance by optimizing the C++ algorithms and rewriting the Informix stored procedures to speed up database access for Claims Technology.
- Involved in release management using Rational Team Concert (the IBM product for SCM and release management).
- Developed automated build management tools using a hierarchical tree structure for the source directories, employing Make tools customized for the Claims Technology application.
- Involved in the design of defect management for the Claims Technology application: creating, assigning, and tracking the defects associated with developers, and resolving them.
- Led the project development teams of Claims Technology in defining the release requirements, design, implementation, and deployment in a timely manner.
- Worked in the Regulatory and Compliance department to meet regulatory requirements and enhanced the credit claim processing application to comply with Government, VISA, MasterCard, and American Express regulations; implemented procedures for preserving exhibit and compliance FORMS and storing them in the IBM FileNet service in a distributed transaction environment for fraud claims processing.
- Augmented several risk management procedures to mitigate the fraud management risks associated with bank-issued credit cards.
- Implemented Core Java modules to talk to the Oracle Tuxedo JVM, establishing communication between JOLT API Java-enabled clients and the Oracle Tuxedo distributed transaction services implemented in C/C++ on UNIX, sending data to and fro across business clients using self-describing formats such as XML and tag/value pair text files (a minimal codec sketch follows this list).
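A minimal Java sketch of the tag/value self-describing format mentioned above: each field travels as "TAG=VALUE", so either side can decode a record without a fixed schema. The field names are hypothetical:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TagValueCodec {
    // Encode each field as one "TAG=VALUE" line.
    public static String encode(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : fields.entrySet()) {
            sb.append(e.getKey()).append('=').append(e.getValue()).append('\n');
        }
        return sb.toString();
    }

    // Decode by splitting each line at its first '='.
    public static Map<String, String> decode(String message) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String line : message.split("\n")) {
            int eq = line.indexOf('=');
            if (eq > 0) {
                fields.put(line.substring(0, eq), line.substring(eq + 1));
            }
        }
        return fields;
    }

    public static void main(String[] args) {
        Map<String, String> claim = new LinkedHashMap<>();
        claim.put("CLAIM_ID", "12345");   // hypothetical fraud-claim fields
        claim.put("ACCOUNT", "4111-XXXX");
        claim.put("TYPE", "FRAUD");
        System.out.println(decode(encode(claim)));
    }
}
```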
Confidential
Senior Programmer Analyst
Responsibilities:
- Provided system-level services and interfaces to a Point of Sale (POS) application in the ISD department, where the environment was UNIX/C/C++/Java, Linux (Red Hat/openSUSE), Cygwin, MS Windows, IBM VisualAge C/C++, ClearCase/ClearQuest, and CollabNet TeamForge/Subversion.
- Provided API interfaces using object-oriented programming and assisted in re-architecting the POS system to improve performance by employing multithreading/multitasking concepts.
- Worked on a client/server architecture to distribute retail market data efficiently across Walmart sites geographically spread within the U.S. as well as internationally throughout Europe (Great Britain, Germany), Asia (Japan, China, India), South America, Africa, and other countries.
- Ported several C-BASIC modules into equivalent C/C++ modules to run on 4690 OS systems.
- Implemented a data dictionary component and associated APIs to serialize/de-serialize the POS application objects and communicate data among the clients and servers of the POS application.
- Implemented an Auto Test Daemon to mechanize and test transaction log objects.
- Provided a test environment that minimized defects in the application, thereby greatly improving the quality of the software.
- Ported the retail application, spanning several million lines of code, to openSUSE/Linux environments; wrote application makefiles, built the shared libraries, and packaged the applications ready for deployment.
- Built a customized JSON parser using the BISON and FLEX tools.
- Improved the performance of POS system QA.
- Worked on the design of build release management and change management using TeamForge/Subversion/AnthillPro for managing source code and release deployment of POS retail systems.
- Worked with HP ALM (Application Lifecycle Management) and HP QC (Quality Center) to track and integrate POS build release management and change management.
- Worked with Core Java modules to interface with sales terminal applications, collecting and reading customer transaction data from the sales terminals into a centralized Informix database using JDBC (a minimal loader sketch follows this list).
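A minimal sketch of collecting sales-terminal transactions into a central database over JDBC, using batched inserts to cut round trips. The URL, table, and columns are hypothetical; an Informix JDBC driver and URL would stand in for the placeholder:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class TerminalTxnLoader {
    // Each txn is {terminalId, sku, amount} — illustrative record shape.
    public static void load(String jdbcUrl, Iterable<String[]> txns) throws SQLException {
        String sql = "INSERT INTO pos_txn (terminal_id, sku, amount) VALUES (?, ?, ?)";
        try (Connection con = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps = con.prepareStatement(sql)) {
            con.setAutoCommit(false);                 // commit the whole batch atomically
            for (String[] t : txns) {
                ps.setString(1, t[0]);                // terminal id
                ps.setString(2, t[1]);                // item SKU
                ps.setBigDecimal(3, new java.math.BigDecimal(t[2])); // sale amount
                ps.addBatch();
            }
            ps.executeBatch();                        // one round trip for the batch
            con.commit();
        }
    }
}
```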
Confidential, NJ
Senior C++ Consultant
Responsibilities:
- Worked on real-time analytics reporting using UNIX/C system-level programming on Sun/Solaris and Red Hat Linux platforms for Mobility (wireless voice database), SMS (wireless short messaging/text database), AWSD (wireless/VoIP database), landline (phone/VoIP database), International Call Detail Reporting, and various other telephonic databases for AT&T clients.
- Worked on data manipulation functionality, merging the raw data coming from electronic switches supplied by different vendors, namely Nortel, Ericsson, Nokia, and Lucent.
- Used Cymbal, a 4GL tool, to build and read telephonic databases with built-in indexes and data partitioned horizontally according to regions/customer segments/carrier subscribers, at the granularity of hours, minutes, and seconds.
- Developed and implemented window arbitration algorithms to retrieve the data of several million phone call detail records in real time.
- Worked on parallelization techniques that employed multitasking methods to fetch data from multiple hosts with nationwide network connectivity (a minimal sketch follows this list).
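A minimal Java sketch of the multi-host parallel fetch idea: one task per host, results gathered as each task completes. Host names and the per-host fetch body are placeholders for the real call-detail retrieval:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MultiHostFetch {
    public static void main(String[] args) throws Exception {
        List<String> hosts = Arrays.asList("cdr-east", "cdr-west", "cdr-central"); // hypothetical hosts
        ExecutorService pool = Executors.newFixedThreadPool(hosts.size());
        List<Future<Integer>> results = new ArrayList<>();

        for (String host : hosts) {
            results.add(pool.submit(() -> {
                // Placeholder: connect to the host and count matching records.
                return host.length() * 1000;
            }));
        }

        int total = 0;
        for (Future<Integer> f : results) {
            total += f.get();   // blocks until that host's task finishes
        }
        pool.shutdown();
        System.out.println("records fetched: " + total);
    }
}
```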
Confidential, NY
Senior Associate
Responsibilities:
- Worked in the Enterprise Infrastructure Data group, which manages the market data activities related to historical end-of-day (low-frequency) and historical real-time market tick data (high-frequency).
- Real-time market tick data involved collecting data from real-time feeds derived from sources such as the FILTER network (e.g., TIBCO RV) as well as NYSE/TAQ.
- Designed and developed a Sybase Open Server API to access data from an in-memory KDB/Q database.
- Developed a C/C++ API for users to access tick as well as end-of-day time series data.
- Worked as a back-end server-side developer to retrieve and distribute data to the clients of business units.
- Implemented a historical end-of-day data server using Sybase Open Server technology and a legacy back-end server to fetch time series data from a customized data store for analytical purposes.
- This system supported all the business units in providing historical data used to develop their applications.
- Provided on-call support to clarify ambiguities in trade prices raised by the trading systems applications querying pricing data against the market data servers.
- Implemented foreign exchange/currency conversion routines to translate ticker prices from a source currency to a target currency using currency rate tables stored in a memory cache, improving the latency of trading transaction queries.
- Implemented various statistical algorithms to quantify the trade price characteristics in a moving window of time series data, such as moving volatilities, averages, standard deviation, low, high, and median pricing algorithms, as well as VWAP calculations on time series data; applied foreign exchange rate conversion algorithms to convert trade prices from native to target currencies and vice versa (a minimal sketch follows this list).
- Enhanced the functionality and improved the latency over time as the technology changed, with high-capacity stores handling high volumes of data and its retrieval.
- Implemented the architecture to handle data retrieval as efficiently as possible, employing multitasking and multithreading techniques to further improve the latency of the server.
- Migrated the existing back-end server to read KDB data from the KDB low-frequency and high-frequency data stores.
- Converted the existing time series DB schema using E/R concepts employed in data modelling and translated it to fit the KDB/Q container, implementing a data dictionary map so the user interface remains unchanged while the new container (KDB/Q) comes into existence with equivalent DB schema tables, thus replacing the old database container.
- E/R data modelling techniques enabled automatic conversion of existing client requests, translating them directly into KDB/Q-SQL container requests ready to be executed against the KDB database.
- Developed C/C++ APIs based on well-known design patterns, namely Master/Slave, Producer/Consumer, Client/Server, and Publish/Subscribe scenarios, making heavy use of multithreading/concurrency (Pthreads/Solaris threads) techniques.
- Responsible for client/server computing for end-of-day/real-time tick market data, using TCP/IP socket interface programming as part of a UNIX/C/C++ API to distribute the data over the network to business units.
- Worked on data distribution that involved several multithreading/concurrency and parallelization techniques, fetching data for multiple ticker symbols simultaneously.
- Responsible for parallelization by splitting the distribution of tickers, grouping them into several batches over a dedicated bidirectional socket channel assigned to each of the task processes from a pool of created TCP/IP socket channels.
- Improved latency within each task process through multithreading employed at the ticker level, achieving maximum concurrency by fetching ticker data simultaneously.
- Improved overall data access times for a set of batch symbols, so the system could achieve the maximum desired low latency in providing market data to the end user; automated buffering was employed to manage data for the desired records and fields of the ticker symbols.
- Built ETL tools to collect real-time data from real-time feeds to build customized tick databases.
- Designed and developed a Real Time Loader tool using the Informix DataBlade RealTimeLoader API to store the trade and quote ticks into the Informix time series database container.
- Ported and enhanced real-time client/server programs from 32-bit to 64-bit host machines.
- Designed a prototype of time series databases using DB2 to study the feasibility of the DB2 environment.
- Designed and developed a real-time data analysis tool to eliminate unused and erroneous processing of ticker symbols.
- Designed and developed an interface to a database table index organization, which greatly improved access times when retrieving the B-tree database tables.
- Designed and developed a TQL optimizer for MSDB historical time series and relational databases.
- Designed and developed the parallelization component to access time series data in a multithreaded Solaris environment with a MASTER/SLAVE paradigm, and integrated it into Sybase Open Server.
- Designed and developed a Cache Manager to manage the database pages (blobs) in dynamic and static caches to access the historical quotes database via Sybase Open Server.
- The dynamic cache was implemented using the LRU principle, and the static cache was implemented using an update-in-place strategy.
- The Cache Manager implementation resulted in a drastic performance improvement.
- Developed PERL scripting modules for building software modules in conjunction with makefiles.
- Used PERL scripting to parse log files and generate reports of application module runtime completion times.
- Also used PERL scripting to build software modules on different platforms, generating the source file dependencies as well as compiling, linking, building, and installing the releases for business clients.
- Designed and developed ETL utilities that load the data from historical time series data feeds in a UNIX batch environment.
- Worked on a DATALINK project and wrote Java/XML API library interface tools for accessing historical and real-time time series data via the MSDB Sybase Open Server.
- This involved designing and developing Core Java classes to provide the interface as a client that sends requests to, and collects responses from, the MSDB Sybase Open Server, which captures requests from the Java client and sends binary compiled data back to the Java client in an optimized manner.
- The client parsed the response and generated reports for the business clients.
- The Java clients retrieve time series information from proprietary databases, namely Capital Markets/Equities, Fixed Income, Bonds, and Equity Derivatives (Futures/Options/Capital Markets).
- Used the Memcached tool for speedy access to pricing data in the applications.
- Designed and developed ETL utilities that load historical data from self-describing input data files.
- Designed and developed a YACC/LEX-based parser API that parsed the extended SQL (XSQL) into data structures that enabled building several layers of APIs.
- Implemented a SQL language compiler with a time dimension added to the column lists/SQL expressions of SELECT/UPDATE/DELETE/INSERT to handle time series queries for the financial analytics API, developed entirely on an abstract SQL syntax tree and attribute-based expression grammar for parsing SQL expressions.
- Developed APIs and user statistics tools for accessing the commercial historical time series databases, namely LIM, FAME, and the Informix REALTIME Tick DB.
- Implemented the quality assurance mechanism by developing data consistency check utilities and building regression test suites.
- Involved throughout the complete software development life cycle (SDLC) of market data software design, development, testing, and deployment for historical end-of-day and intraday tick time series databases.
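A minimal Java sketch of the VWAP and moving-average calculations referenced in the statistics bullet above, over a simple in-memory series of (price, size) ticks. The production versions ran against the time series stores; the data here is illustrative:

```java
public class PriceStats {
    // Volume-weighted average price: sum(price * size) / sum(size).
    public static double vwap(double[] prices, long[] sizes) {
        double notional = 0;
        long volume = 0;
        for (int i = 0; i < prices.length; i++) {
            notional += prices[i] * sizes[i];
            volume += sizes[i];
        }
        return volume == 0 ? Double.NaN : notional / volume;
    }

    // Simple moving average over a trailing window of n prices ending at index end.
    public static double movingAverage(double[] prices, int end, int n) {
        double sum = 0;
        int start = Math.max(0, end - n + 1);
        for (int i = start; i <= end; i++) {
            sum += prices[i];
        }
        return sum / (end - start + 1);
    }

    public static void main(String[] args) {
        double[] p = {100.0, 100.5, 99.8, 100.2};
        long[] s = {200, 100, 300, 400};
        System.out.printf("VWAP=%.4f MA(3)=%.4f%n", vwap(p, s), movingAverage(p, 3, 3));
    }
}
```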