Senior Solution Developer Consultant Resume
SUMMARY
- Senior technology leader with over 20 years of experience in the successful planning, development, and execution of technology strategies that address the unique business needs of the financial services industry
- Experienced team builder, leader, and mentor with exceptional motivation and management expertise across global teams
- Strong track record of leading IT teams to deliver data acquisition, calculation, collateral and margin management, risk management, settlement, and straight-through-processing solutions
- Partners with business and operations teams across the wider organization to formulate and execute technology roadmaps, ensuring cost transparency, efficient resource allocation, effective project prioritization, and strong business relationships
TECHNICAL SKILLS
Data Management: Sybase, Oracle, DB2, MySQL (Relational), MongoDB, Cassandra (NoSQL), HDFS
Infrastructure: Tibco EMS Messaging, IBM MQ Messaging, Gemfire Distributed Caching, jBPM Workflow, Drools Rule Engine, WebLogic/Tomcat Application Server, Spring Framework, HTTP, RESTful
Development: Eclipse, Ant, Maven, Jenkins, Nexus, JUnit, Mockito, ClearCase, SVN, GIT
Language: Java, C++, C, XML, JSON, HTML, Shell, Scala
Performance: Wily Introscope, AppDynamics, HotSpot JVM G1GC, JRockit, JProfiler
PROFESSIONAL EXPERIENCE
Confidential
Senior Solution Developer Consultant
Responsibilities:
- Managed the life cycle of trade alerts, enabling the internal compliance team to perform deep-dive analysis and investigation
- Consumed, distributed, and captured intraday firm-wide trade alerts using Apache Kafka
- Transformed source-specific input XML into Java objects, translated source-specific values into internal standard values, and enriched and derived values for downstream pipeline processing using Spring Integration
- Persisted semi-structured trade alerts (mandatory fields plus optional detail information) into MongoDB collections
- Evaluated real-time event push technologies: HTML5 WebSockets running in an embedded Jetty web server, and Apache Kafka with JavaScript running on a Node.js server
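The XML-to-internal-value translation step described above can be sketched as follows; this is a minimal illustration using only the JDK's built-in XML parser, and the element names and the status-code mapping are hypothetical, not taken from an actual source feed:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class TradeAlertTransformer {
    // Hypothetical mapping of source-specific status codes to internal standard values
    private static final Map<String, String> STATUS_MAP = Map.of(
        "N", "NEW",
        "A", "ACKNOWLEDGED",
        "C", "CLOSED");

    public static String toInternalStatus(String alertXml) throws Exception {
        // Parse the source-specific XML into a DOM document
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(alertXml.getBytes(StandardCharsets.UTF_8)));
        // Extract the source-specific value and translate it to the internal standard
        String sourceStatus = doc.getDocumentElement()
            .getElementsByTagName("status").item(0).getTextContent();
        return STATUS_MAP.getOrDefault(sourceStatus, "UNKNOWN");
    }

    public static void main(String[] args) throws Exception {
        String xml = "<alert><id>42</id><status>N</status></alert>";
        System.out.println(toInternalStatus(xml)); // prints NEW
    }
}
```

In the actual pipeline this kind of translation would sit inside a Spring Integration transformer rather than a standalone class.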
Confidential
Senior Application Developer/Architect
Responsibilities:
- Created a three-year target-state vision to consolidate multiple margin and collateral systems into a shared, flexible platform able to absorb rapid regulatory and industry change while retiring outdated, inconsistent tools and reducing complexity
- Engaged vendors and evaluated products to identify emerging technology solutions that reduce costs, increase efficiency, and provide better distribution and scalability capabilities
- Ran proofs of concept for highly available, fully fault-tolerant, flexibly modeled, and scalable solutions using NoSQL (MongoDB, Cassandra) and Big Data (Hadoop) technologies
- Planned the replacement of the TIBCO EMS messaging solution with Apache Kafka and Storm for real-time streaming computation
- Applied a strong understanding of operating system models (Linux), hardware architectures, distributed computing architectures, storage technology, server virtualization, memory management, networking, system maintenance, disaster recovery, database failover models, and database theory and design
- Hosted an architecture and design working group to set up a governance process for future projects, covering application, infrastructure, data, technology, tooling, security, regulatory, and requirements reviews
- Created best practices and process standards for development environments, continuous build and testing, peer code review, mandatory unit testing, standardized problem tracking, nightly integration, and nightly code-quality and testing reports
- Worked with technology subject-matter experts to document design patterns and best practices for messaging, JVM GC settings, distributed caching, NoSQL read and write concerns, data persistence, and exception handling
- Implemented with Linux, Cassandra, MongoDB, Hadoop, Kafka, Storm, Spark, Java JVM 1.7 G1GC
- Conducted an open-source server technology proof of concept to replace the current Tibco EMS and Gemfire distributed cache
- Consumed upstream external data sources into an internal Apache Kafka cluster serving as the central data backbone, without the overhead of a messaging broker tracking per-message delivery
- Used the open-source Storm-Kafka spout to consume the Kafka message stream and track message offsets, feeding the input stream to the distributed real-time position and cash-balance computation bolts
- Configured the topology with fields grouping on the "account" field so that all messages for an account route to the same computation task, eliminating the distributed Gemfire caching repository
- Designed a generic margin calculation framework defining standard calculation inputs and outputs, load balancing, input validation, exchange data dependencies, abstract calculation-specific adapters, and tracking and monitoring of calculation requests
- Coordinated and integrated Equity/Futures/Option/Derivative cash payments with risk approval, portfolio accounting transactions, the treasury funding system, and end-of-day reconciliation of positions and balances
- Integrated and consumed multiple data sources spanning firm-wide client, account, and market reference data, pricing, eligibility, calculation, margin requirement, collateral allocation, asset movement, call management, and settlement components
- Developed margin payment approval processing with maker/checker controls, integrated with back-office wire transfer
- Developed intraday transaction update processing using a distributed cache to maintain current position balances, providing intraday changes to support margin payment proof processing
- Produced margin results into the margin operations data store; extracted cleared derivative contracts to the firm-wide data warehouse to generate regulatory EMIR reports
- Implemented indexing and searching with Apache Lucene integrated with RESTful service API.
- Conducted proofs of concept addressing high availability, scalable partitioning, and hot-hot operation, introducing NoSQL technologies (MongoDB, Cassandra), distributed caching, and snapshot data-set context management
- Implemented with Linux, Gemfire Distributed Cache, Tibco EMS Messaging, JSON Web Service, Oracle, Spring, HTML
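The generic margin calculation framework above (standard inputs/outputs, input validation, abstract calculation-specific adapters) might be sketched like this; the request/result shapes and the 10%-of-gross-exposure calculator are hypothetical illustrations, not the firm's actual model:

```java
import java.math.BigDecimal;
import java.util.List;

public class MarginFramework {
    // Standard calculation input (hypothetical minimal shape)
    record MarginRequest(String accountId, List<BigDecimal> positionValues) {}
    // Standard calculation output
    record MarginResult(String accountId, BigDecimal requirement) {}

    // Abstract calculation-specific adapter: each margin methodology plugs in here
    interface MarginCalculator {
        MarginResult calculate(MarginRequest request);
    }

    // Framework entry point: validates the input, then delegates to the adapter
    static MarginResult run(MarginCalculator calculator, MarginRequest request) {
        if (request.accountId() == null || request.positionValues().isEmpty()) {
            throw new IllegalArgumentException("invalid margin request");
        }
        return calculator.calculate(request);
    }

    // Illustrative adapter: margin = 10% of gross absolute exposure
    static final MarginCalculator GROSS_10_PCT = req -> {
        BigDecimal gross = req.positionValues().stream()
            .map(BigDecimal::abs)
            .reduce(BigDecimal.ZERO, BigDecimal::add);
        return new MarginResult(req.accountId(), gross.multiply(new BigDecimal("0.10")));
    };

    public static void main(String[] args) {
        MarginRequest req = new MarginRequest("ACCT-1",
            List.of(new BigDecimal("1000"), new BigDecimal("-500")));
        System.out.println(run(GROSS_10_PCT, req).requirement()); // prints 150.00
    }
}
```

The point of the adapter interface is that load balancing, validation, and request tracking live once in the framework while each product-specific calculation varies independently.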
Confidential
Senior Application Developer
Responsibilities:
- Designed and developed a risk analysis control system to determine the extension of credit to Confidential clearing clients, ensuring that clearing accounts are sufficiently capitalized to survive a wide range of possible market conditions
- Calculated theoretical profit and loss on client portfolios across 200+ stressed market conditions, historical events, and statistical analyses, with ongoing adaptation to changing market conditions
- Utilized distributed caching to store temporal data, partitioning data based on interest lists for concurrent processing
- Implemented data loading/staging/processing for client/account/position/market/pricing data with data-availability events
- Implemented with WebLogic Application Server and JMS Messaging, Java, Oracle, Gemfire Distributed Caching, XML/JSON Web Services, and Spring
- Designed and developed a real-time message distribution environment handling message distribution, flow control, replay, hot-hot operation, and virtual pub/sub message consumption, with a centralized monitoring and alerting runtime platform
- Implemented messaging service platform for event driven shared service across multiple applications
- Implemented with Tibco EMS Messaging, Gemfire Distributed Cache, Java
- Designed a data and reporting shared service supporting the entire prime brokerage application suite: once an application finished processing and produced results, data was persisted to the data warehouse through an ETL layer, an event triggered report generation, and reports were published to the distribution service and persisted in the client folder, ready for download and viewing
- Implemented with Weblogic Application Server and Messaging, Java, Oracle, XML/JSON Web Service, HTML, Flex
- Designed an end-of-day, end-of-region, and on-demand scheduling management framework to handle application requirements for pipeline processing, process and data affinity, unit-of-work isolation, process- and thread-level grid execution, and integrated exception and alert handling; the main design focus was batch processing with a scalable, high-throughput execution framework
- Implemented with Weblogic Application Server and Messaging, Java, Oracle, XML/JSON Web Service, Spring
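The data-affinity and unit-of-work-isolation ideas in the scheduling framework above can be sketched with keyed single-thread executors, one common way to get per-key affinity; the class and key names here are illustrative, not from the original system:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AffinityExecutor {
    // One single-thread lane per affinity key: units of work with the same key
    // run sequentially on the same thread (data affinity, unit-of-work isolation),
    // while different keys run concurrently across the grid.
    private final ConcurrentHashMap<String, ExecutorService> lanes = new ConcurrentHashMap<>();

    public Future<?> submit(String key, Runnable unitOfWork) {
        return lanes
            .computeIfAbsent(key, k -> Executors.newSingleThreadExecutor())
            .submit(unitOfWork);
    }

    public void shutdown() {
        lanes.values().forEach(ExecutorService::shutdown);
    }

    public static void main(String[] args) throws Exception {
        AffinityExecutor grid = new AffinityExecutor();
        ConcurrentLinkedQueue<String> log = new ConcurrentLinkedQueue<>();
        Future<?> last = null;
        for (int i = 1; i <= 3; i++) {
            final int n = i;
            last = grid.submit("ACCT-A", () -> log.add("ACCT-A:" + n));
        }
        last.get(); // each lane is FIFO, so all three units have completed
        System.out.println(log); // prints [ACCT-A:1, ACCT-A:2, ACCT-A:3]
        grid.shutdown();
    }
}
```

Because each lane is a FIFO single-thread executor, in-order processing per key comes for free, while throughput scales with the number of distinct keys.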