
Senior Developer & Architect Resume


SUMMARY:

  • Completely hands-on expert in Core Java, JEE & Big Data technologies:
  • Core Java, Multithreading, Hadoop, Cloudera, Hortonworks, Hazelcast, Gemfire, Spring, Hibernate, JMS, EJB, Web Services, Servlets, JSP, XML, XSLT, JDBC, Swing
  • Expertise in designing low-latency, high-intensity, multi-threaded streaming applications using:
  • Spark: Implemented Spark streams, transformations, actions for Market Risk analytics
  • Spark-SQL: Experience fetching data from Hive tables, Parquet & JSON files into DataFrames and efficiently using SQL queries to filter and process data in RDDs
  • HBase: used it to upsert & read terabytes of transactional & reference data
  • Sqoop: used it to transfer Risk data from Oracle into HDFS
  • Hive: partitioned Market Risk data by date; designed & implemented both internal and external Hive tables to optimize performance
  • Cloudera & Hortonworks: used these distributions to administer Hadoop ecosystems
  • Hazelcast: used it to achieve extreme performance on medium-sized pre-calculated datasets with a 100% in-memory solution
  • MongoDB: used it to achieve better performance on large document datasets
  • Multithreading: used thread executors to parallelize querying, processing & staging of large data volumes for faster user analytical operations
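The executor-based parallelization described in the last bullet can be sketched as below. This is a minimal illustration of fanning independent query/processing tasks out to a fixed thread pool; the class and task names are hypothetical, not taken from any system described in this resume.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative sketch: fan out independent "query" tasks to a fixed
// thread pool and collect results in submission order.
public class ParallelQuery {
    public static List<Integer> runAll(List<Callable<Integer>> tasks, int threads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Integer> results = new ArrayList<>();
            // invokeAll blocks until every task completes; futures come
            // back in the same order the tasks were submitted
            for (Future<Integer> f : pool.invokeAll(tasks)) {
                results.add(f.get());
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<Callable<Integer>> tasks = new ArrayList<>();
        for (int i = 1; i <= 4; i++) {
            final int n = i;
            tasks.add(() -> n * n); // stand-in for a real query/staging step
        }
        System.out.println(runAll(tasks, 2)); // [1, 4, 9, 16]
    }
}
```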

DATABASES, SERVERS & OTHER TOOLS:

Databases: Oracle 11g and DB2

App servers: Websphere 8.0 and Weblogic 12

Tools: JProfiler, Eclipse, Rational Rose, Enterprise Architect, TLM, MS Project, Visio

Financials: TLM, SWIFT, FpML & ProtoBuf

Domain Experience: Credit Risk, Market Risk, Cash Management, OTC Derivatives (CDS, Interest Rates & Equity), automated high-frequency electronic trading development for the Forex (FX) market using MACD & RSI algorithms
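The MACD indicator mentioned above (fast EMA minus slow EMA) can be sketched as follows. The 12/26-period parameters are the conventional textbook defaults, assumed here for illustration; they are not necessarily the values used in the trading system referred to in this resume.

```java
// Hedged sketch of the MACD line: EMA(fast) - EMA(slow) over a price series.
public class Macd {
    // Exponential moving average, seeded with the first observation.
    static double[] ema(double[] prices, int period) {
        double alpha = 2.0 / (period + 1);
        double[] out = new double[prices.length];
        out[0] = prices[0];
        for (int i = 1; i < prices.length; i++) {
            out[i] = alpha * prices[i] + (1 - alpha) * out[i - 1];
        }
        return out;
    }

    // MACD line: fast EMA reacts sooner than the slow EMA, so the
    // difference turns positive in an uptrend and negative in a downtrend.
    static double[] macd(double[] prices, int fast, int slow) {
        double[] f = ema(prices, fast), s = ema(prices, slow);
        double[] out = new double[prices.length];
        for (int i = 0; i < prices.length; i++) out[i] = f[i] - s[i];
        return out;
    }

    public static void main(String[] args) {
        double[] m = macd(new double[]{1, 2, 3, 4, 5}, 12, 26);
        System.out.printf("MACD[4] = %.4f%n", m[4]);
    }
}
```

A trading rule would typically compare this MACD line against a 9-period EMA signal line; that step is omitted here for brevity.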

PROFESSIONAL EXPERIENCE:

Confidential

Senior Developer & Architect

Responsibilities:

  • Developed a Step framework to load data into HBase via MapReduce as well as by sequential load, after the required enrichment is done
  • Developed Spark scripts to perform various aggregations for Analytical & Reporting systems
  • Fine-tuned HBase salting & splitting for better region distribution across RegionServers to achieve better performance
  • Assessed the application’s performance and suggested the required cluster adjustments
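The salting mentioned above prepends a bucket prefix derived from a stable hash of the row key, so monotonically increasing keys spread across pre-split regions instead of hot-spotting one RegionServer. A minimal sketch of the key transformation, where the bucket count (16) and key format are illustrative assumptions:

```java
// Minimal sketch of row-key salting for HBase-style region spreading.
// The bucket count and "NN-key" format are illustrative, not from the
// actual system described in this resume.
public class SaltedKey {
    static final int BUCKETS = 16; // should match the number of pre-splits

    static String salt(String rowKey) {
        // floorMod avoids the negative-bucket pitfall of hashCode() % n
        int bucket = Math.floorMod(rowKey.hashCode(), BUCKETS);
        // zero-padded prefix keeps lexicographic ordering within a bucket
        return String.format("%02d-%s", bucket, rowKey);
    }

    public static void main(String[] args) {
        System.out.println(salt("trade-0001"));
    }
}
```

Reads must fan out one scan per bucket (stripping the prefix on the way back), which is the usual trade-off of salting.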

TECHNOLOGIES USED:

Java & JEE: Java 8, HBase, Phoenix, Spark, Hadoop, Google Protobuf, REST API, Multithreading, JUnit

Servers & Tools: Linux, Eclipse, Maven, Git, Microsoft Project, JProfiler, Visio

Confidential

Senior Developer & Architect

Responsibilities:

  • Data for the 9 most frequently used days is persisted in Spark RDDs in memory, with pre-calculated summary & trade levels, resulting in high performance (5 to 10 sec responses)
  • The remaining historical days are processed on demand using Spark RDDs over Hive/HDFS, giving reasonable response times (20 to 30 secs) while avoiding huge in-memory clusters
  • Environment configuration: 12 nodes × 64 GB RAM, each with 12 × 256 GB of disk space

TECHNOLOGIES USED:

Java & JEE: Java 7, Cloudera CDH 5.7: Hadoop (Spark & Spark-SQL with Hive), Spring REST API, JDBC, Multithreading, Hazelcast & JUnit

Servers: Hadoop Cluster, Weblogic 12, Linux, Oracle 11g

Confidential

Senior Developer & Architect

Responsibilities:

  • Designed a scalable, low-latency, multi-threaded analytical application framework
  • Defined interfaces for communication between the UI and the Analytics application, as well as with external Transfer Pricing, Reference Data & Market Data systems
  • Built an API for MongoDB access and set up MongoDB as a 256 GB in-memory cache across 4 nodes

TECHNOLOGIES USED:

Java & JEE: Java 7, Spring, Hibernate, MongoDB, EMS (TIBCO), JMS, Multithreading, Gemfire

Deployment: Weblogic 12, Linux with Oracle 11g

Confidential, New York

Technical Architect & Senior Developer

Responsibilities:

  • Analyzed & designed a common interface to receive feeds from various upstream systems, and coordinated its implementation across teams
  • Designed a multi-threaded Gateway to process incoming feeds from different systems: validate, parse, stage in Gemfire and feed to TLM
  • Fine-tuned application performance by tuning Gemfire, the thread & connection counts, and the heap size

TECHNOLOGIES USED:

Java & JEE: Java 6, Spring, Hibernate, Gemfire, JMS, Struts, Multithreading, JUnit

Deployment: Weblogic 9.3, Linux with Oracle 11g
