- Extensive hands-on experience across the full project lifecycle, working in various capacities as a senior/lead developer delivering high-quality, mission-critical, innovative and cost-effective products and processes for startups and large global organizations using a wide range of technologies
- 9+ years of experience developing real-time analytics, data pipelines (data ingestion, cleaning, enrichment) and prediction engines using Java, Akka, Apache Kafka, Storm, Flink, Sqoop, Flume, Spark Streaming, Spark MLlib, Scikit-learn, Cassandra, MongoDB, the Cloudera/HDP stack (HDFS, HBase, Impala, MapReduce, YARN, Oozie, Pig, Hive, Kerberos/SASL) and Elasticsearch
- 4+ years of experience building big data grid solutions using the NoSQL databases Cassandra and MongoDB
- 6+ years of experience implementing SOA solutions using various B2B, XML, web services and SOA design patterns and technologies; used Oracle SOA 11g on multiple projects
- 5+ years of experience building APIs (RESTful and SOAP services), including over 2.5 years creating gateway services on the Apigee Edge platform
- 3+ years of experience building microservices using 12-factor app principles and Event Sourcing/CQRS patterns
- 3+ years of experience developing and deploying with Docker/Swarm/Kubernetes/Mesos and CI/CD tools for various types of complex pipelines
- 2+ years of experience developing secure applications using HP Fortify and WhiteHat security products, following the OWASP Top 10 list
- 8+ years of experience using Agile/Iterative/XP/Scrum
- 5+ years of experience using Cloudera and Hortonworks Hadoop components
- 6+ years of experience working with remote teams in different time zones
- 7+ years of experience working with large-scale distributed big data platforms
- Excellent leadership, communication, analytical, technical and problem-solving skills
- Builds relationships and credibility and establishes rapport with stakeholders, including those external to the organization
Languages: Java, C++, C, C#, Scala, Python
NoSQL: Cassandra, MongoDB, HBase, Redis
Big Data: Cloudera and HDP Stack (HDFS, Hive, MapReduce, YARN, Impala, Oozie, Pig, HBase)
Analytics/Streaming/Workflow: Apache Storm, Spark Streaming, Flink, Kafka Streams, Kafka, Pig, Oozie, Spark MLlib, Scikit-learn, NiFi, Informatica, Talend
Cloud Computing: AWS (EC2, ECR, EMR, Kinesis, Elasticsearch, Lambda, S3, RDS) and the Google App Engine stack
Caching: EhCache, Terracotta, Big Memory, Coherence Cache
OSS Framework: Spring modules, Spring Cloud, Netflix OSS, OSGi, Hibernate, Guava, Apache and Google Libraries
Databases: MySQL, Oracle, Sybase, SQL Server, DB2, Postgres
CI/CD: Git, Maven, Gradle, SVN, Jenkins, Bitbucket
Search: Solr, Lucene, Elasticsearch
Logging/Monitoring: Splunk, ELK, Nagios
Middleware: J2EE, JMS, EJB, IBM MQ, AMQP, Sonic ESB, OpenJMS, Sonic MQ, Tibco RV, Oracle SOA 11g, Oracle Golden Gate, Solace ESB
Methodologies: Agile/XP/Scrum, Waterfall, Operational Excellence and Six Sigma, TDD
Design and Architecture: Rational Rose, Visio, UML, Design Patterns, SOA, BPM, EDI, TOGAF
Java/Big Data ENGINEER
- Architected, designed and developed multiple complex message-flow pipelines and ETL processes with self-provisioning and integrated authentication and authorization, ingesting data from any source across internal systems and external partner systems and routing it to multiple destination systems, data lakes and monitoring systems
- Involved in development using Java 1.8, Python, Akka, Kafka, the Cloudera stack, HBase, Hive, Impala, Pig, NiFi, Spark, Spark Streaming, Elasticsearch, Logstash, Kibana, JAX-RS, Spring, Hibernate, Apache Camel, RESTful APIs, JSON, JAXB, XML, WSDL, ZooKeeper, Cassandra, MongoDB, HDFS, ELK/Splunk, Docker, Kubernetes, CI/CD, Zipkin, Sleuth, Prometheus, Sqoop, Flume and Oozie
- Applied OWASP rules to create secure applications by integrating with the WhiteHat and Fortify security products
- Automated deployment and configuration management using Ansible
- Worked on creating batch Spark jobs with the Oozie workflow engine for periodic analytics and ingestion into data lakes
- Worked on machine learning pipelines using PySpark, Spark ML, Pandas, Scikit-learn and NumPy to create and train models used in the prediction engine; also used Matplotlib, Seaborn and Talend for visualization and analysis
- Involved in the installation, configuration and tuning of Cloudera Hadoop ecosystem
- Designed and developed caching layer using Oracle Coherence and Cassandra NoSQL database
- Designed column families and N-node Cassandra clusters across multiple data centers
- Developed API proxies and products using the Apigee platform
- Developed operational analytics, financial analytics, model building and enrichment, and prediction engines for both batch and real-time processing using Java, Storm, Kafka, Akka, Spark MLlib and Scikit-learn
- Responsible for performance tuning and scaling the middle tier to multiple data centers to serve over 200 million users.
- Applied OWASP rules to create secure applications by integrating with the WhiteHat security product; also used Volt for encryption/decryption
- Developed and deployed various microservices on AWS (EC2, S3, Kinesis, Lambda, CloudWatch, CloudTrail, VPC, DynamoDB, EMR) and automated platform deployment with a custom framework based on Chef and Puppet, following 12-factor app patterns
- Developed Docker-containerized microservices using Spring Boot/Spring Cloud and deployed them on a Kubernetes cluster
- Involved in design and development of an operational data store and in-memory data grid storing over 3 TB of data gathered from over 130 different applications and served as a consolidated API to various consumer-facing websites and B2B applications
- Designed and developed a grid data distribution network and routing infrastructure (according to SLA) serving over 5 million messages per second across 5 data centers, using a pluggable connector architecture supporting GoldenGate, Connect:Direct, SFTP, JMS, Oracle Database, SQL Server, MySQL, Postgres and JDBC data capture protocols
- Designed and developed a generic data grid framework that gathers data from various data sources
- Designed and developed services to persist and read data from Hadoop (HDFS, Hive) and wrote Java-based MapReduce batch jobs on the Hortonworks Data Platform
- Developed monitoring, deployment and performance-gathering modules using Scala and the Akka library
- Created SOAP and RESTful services using OWASP security patterns on the J2EE platform
- Integration services with internal and external vendors/partners using SOA design patterns
- Automated the build and deployment process for multiple systems using SVN/Maven/AnthillPro; integrated with Sonar
- Creation of data aggregation and pipelining using Kafka and Storm
- Stored user profiles and other unstructured data using Java and MongoDB
- Developed core modules for user analytics, prediction engine using Scala.
- Integrated Elasticsearch, Logstash and Kibana for live log analysis
- Developed and deployed streaming and analytics workloads on Amazon Cloud
- Integrated and used various open-source dev and ops tools/libraries from Netflix (Simian Army, RxJava, Hystrix, Asgard) and LinkedIn (Kafka)
- Involved in the installation, configuration and tuning of Hortonworks Hadoop ecosystem
Confidential, LA, CA
LEAD Java ENGINEER
- The Factory Configurator solution is a strategic enterprise initiative that enables configurator administrators across different groups and regions to build vehicle configurations through a flexible, maintainable system that simplifies defining and publishing configurations and their associated attributes and images. Mentored and led technical and QA team members.
- Led solution design and development for Toyota Motor Sales’ enterprise-wide vehicle configurator web application covering the Toyota, Lexus and Scion brands
- Designed and developed various integration points with Toyota’s existing legacy services such as dealer locator, inventory and factory IBM mainframe data, LiveSite marketing data, TFS financial services, Oracle OAM security, Oracle OVD, and various enterprise-wide and consumer-facing applications
- Worked on web and middle tier performance tuning and JVM sizing
- Developed factory configuration storage engine using MongoDB
- Developed processing models and workflows using Lombardi and Sonic ESB
- Developed and deployed image-creation services on AWS Cloud
- Developed RESTful services using Scala and Akka to provide the factory configuration data
- Applied OWASP rules to create secure applications by integrating with the HP Fortify security product
Confidential, Los Angeles, CA
Technical LEAD /ARCHITECT
- Responsible for architecture, design and development of an enterprise-wide advanced trading platform (OMS and EMS) for trading a fixed income and derivatives suite of products, including the application middle-tier services for trade capturing, position management, trading strategies, portfolio compliance, and trade and risk analytics
- Implemented an SOA architecture to integrate internal front-office and back-office systems with external vendor applications
- Responsible for design and development of EMS connectivity to vendors such as Bloomberg, Lehman, Tradeweb and MarketAxess using the FIX 4.4 protocol and the Cameron FIX engine
- Development and deployment of Analytics engine on Amazon Cloud
- Used Java 1.7, J2EE, Spring, EclipseLink JPA, JAXB, EJB, JMS, SOAP 1.3, SOA, JAX-WS, JAX-RS, web services, WebLogic 10.3.4, Sybase, Eclipse 4.1, Unix, SonicMQ, SonicESB, Tangosol, Apache, Tomcat, EhCache, Terracotta, GridGain, JSP, Struts, Servlets, JUnit, Java Mail, Ant, Maven, SVN, Swing, XML, Java Web Start, JDBC, JIDE, Hibernate, AnthillPro
Confidential, Los Angeles, CA
Senior DEVELOPER (CONSULTANT)
- Developed trading and hedging applications using C#, WinForms, MS SQL Server and the .NET platform
Confidential, Los Angeles, CA
Senior JAVA DEVELOPER (CONSULTANT)
- Responsible for development of an enterprise-wide advanced trading platform (OMS and EMS) for trading fixed income assets
- The system supports both pre-trade and post-trade compliance (using Charles River CRD) and order management (trade capture, execution, enrichment and validation)
- Used Java 1.6, J2EE, Spring, EclipseLink JPA, JAXB, EJB, JMS, SOAP 1.3, SOA, JAX-WS, JAX-RS, web services, WebLogic 10.3.4, Sybase, Eclipse 4.1, Unix, SonicMQ, SonicESB, Tangosol, Apache, Tomcat, EhCache, Terracotta, GridGain, JSP, Struts, Servlets, JUnit, Java Mail, Ant, Maven, SVN, Swing, XML, Java Web Start, JDBC, JIDE, Hibernate, AnthillPro
Confidential, Brea, CA
- Responsible for design and development of an enterprise-wide Order Processing, Accounting and Compliance Management System (OPAC). Solution architected with a Java Swing front end and a J2EE middle layer, including EJB, XML, JMS and Tibco RV, on WebLogic Application Server (5 and 6.1) with Sybase.