
Senior Hadoop Developer Resume Profile

New York, US

Professional Profile

  • 14 years of extensive experience in Information Technology, covering all aspects of the SDLC
  • Hands-on experience with, and a passion for, Big Data / Hadoop ecosystem design, planning, cluster setup, analysis, development and administration using HDFS, MapReduce, YARN, Weave, HBase, Hive, Pig, MongoDB, Hazelcast, Mahout, Storm, Flume, Sqoop, Oozie, ZooKeeper, Ganglia, Ambari and RHadoop packages, from open source to commercial distributions such as Apache, IBM BigInsights, Cloudera and Hortonworks
  • A strong combination of Big Data/Hadoop, Java (JVM, core, multi-threading), database, J2EE (JMS, web services, Spring, Hibernate), scripting-language and BI analysis experience as a Data Scientist and Evangelist
  • Expertise in developing high-traffic, low-latency, high-throughput systems, cloud computing, distributed systems and caching, Java components, server-side components and application migration using Java (core, multithreading, collections and concurrency), databases (Oracle, MS SQL, Sybase and DB2), JMS, web services (SOAP and REST), Spring and Hibernate frameworks, data structures, algorithms, object-oriented design concepts and Agile methodologies
  • Projects executed for Financial, Investment and Private Banking, Securities, Stock Exchange, Insurance, Telecom and Networking domains
  • Constantly learns and leverages emerging technologies; adapts readily to new environments; a self-motivated problem solver and team player with excellent interpersonal, technical and communication skills
  • Contributor who performs confidently and effectively under pressure and thrives on challenge


  • Big Data / Hadoop Core, Data and Operational Services: Hadoop framework (HDFS, MapReduce and YARN), Weave, HBase, Hive, HCatalog, Pig, MongoDB, Hazelcast, Flume, Sqoop, Oozie, ZooKeeper, Ganglia, GIT, Ambari, RHadoop packages, Mahout (machine learning), and cluster planning, design, installation, setup and administration
  • Big Data Infrastructure: datacenter, local and cloud (internal/external) design, planning, installation, setup and administration
  • Programming Languages: Java (core, multi-threading, collections and concurrency), J2EE (JSP, Servlet, EJB, JMS, MQ), Apache POI, Ajax, DHTML, HTML and C
  • Web Technologies / Frameworks: Spring (Core, AOP, ORM and MVC), Struts, Tiles, JSF, Securant, SiteMinder, Hibernate, Web Services (SOAP/REST), ActiveWidgets, Control-M, SOA and Java/J2EE design patterns
  • Application / Portal Servers: WebSphere and Oracle WebLogic portal and application servers, ILOG Rules Engine, JBoss and Apache Tomcat
  • Development/Productivity Tools / OS: SpringSource Tool, WebLogic Workspace Studio 1.1, Eclipse, WebSphere Studio Application Developer, Test Director, Windows XP, UNIX, Linux and AIX
  • Databases: Oracle 10g/11g (PL/SQL and stored procedures), DB2, MS SQL Server, MySQL and Sybase
  • Methodologies/Tools/Technologies: Ant, Maven, PVCS, GIT, SVN, CVS, Clear Case, ERWin, Eclipse, Autosys, JUnit, MS Visual SourceSafe, Cruise Control, Toad, Rational Rose and UML

Professional Experience


Senior Hadoop Developer

Confidential is built as a next-generation system to simplify, standardize, and enhance the existing technical componentry (the COMET, ACE, IFRS, LVR, GEMS, STAR, Mercury and Reporting applications) within the Financial Accounting in-house calculation and reporting platform, using Big Data technologies. The migrated FinCE platform can handle 5x more data volume and run 20x faster than the current system for data collection, extraction, calculation, analysis and reporting of financial and regulatory requirements. It also reduces complexity, implementation and RTB costs, operational risk and time-to-market, while better supporting future regulatory change and facilitating IT strategy. The FinCE Computation and Reporting Engine is a powerful platform providing highly scalable, low-cost data storage tightly integrated with scalable processing. The engine helps the bank collect and analyze data in order to accurately assess risk and market trends, along with understanding the implications of massive capital leverage and the bank's ability to model and refine liquidity management. Before migration, the current system receives data for processing from different upstream sources (Opera v3, Opera DAL and Finance Gateway), stages the data in the STAR data warehouse, processes it in different calculation engines (COMET, ACE, IFRS, LVR, GEMS and Mercury), and sends the results back to the data warehouse and data mart for regulatory reporting (Group Reg, Asia, US and EUR reg) and for financial liquidity, disclosures, consolidation and federal reporting requirements. Confidential is being migrated from the existing system use case by use case under one umbrella: the Big Data / Hadoop ecosystem.

  • Reviewed functional and non-functional requirements
  • Understood the existing system to develop a migration plan to Hadoop
  • Designed, developed, tested and deployed the new system in the Hadoop environment
  • Imported and exported data between relational databases and HDFS, Hive and HBase using Sqoop
  • Extracted files from HBase and placed them in HDFS/Hive for processing
  • Exported the analyzed data to the relational database / data warehouse and used RHadoop and OBIEE for data visualization and report generation
  • Used Hive and Pig to analyze data from HDFS
  • Loaded and transformed large sets of structured, semi-structured and unstructured data
  • Installed, configured and maintained Hadoop clusters for application development (local and internal cloud)
  • Designed and implemented datacenter infrastructure for the cluster: hardware racks and servers, network switches, and power and cooling requirements
  • Designed scheduled workflows for recurring jobs using the Oozie coordinator
  • Loaded log data into HDFS using Flume
  • Developed MapReduce jobs to implement COMET batch processing for all regions using HDFS, HBase and Hive data
  • Set up, installed and configured development and test environments with Hadoop Core, HDFS, MapReduce, YARN, HBase, MongoDB, Hive, Pig, ZooKeeper, Oozie, Ambari, Java and all other major components used in the project
  • Designed and integrated backend real-time performance metric system using Ganglia
  • Performed cluster sizing, design, installation, configuration, monitoring, troubleshooting, security, backup, resizing (addition/deletion of nodes), performance monitoring and fine-tuning
  • As part of the COE and Evangelist team, attended internal/external training followed by hands-on POCs to master the new ecosystem technologies
  • Actively participated in internal meetings and sessions with other project teams to understand requirements and suggest suitable Hadoop stack technologies, along with training sessions
  • Explored different machine learning approaches (Mahout) and declarative and predictive analysis tools, along with data visualization using RHadoop packages (RHadoop, RHbase, etc.), for the next generation of Business Intelligence
  • Set up an Enterprise Data Hub to serve different processing needs: batch processing, real-time/stream processing, search, structured/semi-structured/unstructured data processing, machine learning, statistical/predictive analysis, and data visualization/reporting
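As context for the MapReduce work above, the map/reduce pattern can be sketched in plain Java without any Hadoop dependency (class and method names here are illustrative, not from the project):

```java
import java.util.*;

// Minimal map/reduce-style word count, mirroring the shape of a Hadoop
// MapReduce job (map emits key/value pairs, reduce aggregates per key)
// without any Hadoop dependency. Illustrative only.
public class WordCountSketch {

    // "Map" phase: emit a (word, 1) pair for each token in a line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(token, 1));
            }
        }
        return pairs;
    }

    // "Reduce" phase: sum the counts for each key.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : List.of("hadoop hive pig", "hive hadoop")) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(pairs)); // {hadoop=2, hive=2, pig=1}
    }
}
```

In a real Hadoop job the same two phases are expressed as `Mapper` and `Reducer` subclasses, with the framework handling the shuffle between them.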


  • Core Services: Hadoop, HDFS, MapReduce and YARN
  • Data Services: HBase, Hive, HCatalog, Pig, Sqoop, Flume and Solr
  • Analytics / Visualization: R, RHadoop packages (RHadoop, RHbase, RHDFS, etc.) and OBIEE
  • Operational Services: Oozie, ZooKeeper, Avro, Hazelcast, Ambari and Ganglia
  • Cluster Design and Setup: datacenter and internal cloud cluster design and setup
  • J2EE / DB Technologies: Java (JVM, core, multi-threading, collections and concurrency), Oracle SQL and stored procedures, Unix shell scripting, Thrift, REST, JSON, data warehouse, data mart, OBIEE, JMS, web services, in-house frameworks, GIT and SVN

Senior Software Engineer

Confidential are recommendations on banking laws and regulations issued by the Basel Committee on Banking Supervision. The purpose is to create an international standard that banking regulators can use when creating regulations about how much capital banks need to set aside to guard against the types of financial and operational risk banks face. COMET was designed to meet this regulatory requirement and to calculate accurate capital adequacy for the Credit Suisse Group for the EBK and FSA regulatory authorities using different approaches. Three major services are involved in COMET core processing: the Level 1 Trial Balance Trade Enrichment Service, Construct Creation, and Aggregation and Capital Calculation, which together calculate accurate capital adequacy for different products (OTC and ET derivatives, EPE and PE feeds, etc.) of Fixed Income, Loans, Equities, etc., using Basel rules such as Standardized Confidential. The Confidential application consists of independent, loosely coupled components in a service-oriented architecture, such as a control framework that divides the large volume of trades into smaller chunks and a multithreaded framework that processes these smaller chunks simultaneously across all 24 server instances in the cluster. The COMET architecture uses techniques including threading, clustering, caching, in-memory processing, a rules engine, and control, batch, calculation, web report, security and database frameworks, plus swap partitioning. COMET also interacts with different upstream and downstream systems as part of the process.

  • Involved in understanding the COMET system architecture (SOA) and framework implementation (control, batch, calculation, web report, security, rules engine, database and swap partition frameworks), and discussed the Standardized CVA calculation functional requirements with Business Analysts to implement CVA rules for the FINMA region
  • Worked on design (Visio), technical specification documents, development (Java core, multi-threading, ActiveWidgets, JSP, Spring, Hibernate, Oracle, data warehouse, data mart, Control-M with COMET in-house frameworks), testing and testing support for the Standardized CVA FINMA-region implementation for Basel II and Basel III
  • Involved in development and testing of the CCP (Central Counterparty) application using Java core, multi-threading, ActiveWidgets, JSP, Spring, Hibernate, Oracle, the rules engine, and Control-M with COMET in-house frameworks
  • Involved in quarterly planning discussions for all FSA-region requirements with Business Analysts, the Project Manager, and Design and Architecture teams from various regions
  • Completed design, upstream and downstream interface agreements (STAR, data warehouse, data mart, OBIEE, AXIOM, etc.) and technical specifications for Standardized CVA for FSA, Maturity Capping, and the CoRep Report for FSA, involving different teams from across the world
  • Contributed to development and testing for all FSA projects, including COMET core processing (Level 1 Trial Balance Trade Enrichment Service, Construct Creation, Aggregation and Capital Calculation), working with the Development Team in Singapore, Testing Team in Pune, Management Team in London, and Design and Architecture in New York
  • Along with design and development of various applications, involved in UAT and production support of the COMET application during New York hours, analyzing issues and providing fixes while working with Business Users and the Production Support Team
  • Successfully completed infrastructure planning, design and implementation of a new production-like test environment ("Tech Testing") for the Basel III application
  • Planned, designed, set up (Control-M, Java, Oracle, WebLogic servers and Unix) and maintained the Capital Scenario Analysis (CSA) environment for Business Users to analyze production data and calculation results for accuracy
  • Tuned COMET application performance (Java object pooling, threading, rules, Oracle and COMET batch framework changes) to meet each region's SLA as its data volume grows
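The chunked, multithreaded batch pattern described for COMET above (splitting a large trade volume into smaller chunks and processing them in parallel) can be sketched with a standard `ExecutorService`; all names here are illustrative, not the actual in-house frameworks:

```java
import java.util.*;
import java.util.concurrent.*;

// Illustrative sketch of chunked parallel processing: partition a large
// work list into fixed-size chunks, process each chunk on a fixed
// thread pool, and combine the partial results.
public class ChunkProcessorSketch {

    // Split a list into fixed-size chunks (the last chunk may be smaller).
    static <T> List<List<T>> chunk(List<T> items, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < items.size(); i += size) {
            chunks.add(items.subList(i, Math.min(i + size, items.size())));
        }
        return chunks;
    }

    // Submit one task per chunk and sum the partial results.
    static long process(List<Integer> trades, int chunkSize, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Long>> futures = new ArrayList<>();
            for (List<Integer> c : chunk(trades, chunkSize)) {
                futures.add(pool.submit(
                        () -> c.stream().mapToLong(Integer::longValue).sum()));
            }
            long total = 0;
            for (Future<Long> f : futures) {
                try {
                    total += f.get(); // blocks until that chunk is done
                } catch (InterruptedException | ExecutionException e) {
                    throw new RuntimeException(e);
                }
            }
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        List<Integer> trades = new ArrayList<>();
        for (int i = 1; i <= 100; i++) trades.add(i);
        System.out.println(process(trades, 10, 4)); // 5050
    }
}
```

The real system distributes chunks across server instances rather than threads in one JVM, but the decompose/process/aggregate shape is the same.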

Environment: Java (core, multi-threading, collections and concurrency), Spring, Hibernate, Apache POI, Securant, SiteMinder, ActiveWidgets, JSP, Control-M, Oracle (SQL and stored procedures), SOA, web services, data warehouse, data mart, OBIEE, WebLogic Application Server, JMS, Log4j, SVN, Ant, in-house frameworks (control, batch, calculation, rules engine, web report, security, database and swap partition frameworks) and Unix.


Senior Software Engineer

Confidential Tool efficiently processes and supports Global Corporate Bank account opening, account maintenance, account closure and signer management requests submitted via eBAM (Electronic Bank Account Management) and non-Confidential channels. Citi primary business users, such as Operations Personnel, Account Managers, Relationship Managers, Client Service Personnel, OBT Administrators and Operations Managers, use this onboarding application to support their corporate clients across countries in different regions: North America, CEEMEA, Asia, Latam and Western Europe. OBT also supports sending alerts, mail and reports, document management, and workflow processing. OBT interacts with multiple internal systems (WPS, eForms and citiVault) to perform the above operations. In short, OBT lets clients electronically submit paper-authority account requests via eBAM and non-eBAM channels, reducing both the number of days required for the global corporate bank account opening cycle and its cost.

  • Started with analysis and understanding of the OBT system and architecture
  • Implemented all identified server-side modules with Java, Spring, Hibernate, Oracle, JMS and web services, passing messages to the front-end application written in .NET technologies
  • Involved in high-level and low-level design for the modules identified and planned for each month
  • Planned and set up the production-parallel test environment using WebSphere, Oracle and Unix
  • Contributed to production support and enhancement of the current system using Java core, server-side development, multi-threading, web services, Spring, Hibernate, Oracle and JMS
  • Participated in production release planning and support
  • Created support documentation

Environment: Java (core, multi-threading, collections and concurrency), Spring, Hibernate, Oracle (PL/SQL and stored procedures), web services, WPS, JMS, WebSphere Application Server, Log4j, Ant and Unix.


Senior Software Developer

Confidential Deal Management Tool is being developed to automate the response-management business process in Asset Servicing. DMT creates a centralized Asset Servicing response management and deal balancing application. DMT helps analysts, traders (CAAS users) and PB account managers use the system based on their roles and access rights.

  • The system is used to automate the corporate action business process. CARE processes the various types of corporate events announced by issuing companies. We enhanced many features in this application, retiring various components and replacing them with the latest components.
  • Started with analysis of the architecture, application design and components written using Java, Spring and Hibernate.
  • Subsystems used in the CARE application are CARE ADP, CARE SSE, CARE T24, PBW, GPA EPIC and RTTP (internal cloud computing). Developed new functionality and enhanced existing functionality using Java core, multi-threading and the collections framework, plus the open-source Spring and Hibernate frameworks along with JMS and web services.
  • Wrote batch jobs using Autosys.
  • Created build scripts and completed build-and-deploy setup for the Integration, QA and Production environments.
  • Created the DR infrastructure setup.
  • Created the DMT support reference document.

Environment: Java (core, multi-threading, collections and concurrency), Oracle (SQL and stored procedures), Sybase, Spring, Hibernate, JMS, web services, Cruise Control, Log4j, Ant, Autosys, Tomcat, Unix and an internal cloud computing system.


Senior Software Developer

The team upgrades all projects in the PL Internet Application domain, including Java/J2EE applications, portal applications, databases and test environments, from their current versions to the latest versions. The Tech Stab team upgrades current technologies to the latest versions, including Java 6, WebLogic Application and Portal Servers 10.3, Oracle Database 11i and all other related technologies mentioned below. The team plans to upgrade 15 applications in four phases. The first two phases cover all Java/J2EE applications built using core Java, multi-threading, collections, concurrency classes, J2EE components and open-source frameworks such as Spring, Struts and Hibernate. The last two phases cover all internet applications built on WebLogic Portal Server 8.1 and 10.2.

  • Started with analysis of the code-base upgrade from JDK 1.4 to JDK 1.6, along with components written using Java multi-threading and the collections framework.
  • Components include thread pooling, a reconciler, a scheduler and resource pooling across four applications. Developed new functionality and enhanced existing functionality using Java core, multi-threading and the collections framework, plus the open-source Spring framework.
  • As part of the upgrade, coordinated with developers, technical leads, QA leads, DBAs, the infrastructure manager, the release manager and application managers to plan all 15 project upgrades.
  • Created build scripts and completed build-and-deploy setup for the Integration, QA and Production environments.
  • Created an infrastructure configuration cookbook covering web server configuration, firewall configuration, portal/application domain creation, the Oracle database and application configuration in all environments, including Integration, QA and Production. Created project plans and estimates for the 15 projects, including weekly and monthly status reports.
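The resource-pooling components mentioned above follow a standard blocking-pool pattern, which can be sketched as follows (a minimal illustration, not the in-house implementation):

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal blocking resource pool: resources are checked out and returned
// through a blocking queue, so callers wait when the pool is exhausted
// instead of creating new resources. Illustrative only.
public class ResourcePoolSketch<T> {
    private final BlockingQueue<T> pool = new LinkedBlockingQueue<>();

    public ResourcePoolSketch(Iterable<T> resources) {
        for (T r : resources) pool.add(r);
    }

    // Borrow a resource, blocking until one is available.
    public T acquire() {
        try {
            return pool.take();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("interrupted while waiting for a resource", e);
        }
    }

    // Return a resource to the pool for the next caller.
    public void release(T resource) {
        pool.add(resource);
    }

    public int available() {
        return pool.size();
    }

    public static void main(String[] args) {
        ResourcePoolSketch<String> conns =
                new ResourcePoolSketch<>(List.of("conn-1", "conn-2"));
        String c = conns.acquire();
        System.out.println(c + " acquired, " + conns.available() + " left");
        conns.release(c);
    }
}
```

The same shape underlies JDBC connection pools and thread pools: a fixed set of expensive resources reused across many short-lived tasks.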

Environment: Java (core, multi-threading, collections and concurrency), Spring, Hibernate, WebLogic Portal, web services, Oracle 11i DB and Unix.


Senior Java Developer

  • Confidential: an enterprise call center (ECC) implementing its universal Siebel desktop application (Siebel 7) for customer service representatives (CSRs) to service various lines of business such as retail banking, home loans, etc.
  • Developed and tested new transactions in Java (core, multi-threading, collections) that interact with Siebel and back-end systems
  • Performed onsite coordination for functional enhancements, application transaction development, and delivery.
  • Performed application support for existing transactions in test environments.
  • Conducted daily and weekly status meetings to follow up on development issues and prepared monthly status report.

Environment: Java (core, multi-threading, collections and concurrency), Hibernate, Test Director, Oracle, Enterprise Java Beans (EJB), J2EE, Spring, Siebel System, Unix, Windows XP


Senior Software Engineer

  • The offshore maintenance team is called the RTB (Run the Business) team. The goal of this project is to provide maintenance support and enhancements for the Agency Gateway portal application, which we developed on WebSphere Portal. Functionality enhancements are implemented in four milestone deliveries.
  • Designed and developed portal application enhancements using core Java, multi-threading and the server-side and front-end technologies mentioned below, and unit tested the application functionality.

Environment: Java (core, multi-threading, collections and concurrency), WebSphere Portal Server 5.1, MySQL, IBM Rational Application Developer, Interwoven, J2EE, JSF, Web Services, ILOG Rules Engine, Training, HTML, XML, Ant, ClearCase, Agile, Unix, Windows XP


  • Migrated the Contact Us module from the Microsoft platform to the J2EE platform on the nationwide.com corporate site.
  • Designed and developed the application using core Java and the server-side and front-end technologies mentioned below, and performed unit testing. Also supported QA testing and the production release implementation.

Environment: Java, J2EE, NW Struts, Tiles, JSP, Ant, CVS, Oracle, Test Director, Documentum and Windows-XP


Senior Software Developer

  • Implemented the corporate site with Documentum for content management and NW Struts for the presentation layer. The corporate site is implemented in two languages, English and Spanish, with 900 static pages each, using the Confidential (Nationwide) Struts framework along with Tiles to control the front end and Documentum to maintain all static page content.
  • Designed and developed the application using Java (core, multithreading) and the server-side and front-end technologies mentioned below, and performed unit testing, QA test support and the production release implementation.

Environment: Java, Oracle 9i, J2EE, NW Struts, Tiles, JSP, Ant, CVS, Documentum and Windows XP


Lead Developer

  • State Street bought the Risk Reporting application from Algo; it was implemented for State Street's infrastructure, and business requirements were enhanced as part of this implementation.
  • Performed development using core Java and multithreading to enhance and link the Algo risk systems with the State Street application.
  • Infrastructure planning and implementation included setting up development, UAT, production and DR servers; configuring the Resin Application Server with the iPlanet Web Server and the application authentication mechanism; and implementing application, themes and skins enhancements.
  • Created build and deploy scripts for the Algo risk application.

Environment: Java (core, multithreading, sockets), J2EE, XML, Oracle 9i, Ant build scripts, UNIX


Senior Software Developer

The eGovDirect.com portal application enables listed companies to meet NYSE governance and compliance requirements efficiently and economically. Confidential provides NYSE-listed companies with an electronic filing platform to submit various corporate and financial reports such as annual and interim affirmations, annual CEO certifications, dividends, and shareholder meeting dates. Performed analysis, design, prototype creation for wireframes/pages, and development (Java core and multithreading; server-side and front-end development using Struts, JSP, HTML and EJB) for the Board Maintenance module of this application.

  • Executed in milestones to deliver the identified requirements as functionality.
  • Performed unit testing for each milestone and provided QA support, plus support for the quality auditing team and production implementation.
  • Involved in server hardware procurement and WebSphere Portal Server procurement from IBM.
  • Configured hardware servers, from AIX operating system installation through WebSphere Portal Server and web server installation, on each test server.
  • Configured the application in all test environments, with portal server support for the development team.

Environment: Java, XML, DB2 UDB, Sybase, WebSphere Studio Application Developer, Enterprise Java Beans (EJB), JSP, Java Servlets, JavaScript, Struts, WebSphere, DHTML, HTML, ERWin, Eclipse, Rational Rose, IBM AIX, Linux, Windows 2000



  • The main objective of the UAN module is to manage incoming and outgoing calls and to maintain enterprise-level status. UAN is part of an existing enterprise application called UNICA.
  • Involved in the design, development and testing of the UAN module.

Environment: Java, socket programming, Oracle 8i, JBuilder, Enterprise Java Beans (EJB), JUnit, MS Visual SourceSafe, Rational Rose, UML, Windows, ESN and Dopra Application Server.
