
Sr. Big Data Admin Resume



  • Over 20 years of experience in all aspects of software development methodology, including requirement analysis, design specification, implementation, Big Data (Hadoop, NoSQL technologies), re-engineering of client-server applications, and testing and deployment of enterprise, web-based, SOA, and client/server applications using Java (J2SE)/J2EE on UNIX, OS/390, and Windows platforms with Oracle, DB2, SQL Server, and Informix databases.
  • Worked in a 250-plus-node Hadoop cluster environment and a 20-node Cassandra environment spanning two data centers.
  • Experience with Hadoop, MapReduce, Cassandra, the Hector API, and the DataStax API.
  • Good knowledge of and experience with Isilon 7.x/8.x, including installing Hadoop clusters backed by Isilon storage.
  • Very good experience with open-source Cassandra and DataStax Enterprise 4.x/5.x.
  • Work with teams, mentor Hadoop admins and database support teams, and report status to upper management.
  • Work with product vendors (Hortonworks, MongoDB, Oracle, Isilon/EMC) on issue resolution.
  • Good knowledge of Pivotal Hadoop and Hortonworks cluster installation and operations.
  • Experience using Java and J2EE design patterns.
  • Good knowledge of Spark, Scala, and Chef scripting.
  • Good knowledge of and experience with Kerberos, Centrify, Sentry, Ranger, and Knox.
  • Very good experience with Hadoop, MapReduce, Hive, HiveQL, and Cassandra.
  • Good knowledge of and experience with HBase.
  • Good experience setting up Hadoop clusters (Hortonworks and Cloudera).
  • Good knowledge of and experience with MongoDB and Greenplum.
  • Good experience with ESXi 5.x/6.x and vSphere.
  • Good knowledge of and experience with Isilon (HDFS) storage.
  • Solid background in Object-Oriented Analysis and Design (OOAD), Object-Oriented Programming (OOP), Agile methodologies, and Unified Modeling Language (UML).
  • More than 13 years of hands-on experience with J2EE technologies: Java, JDBC, JSP, Servlets, EJB, XML, JMS, the JavaMail API, and J2EE frameworks.
  • Very good working experience with application servers: WebSphere 3.x-6.x, WebLogic 7.x-9.x, IIS, JBoss, and JRun.
  • Good knowledge of iPhone/iPad application development using Objective-C.
  • Good knowledge of and experience with the SOLR/Lucene and Juru/Trevi (Confidential) search engines.
  • In-depth knowledge of architecting and creating Cassandra database systems.
  • Good knowledge of HQL, Hibernate 3.x, AJAX, JMS, MQ, Ant, JUnit, and Openmake build management.
  • Good knowledge of installing and configuring WebSphere and Apache/Tomcat on UNIX and Red Hat Linux.
  • Very good experience with JVM prepared-statement cache analysis and performance tuning.
  • Very good domain experience in manufacturing, telecommunications, financial services, and retail.
  • Experience working with Service-Oriented Architecture (SOA).
  • Working experience with OS/390 and IMS.
  • Good experience using VisualAge for Java, WSAD 4.x/5.x, and Rational Application Developer.
  • Experienced with Oracle 6.x/7.x/8.0/8i/9.x/10g, SQL*Loader, Export, Import, and SQL*Plus.
  • Experience with CGI programming in Perl.
  • Strong background in C and object-oriented C++ programming.
  • Able to work without supervision; experienced in mentoring and leading teams.
  • Experience developing SOAP and RESTful web services.
  • Good knowledge of Quartz scheduler and Oozie.
  • Good knowledge of Business Objects, Cognos, Crystal Reports, and Informatica.
  • Experience working with MS SQL Server, MySQL, DB2 (UDB, z/OS), and IMS.
  • Extensive experience with Sun Solaris, SCO UNIX, HP-UX, AIX 4.x/5.x, and Linux (Red Hat, CentOS 5.x/6.x, Ubuntu).
  • Excellent analytical, problem-solving, and decision-making skills.


Hardware: Sun Sparc (Enterprise 450, Ultra Enterprise 6500), HP (9000 series)

Operating Systems: Windows NT/XP/2000, UNIX, Sun Solaris 2.x, Linux

Languages: C, C++, HTML, DHTML, XML, UML, SQL, PL/SQL, Java, JDBC, CORBA, Perl, Tcl, FORTRAN, and Pascal. Scripting: JavaScript, VBScript, Awk, UNIX shell scripting

Server side: Servlets, JSP, JDBC, JMS, JavaMail

RDBMS: Oracle 7.x & 8.x, SQL Server 6.5/7.0/2000, DB2

Front Ends: Developer 2000, Visual Basic 5.0, WSAD 5.0, VisualAge for Java

Middleware: RMI, EJB, CORBA.

Frameworks: Hadoop, Cassandra, Hector API.

Internet Tools: HTML 3.2/4.0, DHTML (CSS and JavaScript), Rational Rose

HTML Editors: FrontPage Express, Dreamweaver, Visual InterDev, and Pro97

Testing Tools: WinRunner.


Confidential, GA

Sr. Big data Admin


  • Created local repositories for HDP (2.3.6 and 2.6.2) and HDF clusters.
  • Installed and configured MySQL 5.1.x for Hive and Oozie metadata.
  • Installed and configured Hortonworks clusters: DEV (25 nodes), UAT (40 nodes), and PROD (92 nodes), active and passive.
  • Configured Ambari with LDAP groups.
  • Performance tuning of Hive, Tez, and MR jobs.
  • Installed and configured 3 edge nodes in every environment.
  • Installed and configured Hue on dedicated nodes.
  • Working with the MongoDB team to map HDFS data to MongoDB JSON documents.
  • Installed certificates on Hadoop data nodes for data transfer from Hadoop to MongoDB.
  • POC on the Ambari upgrade to Ambari 2.5.2.
  • POC on the Hadoop upgrade from HDP 2.3.6 to HDP 2.6.2.
  • Upgraded Ambari from 2.2.1 to 2.5.2 in the DEV and UAT environments.
  • Upgraded HDP from 2.3.6 to 2.6.2 in the DEV and UAT environments.
  • Upgraded the JDK from 1.7.x to 1.8.x.
  • Working with application development teams on performance tuning of Hive applications.
  • Worked with application teams to analyze job failure logs.
  • Helping teams migrate from MR jobs to Tez jobs.
  • Installed and configured HDF clusters (Kafka, NiFi, etc.) in the QA, UAT, and PROD active and passive environments.
  • Helping NiFi developers separate NiFi repositories onto multiple disks.
  • Monitoring Hadoop clusters to ensure high availability.
  • Tuning memory parameters for Hadoop containers.
  • Working with the Hadoop vendor (Hortonworks): creating tickets and working with Hortonworks engineers to resolve issues.
  • Migrated one old Pivotal (old HDP) cluster to HDP 2.3.6.
  • Creating user space and allocating quotas for users.
  • Helping development teams convert Bedrock (third-party tool) workflows to Oozie workflows.
  • Currently working on an encryption POC with third-party software.
  • On-call support.
  • Built and configured virtual machines and OS images for Kafka and NiFi using ESXi and vCenter.
  • Working with the team on PowerShell scripts to automate building VMs.
  • Upgraded the data node OS from RHEL 6.8 to 6.9.
  • Configuring servers in DNS.
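Container memory tuning on clusters like these usually comes down to simple YARN sizing arithmetic. The sketch below illustrates the kind of calculation involved; the node RAM, reserved headroom, and core count are assumed example values, not figures from this resume.

```shell
#!/usr/bin/env bash
# Hypothetical YARN container sizing sketch -- all inputs are assumed.
node_mem_mb=131072      # 128 GB data node (assumed)
reserved_mb=16384       # headroom for OS, DataNode, NodeManager (assumed)
vcores=32               # assumed physical core count

# Memory YARN may hand out on this node
yarn_mem_mb=$(( node_mem_mb - reserved_mb ))

# Common rule of thumb: roughly two containers per core
containers=$(( vcores * 2 ))
container_mb=$(( yarn_mem_mb / containers ))

echo "yarn.nodemanager.resource.memory-mb=${yarn_mem_mb}"
echo "per-container allocation (mb)=${container_mb}"
```

The derived values would then be set in Ambari (yarn-site and the scheduler's min/max allocation) rather than edited by hand on each node.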

Environment: Hadoop (HDP 2.3.6 & 2.6.2), Java 1.7.x & 1.8.x, Bedrock, MongoDB, Linux (Red Hat 6.8 & 6.9), Hive, Greenplum, Pig, Spark, ESXi, DNS, Active Directory (AD), Isilon (OneFS) 7.x.

Confidential, Chicago, IL

Hadoop Admin


  • Installed and configured DSE across multiple data centers.
  • Designed Cassandra column families per service requirements.
  • Working with development teams on design and solutions.
  • Working with third-party product vendors (Confidential and others) to integrate their products into the application.
  • Worked with the DataStax support team on production issues.
  • Tuning REST services.
  • Working with the performance testing team to benchmark services.
  • Researching new Big Data products and making recommendations to the team.
  • Helping the development team with Java APIs and coding standards.
  • Working with onsite and offshore managers on designing tables specific to analytics.
  • Designed the Solr schema and created the corresponding tables in Cassandra.
  • Cluster monitoring using OpsCenter and JVM monitoring tools.
  • Technology recommendations for future support.
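Column family design for a service typically reduces to CQL DDL of the following shape; on a live cluster the emitted statements would be piped into cqlsh. The keyspace, table, column names, and replication settings here are hypothetical, chosen only to illustrate the pattern.

```shell
# Emit hypothetical CQL DDL for a per-service table. All names are made up;
# in practice: bash create_schema.sh | cqlsh "$CASSANDRA_HOST"
ddl=$(cat <<'CQL'
CREATE KEYSPACE IF NOT EXISTS svc_catalog
  WITH replication = {'class': 'NetworkTopologyStrategy', 'DC1': 3, 'DC2': 3};

-- Partition on the item key; cluster by event time so recent rows read first.
CREATE TABLE IF NOT EXISTS svc_catalog.item_events (
  item_id  text,
  event_ts timestamp,
  payload  text,
  PRIMARY KEY ((item_id), event_ts)
) WITH CLUSTERING ORDER BY (event_ts DESC);
CQL
)
echo "$ddl"
```

Keeping the DDL in version-controlled scripts like this is what makes per-environment schema review and approval tractable.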

Environment: Hadoop (HDP), DataStax Cassandra 5.x, Solr, Java, Linux (Red Hat), CQL, Eclipse, OpsCenter, Tomcat, Apache Kafka.

Confidential, Bentonville, AR /Atlanta, GA

BigData Admin & Architect (Cassandra)


  • Involved in the detailed design of the application.
  • Detailed design for REST services and dynamic REST services.
  • Integration with Apache Camel.
  • Load processor performance.
  • Cassandra data export via REST services for external clients.
  • Involved in the design of a legacy client for Cassandra data access (Perl, shell).
  • Set up and upgraded Cassandra from 4.0.3 to 4.6.0, along with Hadoop and Solr.
  • Set up authentication and authorization in the lower and production environments.
  • Created users and roles for all keyspaces.
  • Involved in discussions on data validation and in identifying the attributes to be migrated from HOBUS.
  • Designed and created column families in the lower environments.
  • Performance tuning at the application (services) level and in Cassandra.
  • JVM tuning.
  • Set up alerts and email notifications using OpsCenter.
  • Created Solr cores for all column families/keyspaces.
  • Solr schema design.
  • POC on dynamic services for faster retrieval.
  • Developed scripts for backup and restore.
  • Cluster monitoring using OpsCenter and cluster health reports.
  • Developed scripts to export data using Hive.
  • Developed scripts for loading data using Pig.
  • Data export using Hive for data quality and governance.
  • POC on Solr scalability.
  • Designed and developed scripts to unload data from production and load it into the lower environments.
  • Set up a new data center with Hadoop nodes.
  • Designed history column families and developed the related scripts.
  • Set up an OS page cache POC in the lower and production environments for performance.
  • Set up OpsCenter on new cloud servers.
  • Solr POC for complex data types.
  • Participated in Scrum meetings, interacting closely with other team members.
  • Coordinated the development effort with the Confidential offshore team and managed project delivery through periodic code reviews.
  • DAO POCs to avoid tombstones (Cassandra).
  • Production deployment.
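Backup-and-restore scripting for Cassandra is typically a wrapper around `nodetool snapshot`. The sketch below runs in dry-run mode and only records the commands it would issue; the keyspace name, tag, paths, and the DRY_RUN guard are assumptions for illustration, not the actual production script.

```shell
# Dry-run sketch of a snapshot-based backup wrapper. Keyspace, tag, and
# paths are hypothetical; with DRY_RUN=0 on a live node the commands
# would actually execute.
DRY_RUN=1
keyspace="orders"
tag="bkp_20240101"   # in practice something like: bkp_$(date -u +%Y%m%d)

planned=""
run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    # Record the command instead of executing it
    planned="${planned}+ $*
"
  else
    "$@"
  fi
}

run nodetool snapshot -t "$tag" "$keyspace"
run tar -czf "/backups/${keyspace}_${tag}.tar.gz" "/var/lib/cassandra/data/${keyspace}"

printf '%s' "$planned"
```

A dry-run switch like this lets the same script be reviewed and tested in lower environments before it touches production data directories.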

Environment: DataStax Cassandra 4.0.3/4.6.0, Hive, Pig, Solr, Java 1.7.x/1.8.x, Camel, Linux (Red Hat), Mainframe (DB2), Informix, CQL, Eclipse, OpsCenter 5.x, Tomcat 7.x.

Confidential, Atlanta, GA

BigData Consultant Admin


  • Designed and developed NoSQL solutions using Cassandra.
  • Developed DAOs using the Java driver for application benchmarking.
  • Column family design and development.
  • Developed UNIX scripts for maintenance activities.
  • Set up 4 performance clusters, 2 UAT clusters, 10 QA clusters, and 2 production clusters (RTP/Confidential) for product catalog and inventory services.
  • Set up Cassandra authentication and authorization for the clusters.
  • Set up DataStax/Cassandra auditing for the audit trail and Cassandra connection usage.
  • Working with the DataStax support team on service tickets.
  • Mentored colleagues and the development team.
  • Verified and approved keyspaces and column families.
  • Installed OpsCenter and set up the monitoring group.
  • Set up OpsCenter alerts and cluster monitoring.
  • Backup and recovery using both the OpsCenter UI and OpsCenter HTTP calls (REST API).
  • Created column families and set permissions and security for each release.
  • Cassandra performance tuning (JVM and OS level).
  • 24/7 production support.
  • DataStax Enterprise Java driver verification and recommendations.
  • Helped the Chef scripting team with Cassandra Chef scripts.

Environment: Java 1.6 & 1.7.x, Web Services, Cassandra 1.2.10, DataStax 3.1.4 & 4.0.3, OpsCenter 3.2.2/4.1, Hector, DataStax Java API, Linux 5.x & 6.x, Chef Solo, REST API, CQL.

Confidential, Atlanta,GA

Cassandra & Hadoop Consultant Architect


  • Data extraction from Teradata using Sqoop.
  • Developed UNIX scripts using the Confidential internal UNIX framework.
  • Hive schema design, and extraction and loading from landing to stage.
  • Designed and developed NoSQL solutions using Cassandra.
  • Designed column families for Grid Cache (Enabler and MyATT/Uverse) feeds.
  • Mapped data feeds to Cassandra column families.
  • Performance tuning and analysis.
  • POC with internal frameworks (GRID framework and GDDN).
  • Developed DAOs using the Kundera API (GDDN and GRID API) for Enabler data.
  • Developed DAOs using the Kundera API (GDDN and GRID API) for MyATT/Uverse data.
  • Unit testing with the GRID API.
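Teradata-to-HDFS pulls like the one above are Sqoop-driven; the snippet below just prints the typical shape of such an import command. The JDBC URL, database, table, target directory, and mapper count are all hypothetical examples, not values taken from this engagement.

```shell
# Build the shape of a Teradata-to-HDFS Sqoop import. Every value here
# (host, database, table, target dir, mapper count) is an assumed example.
sqoop_cmd="sqoop import \
  --connect jdbc:teradata://td-host/DATABASE=stage \
  --table ENABLER_FEED \
  --target-dir /data/landing/enabler_feed \
  --num-mappers 8"
echo "$sqoop_cmd"
```

The mapper count is the main tuning knob: it controls how many parallel JDBC sessions hit the source database, so it is balanced against Teradata's concurrency limits.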

Environment: Java, Cassandra 1.2.4, Hector API, Hadoop, Linux.

Confidential Atlanta, GA

Technical Lead/ Hadoop Admin


  • Designed the ESVS Hadoop components of the project.
  • Drove the implementation of solutions by leading and mentoring peers.
  • Evaluated third-party products and open-source APIs for inclusion.
  • Developed a POC MapReduce job for bulk-loading data from Hadoop into Cassandra.
  • Helped the infrastructure team with a new OS version rollout on the data nodes.
  • Designed Hive tables and tuned HiveQL.
  • Designed RESTful web services to interact with Cassandra.
  • Designed and developed a DAO POC using the Hector API.
  • Installed and set up Cassandra and configured reads/writes in the QA, QP, and production environments.
  • Set up and executed disaster recovery across multiple data centers.
  • Built a POC for customer data retrieval from random- and byte-ordered-partitioner clusters to replace the Siebel model.
  • Complete design and development of the Cassandra data model.
  • POCs for different components, including composite columns.
  • Set up the QA environment.
  • Tuned Cassandra performance for Hector API reads and writes per SLA.
  • Tuned Cassandra performance for Hadoop MapReduce jobs.
  • Designed the Content Grid Hadoop components of the project.
  • Developed a POC to move .CSV files to the Hadoop gateway server using shell scripts and the Hadoop client.
  • Designed and developed a Hive job to merge incremental files.
  • Involved in the design for moving files from Hadoop to the Tableau server for a dashboard application.
  • Initial setup to receive data from an external vendor.
  • Configured the Hadoop client on an external gateway machine.
  • Analysis and design of production views.
  • Stage table design.
  • MR job POC using Java and Hive.
  • Cassandra performance tuning.
  • Involved in the design and development of backend reusable libraries (DAOs) for RESTful and enterprise services.
  • Installed and configured the SOLR/Lucene search engine in AD and QA stores.
  • Designed the SOLR schema per the product attributes.
  • Designed and implemented the data structures to map the SOLR API to the DAO framework.
  • Reviewed code developed by team members and the offshore team.
  • Developed DAOs using the Home Depot custom DAO framework and JDBC.
  • Helped offshore team members develop high-performance SQL for database searches for the latest product price.
  • Worked with enterprise architects to design and implement requirements.
  • Developed DAO methods to support both DB2/z/OS (host) and Informix (all stores).
  • Developed RESTful services to access the DAO libraries.
  • Implemented a security mechanism using LDAP groups.
  • Developed scripts to access the RESTful services from shell scripts using wget in a secured way.
  • Created RPMs using the HP Opsware package for production deployment.
  • Developed shell scripts for RPM packaging and deployments.
  • Led a team of 7 offshore resources.
  • Participated in requirements gathering; involved in database design and data modeling.
  • Participated in the architectural design and performance tuning.
  • Involved in performance tuning using JVM parameters and prepared-statement cache analysis.
  • Created the plan document for production deployment.
  • The On-Hand Adjustment Common Service is responsible for changing the on-hands for SKUs and handling on-hand update approvals and rejections. The service is designed to replace the existing imp306 Informix program in favor of a highly accessible common service in the store.
  • The Common On-Hands Adjustment Service also writes to the Inventory Movements Informix on-hands delta tracking table. A batch process (see the Inventory Movements Interface Design Specifications) polls the delta table and relays the offsets to the Sterling inventory system.
  • Helped store associates close sales by providing quick access to pricing and availability of YOW products.
  • Helped stores provide better customer service with quicker access to confirmations and tracking information.
  • Increased overall process efficiency for the nearly one million special orders placed annually by The Home Depot through YOW.
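An incremental-file merge in Hive is commonly written as a UNION ALL of the base and incremental tables deduplicated with a window function. The snippet below prints that pattern; table and column names are invented for illustration and are not from the actual project.

```shell
# Print the shape of a hypothetical incremental-merge HiveQL statement:
# newest row per key wins across base + incremental. All names invented.
hql=$(cat <<'HQL'
INSERT OVERWRITE TABLE base_merged
SELECT id, col1, col2 FROM (
  SELECT *, ROW_NUMBER() OVER (PARTITION BY id ORDER BY load_ts DESC) rn
  FROM (
    SELECT id, col1, col2, load_ts FROM base
    UNION ALL
    SELECT id, col1, col2, load_ts FROM incremental
  ) u
) ranked
WHERE rn = 1;
HQL
)
echo "$hql"
```

The window-function form avoids a full outer join and scales with a single shuffle on the merge key.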

Environment: J2EE, JSP, JDBC, DAO Framework, Adobe Flex 4.x, Java Beans, Servlets, JAXB & XStream, XML, Eclipse 3.x, Oracle 10g, RESTful Web Services, JavaMail, ClearCase, Linux 6.x, HP Opsware, Tomcat 6.x, JRun.

Confidential, Atlanta, GA

Technical Lead and Sr. Developer


  • Created a new conceptual model for the new release and was involved in database design.
  • Involved in analysis, design, and backend development for the technician process (batch process).
  • Worked on performance improvement of existing software.
  • Part of the group that designed business logic modules using J2EE patterns.
  • Developed DAOs to support RAC.
  • Rewrote existing DAOs to support batch updates and inserts.
  • Production support for existing applications.

Environment: J2EE, JSP, Java Beans, Servlets, XML, Rational Application Developer/Eclipse, Oracle 9i/10g, stored procedures, Web Services, JavaMail, Windows 2000, Solaris 9.x/10.x, shell scripts, WebLogic 9.x, Apache 2.2.8, and Tomcat 5.x.

Confidential, Birmingham, AL

Sr J2ee Developer /Team Lead


  • Involved in analysis, design, and development of use cases, sequence diagrams, class diagrams, and flow charts.
  • Responsible for supporting the product on multiple application servers (WebSphere, WebLogic, JBoss) and for interoperability across platforms.
  • Part of the group that designed business logic modules using J2EE patterns.
  • Developed stateful and stateless session beans for business logic.
  • Migration from Oracle 9i/10g to SQL Server 2000/2005.
  • Active participation in performance improvements and security reviews for the product.
  • Application server administration on WebLogic, WebSphere, and JBoss.
  • Defined queues (local and remote), connection factories, and channels on WebSphere MQ and JMS (for testing) and integrated them with WSAD.
  • Maintained code repositories (CVS and Subversion).
  • Parsed, validated, and built XML, and built value objects from XML using SAX and DOM parsers.
  • Production support for existing applications.

Environment: J2EE, EJB, JSP, Java Beans, Servlets, AJAX, JMS, XML, WebSphere MQ 5.0, MyEclipse, WebSphere Studio Application Developer/Rational Application Developer, Oracle 9i/SQL Server 2000/2005, stored procedures, Web Services, JavaMail, Subversion, Windows 2000, Solaris, WebLogic, WebSphere 5.x, JBoss.

Senior Developer



  • Interpreting requirements and implementing them in a user-friendly application that meets the needs of the users.
  • Developed POJOs and DAOs through reverse engineering of the newly created tables using MyEclipse.
  • Developed middle-tier data access components using the Spring and Hibernate frameworks.
  • Implemented the persistence layer of the project using Hibernate.
  • Implemented lazy loading and transaction rollback by configuring Hibernate.
  • Participated in development of business logic using Spring.
  • Integrated LDAP authentication for login.

Environment: Java 1.4, J2EE, Struts, JSP, Servlets, JDBC, Hibernate 3.0, Spring, Tomcat 5.0 (test environment), WebLogic 8.x, WebSphere 6.x, XML, Subversion, Oracle 10g on Windows NT/UNIX, and MS SQL Server 2000/2005.

Confidential, Austin, TX

Senior Developer


  • Installed and set up WebSphere and Confidential HTTP Server for the test server in a Linux environment.
  • Worked on legacy interfacing (the entire backend processing pipeline: indexing, categorization).
  • Wrote a controller to invoke all manual processes for automation.
  • Developed helper classes and value objects (Java beans) that interact with business objects.
  • Wrote beans to parse and load the crawled documents (XML) into a DB2 database.
  • Wrote an automator to run the taxonomer, indexer, and categorizer using Java threads.
  • Testing, deployment, and documentation.
  • Prepared the user guide and installation document.
  • Assisted with and fixed problems in production.

Environment: WSAD 5.1, Java 1.3.1, SQL, DB2, JDBC, Servlets, JSP, HTML, JavaScript, style sheets, UNIX (AIX 4.x/5.x), Red Hat Linux 7.3, Log4J, Rational Application Developer, UML, Struts, XML.

Search Engines: JURU/TREVI, WebServices (Tomcat/Axis), CMVC/CVS.

Confidential, Peoria, Illinois

Senior Developer


  • Participated in design, development, and testing of TMI using Java, EJB, JSP, Servlets, XML, HTML, JavaScript, JDBC, the TUFF framework (MVC pattern), WSAD, and WebSphere.
  • Involved in design and development of technical specifications, application code, and programming specifications from functional specifications, class diagrams, and sequence diagrams.
  • Gathered requirements, analyzed use cases, and developed specifications.
  • Developed extensive server-side code in J2EE conforming to the TUFF framework and the Model-View-Controller (MVC) architecture.
  • Created XML files from flat files produced by JCL; transformed XML between different schemas using XSLT.
  • Implemented email in the application using JavaMail and other utilities.
  • Generated reports using Cascading Style Sheets.
  • Developed extensive client-side code in JSP and JavaScript.
  • Developed UNIX crontab shell scripts for uploading and deleting files from different databases.
  • The ICCP web-based system allows dealers to register projects and claim rewards online. Its biggest advantage, however, is the project coordination tool built within it: influencing dealers can register a project, enter project data and activity, and then select other dealers to view the project's details.
  • Gained exposure to OS/390, since the WebLogic environment was set up on that platform, and was heavily involved in deploying the application on OS/390.
  • Tracked employee ratings at Confidential so that upper management could decide how to compensate employees; also tracked employee movement between divisions and jobs, with a web interface including graphics.
  • Email notification to supervisors indicating when a review should take place. The application behaves differently by role, using LDAP: a supervisor can access all of their employees' information, while a divisional manager can access all division employees' information.

Environment: Java, JDBC, Servlets, JSP, VisualAge for Java 3.5/4.0, OS/390, DB2, Oracle 8.1, LDAP, HTML, JavaScript, SQL, style sheets, UNIX (Sun Solaris).

Web Server/App Server: iPlanet Web Server 5.2 (Netscape Enterprise Server), WebSphere 3.5


Java/Perl Developer


  • Text files are created each night in a batch process from CEOPS (Confidential Engine Order Process System) and transferred via FTP to the web server using JCL, where a UNIX shell script (cron job) runs to update the Oracle database. These text files update the Oracle tables every day.

Environment: Perl 5.2, CGI, UNIX (Sun Solaris 2.8), Oracle 8.1, LDAP, shell scripting, HTML, JavaScript, style sheets, iPlanet Web Server 5.2 (Netscape Enterprise Server).
