- Overall 8+ years of experience in the IT industry, including around 3 years of Big Data experience implementing complete Hadoop solutions.
- Working experience with Apache Hadoop ecosystem components such as MapReduce, HDFS, Hive, Sqoop, Pig, Oozie, Flume, HBase, and ZooKeeper.
- Strong experience in data analytics using Hive and Pig, including writing custom UDFs.
- Imported and exported data into HDFS and Hive using Sqoop.
- Knowledge of job workflow scheduling and monitoring with Oozie and of cluster coordination with ZooKeeper.
- Knowledge of creating MapReduce programs in Java per business requirements.
- Extensive knowledge in using SQL Queries for backend database analysis.
- Expertise in Core Java and Product Lifecycle Management tools.
- Experience in developing multi-tier Java-based web applications.
- Good experience in developing applications using Java EE technologies, including Servlets, Struts, JSP, and JDBC.
- Well-versed in Agile and other SDLC methodologies; able to coordinate with owners and SMEs.
- Experienced in creating and analyzing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD).
- Strong knowledge of Software Development Life Cycle (SDLC).
- Experienced in preparing and executing Unit Test Plan and Unit Test Cases after software development.
- Worked extensively on Health and Automotive Insurance domains.
- Experienced in working in multi-cultural environments, both as part of a team and individually, as per project requirements.
Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop, Zookeeper, Oozie, and Flume
Programming Languages: Core Java, JSP, JDBC
Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server, MongoDB
Tools: Eclipse, JDeveloper, JUnit, Ant, MS Visual Studio
Platforms: Windows 95/98/2000/XP/Vista/7, LINUX
Application Servers: Apache Tomcat, Jetty
Testing Tools: NetBeans, Eclipse
Methodologies: Agile, Scrum and Waterfall
Confidential, Bloomington, IL
Roles and Responsibilities:
- Understood business needs, analyzed functional specifications, and mapped those to the design and development of MapReduce programs and algorithms.
- Executed Hadoop ecosystem jobs and applications through Apache Hue.
- Optimized Hadoop MapReduce code and Hive/Pig scripts for better scalability, reliability, and performance.
- Developed Oozie workflows for application execution.
- Feasibility analysis (for the deliverables): evaluated the feasibility of requirements against complexity and timelines.
- Performed data migration from legacy RDBMS databases to HDFS using Sqoop.
- Wrote Pig scripts for data processing.
- Implemented Hive tables and HQL queries for reports; wrote and used complex data types in Hive; stored and retrieved data using HQL; developed Hive queries to analyze reducer output data.
- Highly involved in designing the next generation data architecture for the unstructured data.
- Managed a 4-node Hadoop cluster for a client conducting a Hadoop proof of concept. The cluster had 12 cores and 3 TB of installed storage.
- Developed Pig Latin scripts to extract data from source systems.
- Involved in extracting data from Hive and loading it into an RDBMS using Sqoop.
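The Oozie workflows noted above typically chain steps such as a Sqoop import followed by downstream processing. As an illustrative sketch only (the workflow name, action name, and parameters are hypothetical, not taken from the project), a minimal workflow definition looks like:

```xml
<workflow-app name="daily-etl" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-import"/>
    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect ${jdbcUrl} --table ORDERS --target-dir ${targetDir}</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Sqoop import failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

The `ok`/`error` transitions are what make Oozie useful for scheduled migrations: a failed Sqoop import routes to the `kill` node instead of silently continuing.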
Environment: CDH4, HDFS, MapReduce, Hive, Oozie, Java, Pig, Shell Scripting, Linux, Hue, Sqoop, Flume, DB2, and Oracle 11g
Confidential, Louisville, KY
Roles and Responsibilities:
- Explored and used Hadoop ecosystem features and architectures.
- Worked closely with business team to gather their requirements and new support features.
- Developed MapReduce jobs for log analysis and analytics.
- Wrote a MapReduce job to generate reports for the Analytics module, such as the number of activities created on a particular day or during a given time interval.
- The MR job read data from HDFS, where data from multiple sources had been dumped, and wrote its output back to HDFS.
- Configured Sqoop and developed scripts to extract data from MySQL into HDFS.
- Used Hive for analysis of web site traffic.
- Wrote programs using scripting languages like Pig to manipulate data.
- Implemented the workflows using the Apache Oozie framework to automate tasks.
- Wrote Hadoop Job Client utilities and integrated them into monitoring system.
- Reviewed the HDFS usage and system design for future scalability and fault-tolerance.
- Prepared extensive shell scripts to extract the required information from logs.
- Performed white-box testing and monitored all logs in the Dev and Prod environments.
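The analytics jobs above grouped activity records by day. The Hadoop MapReduce API itself is not shown here; this is a standalone plain-Java sketch of just the per-record counting logic (the class name and the "date token first" log-line format are assumptions for illustration):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Standalone sketch of the map/reduce counting logic: each input line is
// assumed to start with an ISO date ("2013-05-01 activityCreated ...").
// The "map" step extracts the date as the key; the "reduce" step sums counts.
public class ActivityCounter {

    public static Map<String, Integer> countByDay(List<String> logLines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : logLines) {
            if (line == null || line.isBlank()) {
                continue; // skip empty records, as a real mapper would
            }
            String day = line.trim().split("\\s+")[0]; // date token is the key
            counts.merge(day, 1, Integer::sum);        // per-key aggregation
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> sample = new ArrayList<>();
        sample.add("2013-05-01 activityCreated user=42");
        sample.add("2013-05-01 activityCreated user=7");
        sample.add("2013-05-02 activityCreated user=42");
        System.out.println(countByDay(sample)); // {2013-05-01=2, 2013-05-02=1}
    }
}
```

In the real job the same two steps run distributed: the mapper emits `(day, 1)` pairs and the reducer sums them per key.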
Environment: HDFS, MapReduce, Java, Sqoop, Pig, Hive, Oozie, Flume, Core Java, Nexus, Apache Derby, MySQL, and Linux.
Confidential, Bloomfield, CT
Roles and Responsibilities:
- Responsible for understanding the scope of the project and requirement gathering.
- Reviewed and analyzed the design and implementation of software components/applications and outlined development process strategies.
- Coordinated with project managers and the development and QA teams during the course of the project.
- Used Spring JDBC to write some DAO classes to interact with the database to access account information.
- Used Tomcat web server for development purpose.
- Involved in creation of Test Cases for JUnit Testing.
- Used Oracle as the database and Toad for query execution; wrote SQL scripts and PL/SQL code for procedures and functions.
- Used CVS and Perforce as configuration management tools for code versioning and releases.
- Developed the application using Eclipse and used Maven as the build and deployment tool.
- Used Log4j to print logging, debugging, warning, and info messages on the server console.
- Extensively used Core Java, Servlets, JSP, and XML.
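The Spring JDBC DAO classes mentioned above follow the standard DAO pattern. Spring itself is omitted so the sketch stays self-contained; the interface and in-memory implementation below are hypothetical stand-ins for the project's account-information DAOs (in the real code, a `JdbcTemplate` backed by Oracle sat behind the same interface):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// DAO-pattern sketch: the interface isolates persistence from business code.
// An in-memory map stands in for the database so the example runs anywhere.
interface AccountDao {
    Optional<Double> findBalance(String accountId);
    void saveBalance(String accountId, double balance);
}

public class InMemoryAccountDao implements AccountDao {
    private final Map<String, Double> table = new HashMap<>();

    @Override
    public Optional<Double> findBalance(String accountId) {
        return Optional.ofNullable(table.get(accountId)); // SELECT-equivalent
    }

    @Override
    public void saveBalance(String accountId, double balance) {
        table.put(accountId, balance); // a JDBC impl would INSERT/UPDATE here
    }

    public static void main(String[] args) {
        AccountDao dao = new InMemoryAccountDao();
        dao.saveBalance("ACC-100", 250.0);
        System.out.println(dao.findBalance("ACC-100").orElse(0.0)); // 250.0
    }
}
```

Keeping callers coded against `AccountDao` rather than a concrete class is what lets the JDBC implementation be swapped for a test double in JUnit tests.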
Environment: Java 1.5, J2EE, XML, Spring 3.0, Design Patterns, Log4j, CVS, Maven, Eclipse, Apache Tomcat 6, and Oracle 11g.
- Interacting with the client on a regular basis to gather requirements.
- Understanding the business, technical, and functional requirements.
- Checking for timely delivery of various milestones.
- Developed web services using the Spring Framework and Axis, including design of the XML request/response structures.
- Implemented the Hibernate/Spring framework for the database and business layers.
- Configured Oracle with Hibernate; wrote Hibernate mapping and configuration files for database operations (create, update, select).
- Involved in creating Oracle stored procedures for data/business logic.
- Created PL/SQL stored procedures for Contract generation module.
- Involved in configuring and deploying code to different environments: Integration, QA, and UAT.
- Preparing and designing system/acceptance test cases and executing them.
- Created Ant build scripts to build artifacts.
- Worked on fine-tuning the response time of Web Service components.
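The Hibernate mapping files mentioned above bind a Java entity to its database table. As an illustrative fragment only (the `Contract` class, table, and column names are hypothetical, loosely echoing the contract-generation module), an `hbm.xml` mapping looks like:

```xml
<!-- Illustrative Hibernate mapping; class, table, and sequence names are assumed -->
<hibernate-mapping>
    <class name="com.example.model.Contract" table="CONTRACT">
        <id name="id" column="CONTRACT_ID">
            <generator class="sequence">
                <param name="sequence">CONTRACT_SEQ</param>
            </generator>
        </id>
        <property name="customerName" column="CUSTOMER_NAME"/>
        <property name="startDate" column="START_DATE" type="date"/>
    </class>
</hibernate-mapping>
```

With the mapping in place, Hibernate generates the create/update/select SQL against Oracle, so the stored procedures only need to cover the data/business logic that doesn't fit the ORM model.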
Environment: Java, JSP, EJB, Servlets, Struts, Tomcat, WebLogic, Oracle 10g
- Designed User Interface using Java Server Pages (JSP) and XML.
- Developed Enterprise JavaBeans to handle transactions such as online funds transfers and bill payments to service providers.
- Implemented Service-Oriented Architecture (SOA) using JMS in MDBs for sending and receiving messages while creating web services.
- Worked on Web Services for data transfer from client to server and vice versa using SOAP, WSDL, and UDDI.
- Involved in testing the web services using SOAP UI
- Extensively worked on JMS using the point-to-point and publisher/subscriber messaging domains to exchange information through messages.
- Performed object-oriented analysis and design using UML, including development of class, sequence, and state diagrams, implemented in Microsoft Visio.
- Developed action classes and form beans and configured the struts-config.xml.
- Involved in writing client-side validations using JavaScript.
- Involved in the design of the Referential Data Service module to interface with various databases using JDBC.
- Prepared deployment plans for production deployments.
- Prepared documentation and participated in preparing user’s manual for the application.
- Attended SFD, project kick-off, and ST meetings.
- Contributed to the design of test plans and test cases.
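The `struts-config.xml` wiring mentioned above connects action classes to their form beans and JSP views. An illustrative fragment (paths, class names, and forwards are hypothetical, modeled on the funds-transfer flow) might look like:

```xml
<struts-config>
    <form-beans>
        <form-bean name="transferForm" type="com.example.web.TransferForm"/>
    </form-beans>
    <action-mappings>
        <action path="/transfer"
                type="com.example.web.TransferAction"
                name="transferForm"
                scope="request"
                validate="true"
                input="/jsp/transfer.jsp">
            <forward name="success" path="/jsp/confirmation.jsp"/>
            <forward name="failure" path="/jsp/transfer.jsp"/>
        </action>
    </action-mappings>
</struts-config>
```

The `validate="true"`/`input` pair is what routes a failed form-bean validation back to the originating JSP, complementing the client-side JavaScript checks.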