Hadoop, Java Developer Resume
NC
PROFESSIONAL SUMMARY:
- 8+ years of IT experience spanning the full SDLC, including requirement analysis, design, development, testing, change request management, and maintenance/enhancement support for clients.
- Strong experience with JavaScript and jQuery.
- Developed RESTful web services, writing both server-side and client-side code to produce and consume data.
- Experienced in Java multithreaded programming, with a deep understanding of concurrency, used to develop multithreaded modules and applications.
- Expertise in Telecommunications, Banking, Health Insurance, Automobile Domains.
- Expertise in back-end procedure development for RDBMS database applications using SQL and PL/SQL; hands-on experience writing queries, stored procedures, functions, and triggers.
- Good knowledge of Java technologies: Core Java, J2EE, Spring, Hibernate, Struts, and XML.
- Experience in performing Peer & Code reviews.
- Involved in unit testing.
- Participated in Project Requirement and planning meetings with the customers.
- Experienced in mentoring teams with functional knowledge and business processes.
- Experience in requirements collection and analysis; excellent troubleshooting and debugging skills.
- Managed onsite and offshore teams, assigning tasks to team members and monitoring their progress.
- Effectively involved in application migration related activities.
- Have experience in gathering and understanding the requirements from the client.
- Good debugging and diagnostic skills in batch processing.
- Exceptional ability to learn new technologies and deliver results under tight deadlines.
- A strong team player with good analytical skills, able to identify key issues, provide solutions, and produce technical specification documents on schedule.
- Research-oriented, motivated, proactive, self-starter with strong technical, analytical and interpersonal skills.
- Involved extensively in training and knowledge-transfer sessions for new employees and team members, covering the technical and functional requirements of the application.
- Hands-on experience with the Big Data Hadoop ecosystem: HDFS, MapReduce, Sqoop, Hive, and HBase.
- Hands-on experience writing Pig Latin scripts and Pig commands.
- Developed data pipelines using Sqoop to move structured and unstructured datasets in and out of the Hadoop ecosystem.
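The multithreaded module development mentioned above generally follows the ExecutorService/Callable pattern from java.util.concurrent; a minimal sketch (class and method names are illustrative, not from any specific project):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    // Splits the range 1..n into chunks and sums each chunk on its own thread.
    static long parallelSum(int n, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            int chunk = n / threads;
            List<Future<Long>> futures = new ArrayList<>();
            for (int t = 0; t < threads; t++) {
                final int lo = t * chunk + 1;
                final int hi = (t == threads - 1) ? n : (t + 1) * chunk;
                Callable<Long> task = () -> {
                    long s = 0;
                    for (int i = lo; i <= hi; i++) s += i;
                    return s;
                };
                futures.add(pool.submit(task));
            }
            long total = 0;
            for (Future<Long> f : futures) total += f.get(); // blocks until each task finishes
            return total;
        } finally {
            pool.shutdown();
        }
    }
}
```

Unlike raw Thread management, the pool bounds concurrency and Future.get() gives a clean way to collect per-task results or propagate exceptions.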
TECHNICAL SKILLS:
Relational Databases : MySQL, SQL, DB2, Oracle
Languages : Core Java, XML, HTML5
Frameworks : Spring, Hibernate, JSF
Operating Systems : Windows, Linux
Tools : Eclipse, NetBeans
Servers : Tomcat, JBoss
Repository Tools : SVN
Web Technologies : JSP, JavaScript, jQuery, CSS3, web services
FTP Tools : WinSCP, PuTTY
PROJECT PROFILE:
Client: Confidential, NC
Designation: Hadoop, Java Developer
Environment: Spring, JDBC, Hibernate, JSF.
DESCRIPTION:
ASAP is a web-based application that handles various kinds of accounts and payments made by customers to parking service providers, municipalities, and city transportation administrators. It provides features such as opening and maintaining accounts, initiating payments to the agency, generating payment reports, and payment settlement. The transportation and municipal parking agencies run around 160 legacy applications. The primary objective of this project was to understand how customers use their different channels, such as garage parking and off-street parking. With big data technology, the agency increasingly processes and analyzes location-based data from its full customer set. The large volume of log file data, stored in a proprietary mainframe VSAM format, contains metrics such as CPU usage, memory usage, and disk usage. We wrote a set of MapReduce jobs to parse the log files and convert the data to CSV format, loaded the data into HDFS and then into Hive tables, and wrote Hive join queries to fetch information from multiple tables for various analytics.
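The per-record parsing inside such a MapReduce job's mapper can be sketched in plain Java (the pipe-delimited field layout shown is hypothetical; the actual logs were in a proprietary VSAM format, and in production this logic would sit inside the mapper's map() method):

```java
public class LogToCsv {
    // Parses one log record of the (hypothetical) form
    //   "<timestamp>|cpu=<n>|mem=<n>|disk=<n>"
    // into a CSV row "<timestamp>,<cpu>,<mem>,<disk>".
    static String toCsv(String line) {
        String[] parts = line.split("\\|");
        StringBuilder row = new StringBuilder(parts[0]);
        for (int i = 1; i < parts.length; i++) {
            // Keep only the value after '=' in each "name=value" field.
            row.append(',').append(parts[i].substring(parts[i].indexOf('=') + 1));
        }
        return row.toString();
    }
}
```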
RESPONSIBILITIES:
- Designed the application in an MVC framework with JDBC abstraction and Hibernate for O/R integration.
- Developed various helper classes using Core Java multithreaded programming and the Collections framework.
- Used multithreading to improve overall performance.
- Handled the Java multithreading in the back-end component, where one thread runs per user to serve that user.
- Involved in developing complex SQL queries in MySQL and Oracle (PL/SQL).
- Optimized system performance by writing stored procedures.
- Designed and developed the front end with JSF, JSP, JavaScript, HTML, TagLibs, and CSS.
- Developed client-side validations using jQuery and JavaScript.
- Designed style sheets for the application interface using CSS3.
- Used AJAX for asynchronous data transfer to show/hide details in portfolio screens.
- Involved in production support for portfolio, trading and market data applications.
- Used WSAD for writing code for Java, Servlets, JSP and JavaScript.
- Used SVN as the version Control System.
- Involved in scoping the application requirement, creating design, functional specification and development for the new trading system (Equities, Options and Mutual Funds) using J2EE.
- Developed multiple data source based transaction processing logic.
- Involved in integrating Fixed income java application to get fixed income data from third party vendors
- Assisted new developers on development and architecture issues.
- Involved in conducting code reviews and design reviews for junior developers using Redmine.
- Designed the application in Spring and developed front-end pages with JSPs and JavaScript, using WSAD.
- Implemented parking server data interface logic using RESTful web services.
Environment: J2EE, JSF, XML, HTML, JavaScript, CSS3, Informix, Struts2, Hibernate3, JNDI, JDBC, JMS, Servlets, WebSphere 5.1, Oracle 9i, ANT, Log4j, SVN, PuTTY, WinSCP3.
- Installed and configured Hadoop on a cluster.
- Experienced in defining job flows using Oozie
- Experienced in managing and reviewing Hadoop log files
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Responsible for managing data coming from different sources and applications.
- Installed and configured Hive and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Involved in Unit level testing.
- Involved in rollout activities, i.e., deployment of packages.
- Prepared design documents and functional documents.
- Submitted detailed weekly reports covering daily activities.
Client: Confidential, Irving, TX
Designation: Software Developer
Environment: SQL, Core Java, Spring, JSP, JavaScript, Oracle, HTML, JDBC, RabbitMQ, Restlet web services.
DESCRIPTION:
SSP (Strategic Systems Platform) is a system launched by Confidential. It acts as a single integrated platform used by Confidential customers to order DSL, Dial, Fiber, and Video. SSP comprises modules such as Business Services, Data Services, Workflow, and Adapters. Business Services contains most of the business logic and functionality of the SSP application, while Data Services communicates with the database. Workflow is the process integration engine: a combination of a flow controller that controls the flow of the order and a status tracker that tracks the order's status at every point of the flow. Finally, the system interacts with downstream systems through adapters.
RESPONSIBILITIES:
- Requirement Analysis.
- Handled the complete Java multithreading in back-end components and Java beans.
- Implemented concurrent task execution using the Callable interface and synchronization APIs.
- Programmed Restlet web services using Java.
- Produced and consumed XML and JSON representations using the Restlet API.
- Enabled HTTPS on Restlet and consumed web feeds.
- Worked with jQuery utilities on the front end, such as jQuery AJAX.
- Performed activities such as message queue consumption and processing, error handling, parameter configuration, and routing messages via direct exchanges using RabbitMQ.
- Responsible for writing SQL scripts required for project.
- Devised workarounds and tooling to prevent orders from falling into various fallout queues and to fix those that did.
- Worked on backend databases, performed health checks on production data and monitored post production flow.
- Managed all production related incidents, changes, problems, stability projects, root cause analysis, post problem reviews, customer trends and volumes.
- Provided input into the prioritization of needed improvements.
- Identified critical issues causing repetitive business process interruptions.
- Worked on development IRs/CRs to fix glitches in production code and data.
- Prepared technical design documents.
- Coding, code review and documentation.
- Preparing Unit testing documents.
- Handling change requests.
- Implemented online and batch change requests per client requirements.
- Completed service requests on time.
- Assigned tasks to team members; this typically involved decision making, such as obtaining a revised schedule when estimates changed due to a change in requirements or development scope.
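Routing via a RabbitMQ direct exchange, as mentioned above, delivers a message only to queues bound with an exactly matching routing key. That broker-side rule can be illustrated with a plain-Java simulation (queue and key names here are made up; the real system used the RabbitMQ client API against a live broker):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DirectExchange {
    private final Map<String, List<String>> bindings = new HashMap<>(); // routing key -> bound queue names
    private final Map<String, List<String>> queues = new HashMap<>();   // queue name -> delivered messages

    // Bind a queue to this exchange with a routing key (like channel.queueBind).
    void bind(String queue, String routingKey) {
        queues.putIfAbsent(queue, new ArrayList<>());
        bindings.computeIfAbsent(routingKey, k -> new ArrayList<>()).add(queue);
    }

    // Direct-exchange rule: deliver only to queues whose binding key equals the routing key.
    void publish(String routingKey, String message) {
        for (String q : bindings.getOrDefault(routingKey, List.of())) {
            queues.get(q).add(message);
        }
    }

    List<String> messages(String queue) {
        return queues.getOrDefault(queue, List.of());
    }
}
```

Binding a dedicated "fallout" queue with its own routing key is one way error conditions can be separated from the normal order flow.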
Client: Confidential, San Ramon, CA
Designation: Java and Hadoop Developer
Environment: Hadoop (HDFS), Apache Pig, Sqoop, Hive, MapReduce, Oracle, MySQL.
DESCRIPTION:
Confidential is a rich web-based application that handles various kinds of loans provided to customers. The application provides features such as information gathering, credit score calculation, grading and scoring, and decision making.
Data mining was carried out across different fields in the USA, such as tax and finance. Based on the requirements, the data was analyzed from various aspects and valuable information, such as customer trends, was extracted. We received huge volumes of data as XML files from various legacy-system sources. We loaded the data into HDFS, wrote MapReduce jobs to convert the XML format into CSV, created Hive tables and loaded the data into them, and wrote queries to fetch information from multiple tables.
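The XML-to-CSV conversion described above can be sketched in plain Java with the JDK's DOM parser (the <record> element and field names are hypothetical; the production version ran this kind of logic inside MapReduce jobs over HDFS):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class XmlToCsv {
    // Flattens each <record> element of a (hypothetical) legacy XML feed
    // into one CSV row, with a header row listing the requested fields.
    static String convert(String xml, String... fields) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        StringBuilder csv = new StringBuilder(String.join(",", fields)).append('\n');
        NodeList records = doc.getElementsByTagName("record");
        for (int i = 0; i < records.getLength(); i++) {
            Element rec = (Element) records.item(i);
            String[] row = new String[fields.length];
            for (int j = 0; j < fields.length; j++) {
                row[j] = rec.getElementsByTagName(fields[j]).item(0).getTextContent();
            }
            csv.append(String.join(",", row)).append('\n');
        }
        return csv.toString();
    }
}
```

The resulting CSV can then be loaded into HDFS and mapped onto a Hive table for querying.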
RESPONSIBILITIES:
- Involved in scoping the application requirement, creating design, functional specification and development for the new trading system (Equities, Options and Mutual Funds) using J2EE.
- Involved in integrating Fixed income java application to get fixed income data from third party vendors
- Mentored new developers on development and architecture issues.
- Designed the application in Spring MVC and developed front-end pages with JSP, jQuery AJAX tags, and JavaScript, using WSAD.
- Implemented trade balance logic using Restful Web Services.
- Handled Java multithreading in common Java classes and libraries.
- Involved in developing complex SQL queries in Oracle (PL/SQL).
- Optimized system performance by writing stored procedures.
- Designed and developed the front end with JSP, JavaScript, HTML, TagLibs and CSS.
- Used AJAX for asynchronous data transfer, to show/hide details in Portfolio screens
- Involved in production support for portfolio, trading and market data applications.
- Used WSAD for writing code for Java, Servlets, JSP and JavaScript.
- Used SVN as the version Control System.
- Developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
- Imported and exported data into HDFS and Hive using Sqoop.
- Experienced in defining job flows
- Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
- Experienced in managing and reviewing Hadoop log files
- Used Pig as ETL tool to do transformations, event joins and some pre-aggregations before storing the data onto HDFS.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Responsible to manage data coming from different sources
- Supported MapReduce programs running on the cluster.
- Involved in loading data from UNIX file system to HDFS.
- Installed and configured Hive and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Performed unit testing.
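A Hive UDF like those mentioned above wraps a plain evaluate-style method; the core logic can be shown without the Hive scaffolding (which requires the hive-exec dependency and a class extending UDF). The normalization rule below is purely illustrative, not from the actual project:

```java
public class NormalizeUdfLogic {
    // Core of a hypothetical Hive UDF that normalizes free-text source-system
    // codes before records are joined across tables: trim the value, uppercase
    // it, and collapse internal whitespace runs to a single underscore.
    // In Hive, this body would live in evaluate() on a class extending UDF.
    static String normalize(String raw) {
        if (raw == null) return null; // Hive UDFs must tolerate NULL inputs
        return raw.trim().toUpperCase().replaceAll("\\s+", "_");
    }
}
```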
Client: Confidential, Chicago, IL
Designation: Software Engineer
Environment: Struts2, Hibernate3, JDBC, Spring, HTML, JSP, JavaScript, Servlets, Log4j, J2EE, Oracle, MySQL.
DESCRIPTION:
Confidential is a web-based application that provides customers with information for the online purchase of statistical academic reports, analytical summaries, professional development books and media, and student achievement metric reports related to higher education and professional and vocational education across 42 states. The application provides features such as tracking issues and requirements raised by customers, along with their queries regarding the products.
The Confidential project stores terabytes of supply chain logistics transaction information generated by the system. These transaction files have to be parsed against sets of rules defined in various XML files. Initially these rules were loaded into a database, and retrieving them was time-consuming; by building the solution on the open-source big data software Hadoop, we reduced the time for the whole process. The data is stored in the Hadoop file system and processed using MapReduce and Pig scripts: getting the data from the websites, processing the files to analyze all the logs, extracting various reports from the information, and exporting the information for further processing, in order to meet client requirements by delivering comprehensive data, advanced analytics, and decision support.
RESPONSIBILITIES:
- Involved in analysis and high-level design for use cases to create/maintain Individuals, user roles and service agreements.
- Implemented a custom MVC-based architecture using JSP and Servlets.
- Involved in the performance tuning of the system using Java and Application server memory management, connection pooling along with operating system features in a Linux environment.
- Worked on all maintenance and enhancement requests for this application.
- Involved in the design and development of Dealnavigator, a very large web-based application used for underwriting.
Environment: J2EE, JSP, JavaScript, Oracle, Spring, JDBC, Servlets, Log4j.
- This application provides features such as contact summary, account summary, loan amount, customer credit score, and decision making.
- Analyzed functional requirements and business requirement documents.
- Coordinated activities with the data modeling, architecture, and requirements teams.
- Provided day-to-day production support and maintenance for Dealnavigator.
- Developed code to interact with database using JDBC API and wrote stored procedures using PL/SQL, Cursors and Triggers to invoke real time database call processing.
- Involved in conducting code reviews and design reviews for junior developers using Redmine.
- Involved in the performance tuning of the system
- Also involved in mentoring new team members, code reviews, and design reviews.
Environment: Core Java, Spring MVC, Spring Transactions, XML, JSP, JSTL, Oracle, Eclipse, and ANT.
- Analyzed the requirements to set up a cluster.
- Prepared Design documents specifications
- Prepared Unit Test documents and code review documents
- Applied MySQL query optimization techniques for faster processing and retrieval of data.
- Participating in user requirement sessions to gather business requirements.
- Sharing knowledge with new members within the team.
Client: Confidential
Designation: Java Developer
Environment: Core Java 1.3, JSP, CSS, JavaScript, HTML, Servlets, Oracle 7, JBoss, Edit Plus, Ant, TOAD.
DESCRIPTION:
The project provides complete solutions for managing marketing and sales. The application keeps track of the prospect database, installation base, sales leads, marketing executives' performance, company sales pipeline, etc., and offers excellent data analysis and flexible querying facilities.
RESPONSIBILITIES:
- Analyzing the Functional Specifications and Design Documents
- Involved in development, applying expertise in J2EE, JSP, HTML, and JDBC.
- Involved in the presentation layer using JSP, HTML, and JavaScript technologies.
- Wrote PL/SQL Queries for backend database interaction.
- Utilized message driven beans for message processing using JMS.