Sr. Java / Hadoop Developer Resume
Rochester, MN
SUMMARY
- An information technology professional with 8+ years of industry experience as a Big Data technical consultant.
- Expertise in Java EE (J2EE) technologies such as Servlets, JSP, EJB, JNDI, JPA, JTA, JMS, JDBC, JAXP, JAXB, and XML.
- Extensively worked on n-tier architecture systems with application development using Java, JDBC, Servlets, JSP, EJB, JMS, Web services, Spring, Struts MVC, JSF, Hibernate 4.0, XML, JAXP and JAXB.
- Excellent skills in analysis, design, and development of J2EE applications built on MVC architecture.
- Developed multi-tier web applications and server-side business logic using J2EE, XML, WebSphere, WebLogic, Apache Tomcat, Enterprise Java Beans, Servlets, JSP, Struts, JDBC, DB2, Oracle, and PL/SQL.
- In-depth understanding/knowledge of Hadoop architecture and its components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and MapReduce.
- Experienced in Waterfall & Agile development methodology.
- Expertise in writing Hadoop jobs for analyzing data using MapReduce, Hive and Pig.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice versa.
- Experienced in extending Hive and Pig core functionality by writing custom UDFs using Java.
- Experience with developing large-scale distributed applications.
- Good understanding of Data Mining and Machine Learning techniques.
- Experienced in NoSQL databases such as HBase and MongoDB.
- Experienced in job workflow scheduling and monitoring tools like Oozie and ZooKeeper.
- Knowledge of administrative tasks such as installing Hadoop and its ecosystem components such as Hive and Pig.
- Experienced in developing applications using Java/J2EE technologies such as Servlets, JSP, EJB, JDBC, JNDI, and JMS.
- Experienced in developing applications using HIBERNATE (Object/Relational mapping framework).
- Experienced in developing Web Services using JAX-RPC, JAXP, SOAP and WSDL. Also knowledgeable in using WSIF (Web Services Invocation Framework) API.
- Thorough knowledge and experience of XML technologies (DOM, SAX parsers), and extensive experience with XPath, XML Schema, DTDs, XSLT, XMLSpy, and the MapForce editor.
- Experience in Message based systems using JMS, TIBCO & MQSeries.
- Experience in writing database objects like Stored Procedures, Triggers, SQL, PL/SQL packages and Cursors for Oracle, SQL Server, DB2 and Sybase.
- Proficient in writing build scripts using Ant & Maven.
- Experienced in using CVS, SVN, and SharePoint as version managers.
- Proficient in unit testing applications using JUnit and MRUnit, and in application logging using Log4j.
TECHNICAL SKILLS
Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, PowerPivot, Puppet, Oozie, ZooKeeper
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans; IDEs: Eclipse
Big data Analytics: Datameer 2.0.5
Frameworks: Struts, Hibernate, Spring
Programming languages: C, C++, Java, Python, Ant scripts, Linux shell scripts
Databases: Oracle 11g, MySQL, DB2, MS-SQL Server
Web Servers: WebLogic, WebSphere, Apache Tomcat
Web Technologies: HTML5, CSS3, XML, JavaScript, AJAX, SOAP, WSDL
PROFESSIONAL EXPERIENCE
Confidential, Rochester, MN
Sr. JAVA / Hadoop Developer
Responsibilities:
- Worked on big data infrastructure build-out for batch processing as well as real-time processing.
- Installed and configured Hive, Hadoop, Hue, Oozie, Pig, Sqoop, Storm, Kafka, Elasticsearch, Redis, Java/J2EE, HDFS, XML, PHP, ZooKeeper and Flume on the Hadoop cluster.
- Designed and developed SOAP and REST interfaces with Java.
- Developed REST services to talk to adapter classes and exposed them to the AngularJS front-end.
- Managed thousands of MongoDB (NoSQL) databases totaling 50+ TBs.
- Developed enhancements to MongoDB architecture to improve performance and scalability.
- Collaborated with development teams to define and apply best practices for using MongoDB.
- Worked on Hadoop, Hive, Oozie, and MySQL customization for batch data platform setup.
- Worked on implementation of a log producer in Scala that watches for application logs, transforms incremental logs, and sends them to a Kafka- and ZooKeeper-based log collection platform.
- Hands on experience with MVC JavaScript frameworks such as Backbone.js, Angular.js and Node.js.
- Implemented a data export application to fetch processed data from these platforms to consuming application databases in a scalable manner.
- Involved in various stages of Software Development Life Cycle (SDLC) deliverables of the project using the Agile software development methodology.
- Exposure to Visual Basic, C++, and Groovy
- Extensive experience in J2EE, C++, C, Java Servlets, Java Swing, AWT, JSPs, XML/XSL, DHTML, Oracle, JDBC, UNIX and MS Windows NT/98/2K.
- Experienced in multithreaded programming in C++ and C#.
- Gathered and clarified requirements with business analyst to feed into high-level customization design, development and installation phases.
- Experience working with Selenium, QC, Rally, QTP, LoadRunner, JMeter, Fiddler, SOAP UI, REST/SOAP testing and API testing.
- Designed the architecture of the project as per the Spring MVC Framework.
- Designed and developed backend application servers using Python.
- Created custom user defined functions in Python language for Pig.
- Developed Python Mapper and Reducer scripts and implemented them using Hadoop streaming.
- Deployed the production site using Apache 2.0 with mod_python.
- Worked with Spring Core, Spring AOP, Spring Integration Framework with JDBC.
- Developed user interface using HTML, JSP, Servlets, CSS, JavaScript and Ajax.
- Used CSS exclusively for modifying the layout and design of the web pages.
- Developed Data Access Objects (DAO's) for easy data access.
- Developed GUI with HTML, XHTML, AJAX, CSS3 and JavaScript (jQuery).
- Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
- Participated in the requirement gathering and analysis phase of the project, documenting the business requirements by conducting workshops/meetings with various business users.
- Ongoing POC work using Spark and Kafka for real-time processing.
- Developed a data pipeline using Kafka and Storm to store data into HDFS.
- Populated HDFS and Cassandra with huge amounts of data using Apache Kafka.
- Ongoing POC work comparing the Cassandra and HBase NoSQL databases.
- Worked with NoSQL databases like Cassandra and MongoDB for POC purposes.
- Implemented a POC with Hadoop, extracting data into HDFS with Spark.
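The Python mapper/reducer work for Hadoop Streaming mentioned above can be sketched as follows; this is a minimal illustrative example (counting hits per key in tab-delimited logs), and the field layout and file names are hypothetical, not from the actual project.

```python
# Minimal sketch of Hadoop Streaming mapper/reducer logic in Python.
# Under Hadoop Streaming these run as two scripts reading stdin and printing
# "key\tvalue" lines, e.g.:
#   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py \
#       -input /logs -output /counts
from itertools import groupby

def mapper(lines):
    # Emit a (key, 1) pair for each input record, keyed on the first field.
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            yield fields[0], 1

def reducer(pairs):
    # Hadoop delivers mapper output grouped and sorted by key;
    # sum the counts for each key.
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield key, sum(count for _, count in group)
```

In a real streaming job the mapper and reducer live in separate files and print tab-separated lines; factoring the logic into generator functions, as here, keeps it unit-testable with MRUnit-style local assertions.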
Environment: MapReduce, HDFS, Hive, Pig, Hue, Oozie, Solr, Core Java, Eclipse, HBase, Flume, Spark, Scala, Kafka, Cloudera Manager, Cassandra, Python, Greenplum DB, IDMS, VSAM, SQL*Plus, Toad, PuTTY, Windows NT, UNIX Shell Scripting, Pentaho, Talend, YARN, MongoDB
Confidential, NYC, NY
Sr. JAVA / Hadoop Developer
Responsibilities:
- Devised and led the implementation of the next-generation architecture for more efficient data ingestion and processing.
- Developed many modules & functionalities using JSP, Spring MVC, Spring IOC, Spring AOP, Spring Validation & Hibernate.
- Developed the J2EE application based on the Service-Oriented Architecture.
- Strong experience in front-end technologies like JSP, HTML, jQuery, JavaScript, CSS.
- Mentored and onboarded new engineers who were not proficient in Hadoop, getting them up to speed quickly.
- Proficiency with modern natural language processing and general machine learning techniques and approaches.
- Architected and designed the RESTful web services and developed core component layers such as XML validation, the core service layer, Solr search, and transformation components.
- Responsible for business logic using Java and JavaScript, with JDBC for querying the database.
- Extensive experience with Hadoop and HBase, including multiple public presentations about these technologies.
- Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
- Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
- Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes, resolved technical problems, and suggested solutions via a lambda architecture.
- Created custom user-defined functions in Python for Pig.
- Managed application deployment using Python.
- Upgraded Python 2.3 to Python 2.5, which required recompiling mod_python against Python 2.5.
- Worked on NoSQL databases including HBase, MongoDB, and Cassandra.
- Created and injected Spring services, Spring controllers and DAOs to achieve dependency injection and to wire objects of business classes.
- Used Spring bean inheritance to develop beans from already developed parent beans.
- Worked on Spring Quartz functionality for scheduling tasks such as generating monthly reports for customers and sending them mails about different policies.
- Used the DAO pattern to fetch data from the database using Hibernate to carry out various database operations.
- Used Hibernate Transaction Management, Hibernate Batch Transactions, and cache concepts.
- Modified the controller and service classes to support the introduction of the Spring framework.
- Developed various generic JavaScript functions used for validations.
- Developed screens using jQuery, JSP, JavaScript, AJAX and ExtJS.
- Developed screens using HTML5, CSS, JavaScript, jQuery and AJAX.
- Used Rational Application Developer (RAD), which is based on Eclipse, to develop and debug application code.
- Performed server provisioning through template-based clones (VMware).
- Performed virtual machine configuration and performance monitoring using VMware Virtual Center Console 5.0.
- Created user-friendly GUI interfaces and web pages using HTML, AngularJS, jQuery and JavaScript.
- Used Log4j utility to generate run-time logs.
- Involved in a full life cycle Object Oriented application development - Object Modeling, Database Mapping, GUI Design
- Developed Functional Requirement Document based on users' requirement.
- Introduced data ingestion using Spark Streaming for page views by reading an event stream over HTTP and grouping it into 1-second intervals. The stream is then transformed into a DStream of (URL, 1) pairs called ones, and a running count of these is performed using the runningReduce operator.
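The running-count logic described above can be sketched in plain Python; note this mirrors only the logic of folding 1-second batches of (URL, 1) pairs into a cumulative count, not Spark Streaming's actual DStream API, and the function names are illustrative assumptions.

```python
# Plain-Python sketch of a stateful running count over micro-batches,
# analogous to what runningReduce does over a DStream of (URL, 1) pairs.
from collections import Counter

def to_ones(urls):
    """Transform a batch of page-view URLs into (URL, 1) pairs ('ones')."""
    return [(url, 1) for url in urls]

def running_count(batches):
    """Fold successive 1-second batches into a cumulative per-URL count,
    emitting a snapshot of the state after each interval."""
    state = Counter()
    snapshots = []
    for batch in batches:
        for url, one in to_ones(batch):
            state[url] += one
        snapshots.append(dict(state))
    return snapshots
```

In Spark Streaming the same effect is achieved by keeping the per-key state inside the framework rather than in a local Counter, so the count survives across batch boundaries and worker failures.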
Environment: Hadoop, Linux, CDH4, MapReduce, HDFS, Hive, Pig, Shell Scripting, Sqoop, Java 7, NoSQL, Eclipse, Oracle 11g, Maven, Log4j, Mockito, Git, ATG eCommerce, Spring, Apache Kafka, Apache Spark, Logstash, Elasticsearch, Solr.
Confidential, Chicago, IL
JAVA / Hadoop Developer
Responsibilities:
- Experience working with Selenium, QC, Rally, QTP, LoadRunner, JMeter, Fiddler, SOAP UI, REST/SOAP testing and API testing.
- Expertise in MVC Architecture using JSF and Struts framework and implementing custom tag libraries.
- Good experience in creating and consuming Restful Web Services.
- Developed the application using the Struts Framework, which is based on the MVC design pattern.
- Deployed the application on a WebLogic Application Server cluster in a Solaris environment.
- Deployed EJB Components on WebLogic.
- Created REST web services for the management of data using Apache CXF.
- Development of AJAX toolkit based applications using JSON.
- Developed additional UI components using JSF and implemented an asynchronous, AJAX (jQuery) based rich client to improve customer experience.
- Involved in the development of the presentation layer and GUI framework using ExtJS and HTML. Client-side validations were done using JavaScript.
- Involved in adding AJAX and JavaScript components to some of the JSP pages wherever needed.
- Developed the user interface using JSP, AJAX, JSP tag libraries and Struts tag libraries to simplify the complexities of the application.
- Developed the user interface using JSP, JSTL, custom tag libraries and AJAX to speed up the application.
- Developed Servlets and JSPs based on MVC pattern using Struts framework and Spring Framework.
- Worked on the Data Services implementation for the CRUD services.
- Developed the UML use case, activity, sequence and class diagrams using Rational Rose.
- Developed Oracle PL/SQL Stored Procedures and Queries for Payment release process and authorization process.
- Developed programs for accessing the database using the JDBC thin driver to execute queries, prepared statements and stored procedures, and to manipulate the data in the database.
- Involved in debugging the product using Eclipse and JIRA bug tracking.
- Involved in JUnit testing of various modules by generating the test cases.
- Configured Maven dependencies for application building processes.
- Developed XSD for validation of XML request coming in from Web Service.
- Responsible to manage data coming from different sources.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from UNIX file system to HDFS.
- Involved in creating Hive tables, loading them with data, and writing Hive queries, which run internally as MapReduce jobs. Involved in designing a production process for extracting the final data and forwarding it to end users on an as-needed basis (i.e. daily, weekly, monthly, quarterly, and yearly).
- Setup and benchmarked Hadoop/HBase clusters for internal use
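The prepared-statement pattern used in the JDBC work above can be sketched as follows; the example uses Python's sqlite3 module purely so it is self-contained (the actual work used the Oracle JDBC thin driver), and the table and column names are hypothetical.

```python
# Sketch of the parameterized-query (prepared statement) pattern.
# JDBC's PreparedStatement uses the same "?" placeholders: values are bound
# separately from the SQL text, avoiding string concatenation and injection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, payee TEXT, amount REAL)")

# Parameterized inserts: one statement, many bound rows.
rows = [("ACME", 125.50), ("Globex", 80.00)]
conn.executemany("INSERT INTO payments (payee, amount) VALUES (?, ?)", rows)

# Parameterized query, e.g. for a payment-release threshold check.
cur = conn.execute("SELECT payee, amount FROM payments WHERE amount > ?", (100.0,))
result = cur.fetchall()
```

The same separation of SQL text from bound values is what lets the driver reuse the compiled statement across executions, which matters for the batch-style database access described above.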
Environment: Java 7, Eclipse, Teradata, SVN, Hadoop, Hive, Pig, HBase, Unix, Map Reduce, HDFS, Sqoop, Hadoop Map Reduce, Windows 7, UNIX Shell Scripting, Cassandra
Confidential
Hadoop/ Java Developer
Responsibilities:
- Worked on analyzing data and writing Hadoop MapReduce jobs using the Java API, Pig and Hive.
- Used Java, HTML, JDBC, JSP, Ant, JUnit, XML, JavaScript, and a proprietary Struts-like system.
- Developed on Tomcat for WebLogic deployment; tools included Ant, JUnit, DBUnit, HttpUnit, Visual SourceSafe, and Scarab.
- Development of the front end (client side) using JSP, CSS, jQuery and JavaScript.
- Development of the back end (server side) using Core Java and Java EE.
- Persisted data to the database, leveraging Hibernate and SQL Server 2008.
- Used Spring Core for middle tier development to achieve inversion of control
- Created database Objects like tables, Views and Indexes in Oracle Database
- Responsible for creating releases and deployments into development, implementation and production servers.
- Designed and developed backend application servers using Python.
- Created custom user-defined functions in Python for Pig.
- Managed application deployment using Python.
- Upgraded Python 2.3 to Python 2.5, which required recompiling mod_python against Python 2.5.
- Participated in database design/analysis and designed ER diagrams
- Designed and implemented user interfaces
- Followed the MVC structure and used AngularJS to develop a single-page application.
- Responsible for tracking issues to get them resolved by interacting with customers and the various other teams involved.
- Implemented business logic at the server side in Core Java and Java EE architecture.
- Responsible for building scalable distributed data solutions using Hadoop.
- Involved in loading data from edge node to HDFS using shell scripting.
- Worked on installing the cluster, commissioning and decommissioning of DataNodes, NameNode high availability, capacity planning, and slot configuration.
- Created HBase tables to store variable data formats of PII data coming from different portfolios.
- Implemented a script to transmit sysprin information from Oracle to HBase using Sqoop.
- Implemented test scripts to support test driven development and continuous integration.
- Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required.
- Loaded and transformed large sets of structured, semi-structured and unstructured data.
- Managed and reviewed Hadoop log files.
- Responsible for cluster maintenance: adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
- Supported in setting up QA environment and updating configurations for implementing scripts with Pig and Sqoop.
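A custom Python UDF for Pig, as mentioned above, can be sketched like this; the function name, schema, and PII-masking behavior are hypothetical examples, not the project's actual UDFs.

```python
# Sketch of a Python (Jython) UDF for Pig. Registered from a Pig script with:
#   REGISTER 'udfs.py' USING jython AS myudfs;
#   masked = FOREACH pii_data GENERATE myudfs.mask_pii(account_no);
try:
    from pig_util import outputSchema   # available when run under Pig's Jython
except ImportError:
    def outputSchema(schema):           # no-op fallback so the file runs standalone
        def wrap(func):
            return func
        return wrap

@outputSchema("masked:chararray")
def mask_pii(value):
    """Mask all but the last four characters of a PII field."""
    if value is None:
        return None                     # Pig passes null fields through as None
    text = str(value)
    if len(text) <= 4:
        return "*" * len(text)
    return "*" * (len(text) - 4) + text[-4:]
```

Keeping the UDF a pure function of its input, with the `pig_util` import guarded, makes it testable outside the cluster before registering it in a Pig job.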
Environment: Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Linux Red Hat.
Confidential
J2EE Developer
Responsibilities:
- Successfully completed the architecture, detailed design and development of modules; interacted with end users to gather, analyze and implement the project requirements.
- Developed applications that enable the public to review the Inventory Management system.
- Established schedule and resource requirements by planning, analyzing and documenting the development effort, including timelines, risks, test requirements and performance targets.
- Analyzing System Requirements and preparing System Design document.
- Developed a dynamic user interface with HTML and JavaScript using JSP and Servlet technology.
- Designed and developed a sub system where Java Messaging Service (JMS) applications are developed to communicate with MQ in data exchange between different systems.
- Used the Java Message-Oriented Middleware (MOM) API for sending messages between clients.
- Used JMS elements for sending and receiving messages.
- Used Hibernate for mapping Java classes to database tables.
- Wrote PL/SQL & SQL in Oracle Database for creating tables, indexes, triggers and query statements
- Design and develop enterprise web applications, for internal production support group, using Java (J2EE), Design Patterns and Struts framework.
- Tuning and Index creation for improved performance.
- Designed and developed database schema for new applications.
- Involved in various phases of the Software Development Life Cycle (SDLC) of the application, like requirement gathering, design, analysis and code development.
- Developed a prototype of the application and demonstrated it to business users to verify the application functionality.
- Design, develop and implement MVC Pattern based Keyword Driven automation testing framework utilizing Java, JUnit and Selenium WebDriver.
- Used automated scripts and performed functionality testing during teh various phases of teh application development using Selenium.
- Prepared user documentation with screenshots for UAT (User Acceptance testing).
- Developed and implemented the MVC architectural pattern using the Struts Framework, including JSP, Servlets, EJB, Form Bean and Action classes.
- Implemented server side tasks using Servlets and XML.
- Helped develop page templates using the Struts Tiles framework.
- Implemented the Struts Validation Framework for server-side validation.
- Developed JSPs with custom tag libraries for control of the business processes in the middle tier and was involved in their integration.
- Implemented Struts Action classes using Struts controller component.
- Developed Web services (SOAP) through WSDL in Apache Axis to interact with other components.
Environment: Rational Application Developer 6.0, Rational Rose, Java, J2EE, JDBC, EJB, JSP, EL, JSTL, JUnit, PMD Tool, XML, SOAP, WSDL, SOA, WebLogic and Sun Solaris