Sr. Java Developer Resume Profile
VA
Summary:
- Over 8 years of experience in software development, including 2 years in all phases of Hadoop and HDFS development.
- Expertise in batch-processing ETL, data warehousing, and real-time solutions moving data from various MPP sources to Hadoop.
- Excellent experience with MapReduce programming.
- Experience installing, configuring, and maintaining Hadoop clusters and all ecosystem components involved in the Hadoop ETL process.
- Hands-on experience with the Hadoop ecosystem: MapReduce, Hive, Pig, Sqoop, Flume, HBase.
- Executed several ETL projects and successfully migrated them to Hadoop.
- Extensive experience designing and developing enterprise applications using J2EE technologies such as Servlets, JSP, JSTL, Ajax, MVC frameworks (Struts, JSF, Spring), RMI, EJB, JDBC, Hibernate, and Web Services.
- Strong experience in the functional use and deployment of applications on web/application servers such as WebSphere, WebLogic, JBoss, and Tomcat.
- Expertise in the design and development of applications using the Struts and JSF Model-View-Controller frameworks.
- Experience in SOA (Service-Oriented Architecture) and Web Services technologies: Apache Axis, SOAP, WSDL, UDDI.
- Experience implementing Core Java/J2EE design patterns such as Singleton, Factory, Service Locator, Business Delegate, DAO, and Session Facade.
- Experience working with XML, XSLT, XSL-FO, HTML, DHTML, JavaScript, CSS, and Ajax.
- Experience in RDBMS development using Oracle, Sybase, and SQL Server; comfortable with DB2, MySQL, and MS Access; strong back-end development experience with SQL, PL/SQL, and stored procedures.
- Experience developing applications using IDEs such as Eclipse, IBM WSAD and RAD, WebLogic Workshop, and JDeveloper.
- Experience defining job flows and scheduling using Autosys.
- Experience analyzing data with Hive, Pig, and Hadoop Streaming.
- Developed user-defined functions (UDFs) for Hive and Pig.
- Experience loading log data into HDFS using scripts, Sqoop, Flume, and named pipes.
- Experience supporting data analysts in running Pig and Hive queries.
- Experience with databases: Teradata, MySQL, and Netezza.
- Designed and developed workflows and automated jobs using Oozie.
- Designed and developed an Oozie re-run capability using shell scripting.
- Experienced in developing coordinator job flows using Oozie.
- Used Sqoop to import and export data between HDFS and relational database systems.
- Imported data from relational database systems into Hive tables and vice versa.
- Experienced in writing shell and Ant scripts for builds and deployments to different environments; worked with Network File System (NFS), FTP services, and mail services.
- Backed up and restored logs and databases.
- Experience installing, configuring, and maintaining Datameer and integrating it with a Hadoop cluster.
- Designed and developed user interfaces using JSP, J2EE, XML, XSL, XSLT, HTML, DHTML, and CSS.
- Used SAX and DOM parsers to traverse XMLHttpRequest responses.
- Experienced with a full suite of infrastructure services: DHCP, PXE, DNS, Kickstart, and NFS.
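The MapReduce programming model referenced above can be illustrated with a minimal in-memory word-count sketch in plain Java. This is a toy illustration of the map (emit word counts) and reduce (sum per key) phases, not code from any project listed here; the class and method names are hypothetical.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy word count illustrating the map and reduce phases of MapReduce.
public class WordCountSketch {

    // "Map" phase: emit a count of 1 per word; "reduce" phase: sum the
    // counts per word (the two phases are merged here for brevity).
    public static Map<String, Integer> countWords(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            for (String word : line.toLowerCase().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum); // reduce step
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
                countWords(List.of("hadoop hive pig", "hive hive sqoop"));
        System.out.println(counts.get("hive")); // 3
    }
}
```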
Technical Skills
Confidential
Sr. Java Developer
Responsibilities:
- Served as Hadoop engineer responsible for migrating legacy retail ETL applications to Hadoop.
- Developed automated scripts for all jobs, from pulling data off mainframes to loading it into mid-range Linux systems and vice versa, using NDM and FTP.
- Designed and developed ETL applications and automated them with Oozie workflows and shell scripts, including error handling and email notification.
- Designed and built Hadoop solutions for big data problems.
- Set up Pig, Hive, and HBase on multiple nodes and developed applications using Pig, Hive, HBase, and MapReduce.
- Developed MapReduce applications using Hadoop and HBase.
- Implemented MapReduce and Hive solutions for converting file formats such as fixed-block and EBCDIC data.
- Used compression techniques (Snappy) with appropriate file formats to reduce storage use in HDFS.
- Worked with the solution-architecture team to determine which Hadoop ecosystem tool best fit each application.
- Successfully migrated a legacy application to a Hive/Pig-based big data application at the production level.
- Wrote Java UDFs for Hive and Pig to provide functionality not present in the Hadoop stack.
- Identified where MapReduce Java programs were applicable and wrote them.
- Hands-on work experience with Pig.
- Benchmarked Pig against Hive to choose the right tool for each use case.
- Experience back-feeding data to Teradata using Sqoop.
- Performed daily MySQL backups with disaster-recovery synchronization.
- Designed Oozie templates and sub-workflows to reuse development work.
- Implemented an Oozie re-run capability.
- Experienced working with Flume and the JSON file format.
- Automated deployment scripts for promoting builds to higher environments.
Environment: Hadoop 1.x, Hive, Pig, HBase, Sqoop, Flume, Spring, jQuery, Java, J2EE, HTML, JavaScript, Hibernate
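The EBCDIC/fixed-block conversion work described above can be sketched in plain Java using the JDK's built-in EBCDIC charset support. Cp037 (EBCDIC US/Canada) is one common mainframe code page used here as an assumption; the actual code page depends on the source system, and the class name is illustrative.

```java
import java.nio.charset.Charset;

// Minimal sketch of EBCDIC <-> ASCII record conversion, the kind of
// file-format translation performed when landing mainframe data in HDFS.
public class EbcdicConvertSketch {

    // Cp037 = EBCDIC US/Canada; chosen as an illustrative code page.
    private static final Charset EBCDIC = Charset.forName("Cp037");

    // Decode a fixed-length EBCDIC record into a Java String.
    public static String toAscii(byte[] ebcdicRecord) {
        return new String(ebcdicRecord, EBCDIC);
    }

    // Encode a String back to EBCDIC bytes (e.g. for round-trip checks).
    public static byte[] toEbcdic(String text) {
        return text.getBytes(EBCDIC);
    }

    public static void main(String[] args) {
        byte[] record = toEbcdic("HELLO");
        System.out.println(toAscii(record)); // HELLO
    }
}
```

In a real pipeline this decoding would run inside a MapReduce record reader or a Hive SerDe rather than a standalone class.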
Confidential
Sr. Java Developer/Hadoop Developer
Responsibilities:
- Gathered requirements and coordinated with business users.
- Involved in various phases of the SDLC, including requirements gathering, modeling, analysis, and design.
- Integrated web technologies such as JSP, JSF, and AJAX.
- Used server-side technologies such as Spring as the framework to integrate components into the system.
- Configured Hibernate to communicate with the database.
- Used XML-based web services.
- Implemented AJAX to develop asynchronous web applications on the client side.
- Supported the release management team by scripting the automation of the Java release process.
- Implemented the JAXB API to simplify access to XML files from Java applications.
- Involved in monitoring applications on WebSphere Application Server.
- Deployed various components of the application to WebSphere server.
- Managed information using BladeLogic Server Automation and integrated it into J2EE environments.
- Implemented data access using the Hibernate persistence framework.
- Developed the configuration files and classes specific to Spring and Hibernate.
- Utilized the Spring framework for bean wiring following dependency-injection principles.
- Expertise in server-side J2EE technologies including Java, J2SE, JSP, Servlets, XML, Hibernate, Struts, Struts 2, JDBC, and JavaScript.
- Utilized J2EE architecture, MVC architecture, and design patterns.
- Designed the GUI using the Model-View-Controller architecture (Struts framework).
- Integrated Spring DAO for data access using Hibernate.
- Created Hibernate mapping files to map POJOs to database tables.
- Involved in the development of Spring Framework controllers.
- Performed unit testing for all components using JUnit.
- Designed and developed the XSDs for the WSDL.
- Developed the user interface using JSP, JSP tag libraries (JSTL), HTML, CSS, and JavaScript to simplify the complexities of the application.
- Configured and deployed applications into test and production environments.
Environment: JDK 1.5, WebSphere Application Server 6.1, JSP 2.0, Hibernate 3, Spring 2.5, IBM DB2, Web Services (XML, JAXB), AJAX, BladeLogic Server.
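The Spring DAO / Hibernate data-access pattern mentioned above can be sketched framework-free: callers depend on a DAO interface, so a Hibernate-backed implementation can be swapped for an in-memory one in tests. All names here (AccountDao, InMemoryAccountDao) are hypothetical, not taken from any project listed.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// DAO pattern sketch: business code programs against the interface,
// decoupling it from the persistence technology behind it.
interface AccountDao {
    void save(long id, String owner);
    Optional<String> findOwner(long id);
}

// Stand-in implementation; a real project would back this with Hibernate.
class InMemoryAccountDao implements AccountDao {
    private final Map<Long, String> table = new HashMap<>();

    @Override
    public void save(long id, String owner) {
        table.put(id, owner);
    }

    @Override
    public Optional<String> findOwner(long id) {
        return Optional.ofNullable(table.get(id));
    }
}

public class DaoSketch {
    public static void main(String[] args) {
        AccountDao dao = new InMemoryAccountDao();
        dao.save(42L, "alice");
        System.out.println(dao.findOwner(42L).orElse("none")); // alice
    }
}
```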
Confidential
Java/J2EE Developer
Responsibilities:
- Implemented the application using the Struts framework, which is based on the Model-View-Controller design pattern.
- Used the Struts Validator and Tiles frameworks in the presentation layer.
- Developed the user interface using JSP, AJAX, JSP tag libraries, and Struts tag libraries to simplify the complexities of the application.
- Used EJBs in the application and developed session beans to house business logic in the middle tier.
- Involved in writing PL/SQL stored procedures using PL/SQL Developer.
- Tested, debugged, and implemented the application, using JUnit for unit testing.
- Developed the presentation layer using HTML, CSS, JSP, JSTL, Struts taglibs, JavaScript, and AJAX.
- Designed the presentation tier using the Struts framework and the Command pattern for the middle tier.
- Developed the application using the Struts MVC framework integrated with Hibernate.
- Used the Struts Validator framework for server-side and client-side validation.
- Used Hibernate as the object-relational mapping tool to map Java objects to database tables.
- Used the Struts Tiles framework to develop the user interface.
- Developed web pages to display account transactions and related account details using DHTML, JSF, and CSS.
- Configured Spring to manage Actions as beans, set their dependencies in a Spring context file, and integrated the middle tier with the Struts web layer.
- Wrote JavaScript validations for the fields of the user registration and login screens.
- Designed and developed JSF components and implemented event handling using Java, JSF, AJAX, and JavaScript for various modules.
- Used log4j for the logging mechanism.
- Configured JDBC and LDAP security in the application server.
- Developed LDAP server configuration files to enable encryption support for password storage.
- Implemented JSF converters to handle formatting and localization, and configured faces-config.xml to create the web application's navigation rules.
Environment: Java, J2EE, JSP, Struts 2.0, Servlets 2.3, HTML, XML/XSL, XSLT, AJAX, JavaScript, DB2, Eclipse IDE, TestNG, ANT, Windows, Subversion version control.
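The kind of server-side field validation that the Struts Validator framework provides declaratively can be sketched in plain Java. The rules below (email mask, username length) are illustrative assumptions only, not the project's actual validation rules, and the class name is hypothetical.

```java
import java.util.regex.Pattern;

// Sketch of server-side field validation for a registration/login form.
public class RegistrationValidatorSketch {

    // Simplified email mask; real-world email validation is looser/stricter
    // depending on requirements.
    private static final Pattern EMAIL =
            Pattern.compile("^[\\w.+-]+@[\\w-]+\\.[\\w.]+$");

    public static boolean isValidEmail(String value) {
        return value != null && EMAIL.matcher(value).matches();
    }

    public static boolean isValidUsername(String value) {
        // Require 3-20 word characters, mirroring a typical mask rule.
        return value != null && value.matches("\\w{3,20}");
    }

    public static void main(String[] args) {
        System.out.println(isValidEmail("user@example.com")); // true
        System.out.println(isValidUsername("ab"));            // false
    }
}
```

In Struts these rules would live in validation.xml and be paired with generated client-side JavaScript; the server-side check remains authoritative.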
Confidential
Java Developer
Responsibilities:
- Developed the application using Hibernate and the Spring Framework.
- Developed the presentation layer using Spring MVC and used annotation-based mapping to map JSP postbacks to controller methods.
- Used Spring AOP for security and transaction management.
- Developed a rich user interface using HTML, JSP, JSTL, JavaScript, jQuery, and CSS.
- Configured the WebLogic application server and deployed the web components to it.
- Used Oracle as the back-end database.
- Used Hibernate 3.2 to communicate with the database, mapping entities to tables and columns using Hibernate annotations.
- Developed HQL queries to implement select, insert, update, and delete operations against the database by creating HQL named queries.
- Used SQL Developer to query, update, and monitor the Oracle database.
- Wrote SQL statements to create and update database tables.
- Improved performance by using connection pooling to reuse database connections within the session and reduce the number of database round trips.
- Involved in performance tuning, reviewing the code to reduce the number of database calls and tuning SQL and HQL queries for optimal performance.
- Used Maven to build and deploy the application.
- Configured and administered IBM Rational ClearCase version control to automate code access and code release management.
- Used IBM Rational ClearQuest to track defects, activities, DBCRs (database change requests), etc.
- Used HP Quality Center to track defects and maintain defect status.
- Used PuTTY for UNIX login, running batch jobs, and checking server logs.
- Involved in integration testing of the whole application.
- Used JUnit with JMock for unit testing.
Environment: Java 1.5, Spring 3.0, Hibernate 3.2, Oracle 10g, JSP, JSTL, XML, HTML5, CSS, JavaScript, WebLogic Application Server, Eclipse 3.0, JUnit, Velocity, Firebug, jQuery, AJAX, HP Quality Center 9.2.
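The connection-pooling idea described above, reusing a bounded set of connections instead of opening one per request, can be sketched with a simple bounded queue. Plain strings stand in for real JDBC connections, and the class name is illustrative; a production system would use a library pool rather than a hand-rolled one.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal connection-pool sketch: a bounded queue of reusable handles
// that are checked out and returned rather than created per request.
public class PoolSketch {

    private final BlockingQueue<String> pool;

    public PoolSketch(int size) {
        pool = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            pool.add("conn-" + i); // stand-in for an opened connection
        }
    }

    // Borrow a handle; returns null if the pool is exhausted
    // (a real pool would block, grow, or time out instead).
    public String acquire() {
        return pool.poll();
    }

    // Return the handle so another caller can reuse it.
    public void release(String conn) {
        pool.add(conn);
    }

    public int available() {
        return pool.size();
    }

    public static void main(String[] args) {
        PoolSketch p = new PoolSketch(2);
        String c = p.acquire();
        System.out.println(p.available()); // 1
        p.release(c);
        System.out.println(p.available()); // 2
    }
}
```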