Project Lead Resume

Phoenix, AZ

PROFESSIONAL SUMMARY:

  • 8+ years of experience in the design, development, maintenance and support of Java/J2EE applications, including Big Data and the Hadoop ecosystem.
  • Working knowledge of multi-tiered distributed environments and OOAD concepts, with a good understanding of the Software Development Lifecycle (SDLC).
  • Experience working in environments using Agile development and Kanban support methodologies.
  • Hands on experience with the Hadoop stack (MapReduce, HDFS, Sqoop, Flume, Pig, Hive, HBase, Oozie and Zookeeper).
  • Knowledge of Apache Spark in a Scala environment.
  • Experience with the MapR distribution and the Hortonworks Data Platform (HDP).
  • Experience importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java; extended Hive and Pig core functionality with custom UDFs (a small UDF sketch follows this list).
  • Knowledge of configuring and administering Hadoop clusters using major distributions such as Apache Hadoop and Cloudera.
  • Good understanding of HDFS design, daemons and HDFS high availability (HA).
  • Worked on UNIX/Linux operating systems and developed various shell scripts.
  • Extensive experience in Java/J2EE programming - JDBC, Servlets, JSP, JSTL.
  • Expert knowledge of J2EE design patterns such as MVC, Front Controller, Session Facade, Business Delegate and Data Access Object for building J2EE applications.
  • Experienced in developing MVC framework based websites using JSF, Struts and Spring.
  • Experience building web applications using Spring Framework features such as MVC (Model View Controller) and ORM, with object-relational mapping via Hibernate.
  • Experience deploying and configuring IBM WebSphere, BEA WebLogic and Apache Tomcat.
  • Good knowledge of IDE tools such as Eclipse, MyEclipse, JBuilder, Rational Application Developer (RAD) and Rational Software Architect (RSA) for Java/J2EE application development.
  • Expertise in database development using SQL and PL/SQL in Oracle, DB2 and SQL Server environments.
  • Experience using Ant and Maven for build automation.
  • Experience designing, developing and implementing E-Commerce and B2B applications using J2EE technologies in the Telecom, Banking and Insurance domains.
  • Versatile team player with good communication, analytical, presentation and interpersonal skills.
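
For context on the custom UDF work noted above, here is a minimal sketch of a Hive UDF in Java; the class name and masking rule are invented for illustration, but real UDFs follow the same pattern of extending Hive's UDF base class and implementing evaluate().

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical example: masks all but the last four characters of an account number.
public class MaskAccountUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        String value = input.toString();
        if (value.length() <= 4) {
            return input;
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < value.length() - 4; i++) {
            masked.append('X');
        }
        masked.append(value.substring(value.length() - 4));
        return new Text(masked.toString());
    }
}
```

Such a class is packaged into a JAR, added with ADD JAR and registered via CREATE TEMPORARY FUNCTION before being used in HiveQL queries.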

TECHNICAL PROFICIENCY:

Hadoop Framework: Hadoop, MapR, HDP, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, ZooKeeper, Scala, Oozie, Apache Spark

Programming Languages: Java, PL/SQL, Shell Scripts

Java/J2EE Technologies: JDBC, Servlets, JSP, JSF, JMS, EJB

Web Development: HTML, DHTML, XHTML, CSS, JavaScript, AJAX

Frameworks: Struts, Hibernate, Spring

XML/Web Services: XML, XSD

Messaging Technologies: JMS

Application/Web Servers: IBM WebSphere, BEA WebLogic, Apache Tomcat

Methodologies/ Design Patterns: OOAD, OOP, UML, MVC, DAO, Factory pattern, Session Facade

Databases: Oracle, SQL Server and MySQL

IDEs: Eclipse, MyEclipse, RAD and RSA

Build Automation: Ant, Maven

Testing and Logging Frameworks: JUnit, Log4j and SLF4J

Version Control Systems: CVS, SVN, StarTeam, IBM ClearCase, IBM RTC (Rational Team Concert)

Operating Systems: Windows, Unix, Linux and CentOS

PROFESSIONAL EXPERIENCE:

Confidential, Phoenix, AZ

Project Lead

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster.
  • Set up and benchmarked Hadoop/HBase clusters for internal use.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms (a driver sketch follows this project).
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS and extracted data from MySQL into HDFS using Sqoop.
  • Analyzed the data by running Hive queries and Pig scripts to study customer behavior.
  • Implemented business logic in Hadoop by writing UDFs in Java and used various UDFs from Piggybank and other sources.
  • Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
  • Worked with application teams to install operating system and Hadoop updates, patches and version upgrades as required.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Actively involved in all phases of the SDLC, from project initiation through project delivery.
  • Extensively involved in the project planning and project initiation/discovery phases to build an effective plan for each project.
  • Created reusable components to enhance process efficiency and minimize impact to the application.
  • Diligently tracked progress over the course of the project, prepared daily reports, status reports and other documents needed in various phases, and communicated them to leadership.
  • Recommended best practices and enhancements to existing processes, and implemented technological improvements and efficiencies.
  • Analyzed current programs, including performance, diagnosis and troubleshooting of problems.
  • Coordinated release management activities with a team of 8 people.
  • Prepared design and technical specification documentation.
  • Extensively used Log4j to log regular debug and exception statements (illustrated in the driver sketch below).

Software: Hadoop, HDFS, MapReduce, Hive, Pig, Java, HBase, Sqoop, Flume, MySQL.
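
As a rough illustration of the compression tuning and Log4j logging called out in this project, the driver sketch below enables compression for intermediate map output and for the final job output. The word-count logic, codec choice and property values are assumptions, not the actual production job.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.log4j.Logger;

// Illustrative word-count driver showing map-output and job-output compression
// plus Log4j debug/error logging; names and codec choices are assumptions.
public class CompressedCountJob {

    private static final Logger LOG = Logger.getLogger(CompressedCountJob.class);

    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Compress intermediate map output to reduce shuffle I/O.
        conf.setBoolean("mapreduce.map.output.compress", true);

        Job job = Job.getInstance(conf, "compressed-count");
        job.setJarByClass(CompressedCountJob.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Compress the final output files written to HDFS.
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, GzipCodec.class);

        LOG.debug("Submitting job with compressed map and final output");
        boolean ok = job.waitForCompletion(true);
        if (!ok) {
            LOG.error("compressed-count job failed");
        }
        System.exit(ok ? 0 : 1);
    }
}
```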

Confidential, Phoenix, AZ

Technical Lead

Responsibilities:

  • Worked closely with the business as a Technical Lead.
  • Worked closely with the vendor product team, driving them to deliver patches on time and helping to resolve their technical issues.
  • Responsible for deploying the patches delivered by the product vendor team in all environments.
  • Interacted with the client for requirements gathering and was responsible for the implementation and integration of Java/J2EE application modules.
  • Responsible for keeping all environments, including Production, in sync with the update patches delivered by the third-party vendor.
  • Participated in business discussions to pick up requirement updates and was responsible for implementing them with the help of the vendor group.
  • Implemented UNIX bash shell scripts to upload files over FTP across all environments (a Java sketch of a comparable upload follows this project).

Software: Eclipse, IBM WebSphere Application Server, Java 6, Apache Axis, SoapUI
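
The actual uploads were driven by the bash scripts noted above; purely as an illustration of the transfer step, here is a comparable sketch in Java using Apache Commons Net, where the host, credentials and paths are placeholders.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

// Rough Java counterpart of the bash FTP upload step; host, credentials
// and paths are placeholders, not real environment values.
public class FtpUploadSketch {

    public static void main(String[] args) throws IOException {
        FTPClient ftp = new FTPClient();
        try {
            ftp.connect("ftp.example.com");
            if (!ftp.login("deployUser", "changeit")) {
                throw new IOException("FTP login failed: " + ftp.getReplyString());
            }
            ftp.enterLocalPassiveMode();
            ftp.setFileType(FTP.BINARY_FILE_TYPE);

            InputStream in = new FileInputStream("build/app-patch.ear");
            try {
                // storeFile returns false if the server rejects the transfer.
                if (!ftp.storeFile("/deploy/app-patch.ear", in)) {
                    throw new IOException("Upload failed: " + ftp.getReplyString());
                }
            } finally {
                in.close();
            }
            ftp.logout();
        } finally {
            if (ftp.isConnected()) {
                ftp.disconnect();
            }
        }
    }
}
```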

Confidential, Phoenix, AZ

Technical Lead

Responsibilities:

  • Worked closely with the client to gather requirements.
  • Prioritized the requirements.
  • Acted as the technical POC from Confidential.
  • Managed release coordination across different teams.
  • Resolved all requirements with the help of the offshore team.
  • Prepared and executed internal CMRs to deploy the code base in preproduction.
  • Actively participated in status calls with the client three times a week.

Software: TortoiseSVN 1.6.7, Eclipse/IBM RAD IDE, Java 5 & Java 6, SQL Developer 1.1, JMeter 2.3.2, Apache Tomcat 6.0 & 7.0, Apache ServiceMix 3.3, Apache ActiveMQ 5.5.1

Confidential, Phoenix, AZ

Technical Lead

Responsibilities:

  • Prioritized the requirements.
  • Analyzed the requirements.
  • Defined the conceptual data model.
  • Documented functional and non-functional requirements.
  • Tracked and communicated the progress of project activities and deliverables.
  • Participated in Joint Architecture Design Review activities.
  • Wrote instructions for installing the new JVM for WAS 8 and scripts to configure applications on the JVM for WAS 8.
  • Documented the Solution Architecture and Security Architecture.
  • Created the Detailed Design Document.
  • Created Assembly Test Conditions.
  • Created Component Test Conditions.

Software: J2EE, Core Java, Struts, JSP, JSTL, DHTML, DB2, and IBM WebSphere Application Server (WAS)

Confidential, Minneapolis, MN

IT Analyst

Responsibilities:

  • Participated in coding the business logic.
  • Involved in implementing requirements using Struts and DB2.
  • Involved in implementing front-end UIs using Java Swing (a small panel sketch follows this project).
  • Involved in testing in the E1 and E2 environments.
  • Involved in building and deploying the application using Python scripts.

Software: J2EE, Core Java, Struts, JSP, JSTL, DHTML, DB2, and IBM WebSphere Application Server (WAS)
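
As a flavor of the Swing front-end work mentioned above, a minimal panel might be wired roughly as follows; the frame title, labels and behavior are invented for illustration.

```java
import java.awt.BorderLayout;
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;

import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JPanel;
import javax.swing.JTextField;
import javax.swing.SwingUtilities;

// Minimal Swing sketch: a lookup form with a text field and a button.
// All labels and behavior are illustrative placeholders.
public class LookupPanelDemo {

    public static void main(String[] args) {
        SwingUtilities.invokeLater(new Runnable() {
            public void run() {
                createAndShow();
            }
        });
    }

    private static void createAndShow() {
        final JTextField idField = new JTextField(20);
        final JLabel resultLabel = new JLabel("Enter an ID and press Search");

        JButton searchButton = new JButton("Search");
        searchButton.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                // Real code would call the Struts/DB2-backed service here.
                resultLabel.setText("Searching for: " + idField.getText());
            }
        });

        JPanel form = new JPanel();
        form.add(idField);
        form.add(searchButton);

        JFrame frame = new JFrame("Lookup Demo");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.getContentPane().add(form, BorderLayout.NORTH);
        frame.getContentPane().add(resultLabel, BorderLayout.CENTER);
        frame.pack();
        frame.setVisible(true);
    }
}
```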

Confidential, Portland, OR

Analyst Programmer

Responsibilities:

  • Participated in coding the business logic for Enhancement Requests (ERs).
  • Analyzed ERs assigned to the Recovery module and acted as the single point of contact for the Recovery module when implementing ERs.
  • Responsible for preparation of test plans and component testing.
  • Involved in debugging of issues.
  • Performed component testing using test plans and supported integration, performance and end-to-end (E2E) testing (a JUnit-style sketch follows this project).
  • Prepared test cases and performed E2E testing on the Development and QA servers.

Software: J2EE, Core Java, JSP, JSTL, DHTML, DB2, IBM WAS, RSA, IBM CC, IBM CQ, IBM RTC
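
Component test conditions of the kind listed above were typically automated with JUnit (already in the skill matrix). A representative JUnit 4 skeleton, with a hypothetical class under test included only to keep the sketch self-contained, could look like this:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Before;
import org.junit.Test;

// Representative component test skeleton; RecoveryCalculator and its
// methods are hypothetical names standing in for the real module classes.
public class RecoveryCalculatorTest {

    private RecoveryCalculator calculator;

    @Before
    public void setUp() {
        calculator = new RecoveryCalculator();
    }

    @Test
    public void appliesRecoveryRateToOutstandingBalance() {
        double recovered = calculator.recoverableAmount(1000.00, 0.25);
        assertEquals(250.00, recovered, 0.001);
    }

    @Test
    public void zeroBalanceYieldsZeroRecovery() {
        assertEquals(0.0, calculator.recoverableAmount(0.0, 0.25), 0.001);
    }

    // Hypothetical class under test, defined inline for the sake of the example.
    static class RecoveryCalculator {
        double recoverableAmount(double balance, double rate) {
            return balance * rate;
        }
    }
}
```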
