Project Lead - Big Data Developer Resume
Boston, MA
PROFESSIONAL SUMMARY:
- 12.3 years of IT industry experience across the full software development life cycle, including Requirement Analysis, Design, Development, Testing, Implementation, and Maintenance, primarily in Java/J2EE technologies.
- Worked as a Research Fellow with the National Informatics Centre for strategic clients, and, as part of Confidential, with one of the leading investment banks and one of the leading automotive companies in North America.
- Proficient in MapReduce, Hive, Impala, YARN, Sqoop, Oozie, and core Java concepts such as threads, exception handling, generics and collections, and strings.
- Extensively used Java/J2EE design patterns for object-oriented analysis and design.
- Experience in developing applications using frameworks such as Struts 1.2 and Hibernate.
- Acquainted with Java RESTful web services deployed in the cloud as IaaS and PaaS.
- Used Hibernate and JDBC to connect to databases such as Oracle and MySQL (a minimal JDBC sketch follows this summary).
- Experience in unit testing applications using JUnit within the Cloud framework.
- Implemented and built many applications in Java using IDEs such as Eclipse and RAD.
- Deployed many applications using WebSphere Application Server and Apache Tomcat.
- Good hands-on experience fixing performance issues and stabilizing Java applications/products.
- Used ANT to build and deploy J2EE Applications.
- Used GIT Sync and Build Forge for deployment of J2EE applications.
- Worked in both Windows and Linux environments.
- Acquainted with project and quality management, with good exposure to planning, estimation, etc.
- Experience in managing a team of 10+ members.
- Have experience working in both Agile and Waterfall software development methodologies.
- Technically focused and self-motivated professional with strong analytical and communication skills.
- Good hold on design and analysis of algorithms and data structures, a good understanding of data mining concepts, and in-depth knowledge of operating systems.
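A minimal sketch of the JDBC usage mentioned above; the Oracle connection URL, credentials, and ORDERS table are hypothetical, for illustration only:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderLookup {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle thin-driver URL and credentials, for illustration only.
        String url = "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1";
        try (Connection conn = DriverManager.getConnection(url, "app_user", "app_password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT order_id, status FROM orders WHERE customer_id = ?")) {
            ps.setLong(1, 42L); // bind the customer id
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("order_id") + " -> " + rs.getString("status"));
                }
            }
        }
    }
}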
TECHNICAL SKILLS:
Hadoop Distribution: Cloudera (CDH4&5)
Big data Ecosystem: MapReduce, Hive, Impala, YARN, Sqoop, Oozie
Languages/Technologies: Java/J2EE, JDBC, Java Web Services, SQL, C, C++
Frameworks: Swing, Struts 1.2, Hibernate, Cloud
Process Automation Tools: Ant, JUnit, Git, Jenkins
Databases: Oracle, DB2, SQL Server
Web/App Servers: IBM WebSphere, Apache Tomcat
IDEs/Tools: Eclipse, RAD, ClearCase
Functional/Domain Knowledge: Java/J2EE design patterns in object-oriented analysis and design, Manufacturing, Investment Banking, Project
PROFESSIONAL EXPERIENCE:
Project Lead - Big Data Developer
Confidential, Boston, MA
Responsibilities:
- Interacted with the business team to gather requirements.
- Implemented a robust data pipeline using Oozie, Sqoop, Hive, and Impala for an external client's operational reporting.
- Extensively involved in the design phase and delivered design documents.
- Developed Oozie workflows for data ingestion, using Sqoop imports into Hive tables via HCatalog.
- Worked with several compression techniques (Snappy, zip) and standard file formats (Parquet, ORC) to achieve optimum performance in Tableau reporting.
- Collected performance metrics from the Impala and Tableau sides to analyze end-to-end performance and to set up benchmarks across the different file formats, compression codecs, and denormalized mart tables.
- Worked on configuring the performance tuning parameters used during the benchmark.
- Proficient in data modeling with Hive partitioning, bucketing, and other optimization techniques to design warehousing infrastructure on top of HDFS data (a Hive DDL sketch follows this list).
- Built custom wrapper shell scripts that call a Java API to check data availability on the Oracle side and then invoke the corresponding workflows. Good knowledge of executing Spark SQL queries against data in Hive (see the Spark sketch after this list).
- Set up the Autosys scheduler to trigger jobs on a specific day and time.
- Worked on various POCs for performance optimizations such as using the distributed cache for small datasets, partitioning and bucketing in Hive, and map-side joins when writing MapReduce jobs.
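As referenced in the data modeling bullet above, a minimal sketch of a partitioned, bucketed Hive mart table created through the HiveServer2 JDBC driver; the endpoint, database, table, and column names are hypothetical:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateMartTable {
    public static void main(String[] args) throws Exception {
        // Hypothetical HiveServer2 URL; requires the hive-jdbc driver on the classpath.
        String url = "jdbc:hive2://hive-host:10000/reporting_db";
        try (Connection conn = DriverManager.getConnection(url, "etl_user", "");
             Statement stmt = conn.createStatement()) {
            // Partition by load date, bucket by customer id, store as Snappy-compressed Parquet.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS orders_mart (" +
                "  order_id BIGINT, customer_id BIGINT, amount DOUBLE) " +
                "PARTITIONED BY (load_date STRING) " +
                "CLUSTERED BY (customer_id) INTO 32 BUCKETS " +
                "STORED AS PARQUET " +
                "TBLPROPERTIES ('parquet.compression'='SNAPPY')");
        }
    }
}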
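And a minimal Spark SQL sketch for querying the Hive-managed data mentioned above; the application name and table are hypothetical:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveQueryJob {
    public static void main(String[] args) {
        // enableHiveSupport() lets Spark resolve tables through the Hive metastore.
        SparkSession spark = SparkSession.builder()
                .appName("hive-reporting-check")
                .enableHiveSupport()
                .getOrCreate();
        Dataset<Row> counts = spark.sql(
                "SELECT load_date, COUNT(*) AS row_count " +
                "FROM reporting_db.orders_mart GROUP BY load_date");
        counts.show(); // print the per-partition row counts
        spark.stop();
    }
}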
Project Lead
Confidential, MA
Responsibilities:
- Interacted with the business team to gather requirements.
- Worked on the C Real-Time Agent, where messages transferred from the mainframe system as packet services are handled through an extraction process.
- Developed Shell scripts to automate file manipulation and data loading procedures.
- Developed shell scripts to FTP, SFTP, and NDM files from the client location to the GPA system (a Java-based SFTP sketch follows this list).
- Involved in daily Agile stand-up calls.
- Involved in Agile backlog grooming and planning sessions to create the stories in RTC.
- Involved in Agile retrospective meetings.
- Involved in Agile demo sessions and demonstrated the completed stories for each sprint.
- Explained the stories to SQA and facilitated their testing of the stories.
- Worked on end-to-end deployment to production and supported PROD whenever needed.
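The file transfers above were implemented as shell scripts; the following is only a rough Java equivalent of the SFTP step, sketched with the JSch library and hypothetical host, credentials, and paths:

import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SftpPush {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        // Hypothetical client host and credentials, for illustration only.
        Session session = jsch.getSession("gpa_user", "client-host", 22);
        session.setPassword("secret");
        session.setConfig("StrictHostKeyChecking", "no"); // demo only; verify host keys in real use
        session.connect();
        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        sftp.put("/data/outbound/feed.dat", "/inbound/feed.dat"); // local path -> remote path
        sftp.disconnect();
        session.disconnect();
    }
}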
Confidential, MA
Java/J2EE Developer
Responsibilities:
- Involved in the complete software development life cycle with an object-oriented approach, adhering to the client's business processes and continuous client feedback.
- Redesigned GPA and migrated the WebSphere application to a Tomcat server application.
- Migrated all the P&A components to a higher version for Oracle 12c Exadata.
- Prepared the test plan and performed unit testing of the developed/modified programs.
- Set up BUAT for all the components in P&A and GPA.
- Provided UAT/BUAT support.
- Used Git/ClearCase for source control and Ant for build scripting.
- Involved in knowledge transition to all the stakeholders and the L2 production support team.
Environment: Java/J2EE, Servlets, JSP, Shell Script, Autosys, IBM RAD 6.0, JUnit, Oracle 10g, IBM WebSphere Application Server, Windows.
Confidential, MA
Java/J2EE Developer
Responsibilities:
- Worked on designing and developing an application in an Agile development environment.
- Understood the existing Performance and Analytics architecture and implemented it in the Cloud framework.
- Re-engineered the M-TIER to the IDF service interfaces.
- Migrated all existing system calls into service calls using XML over HTTP with IDF web services, and all inline queries into database-encapsulated stored procedures.
- Implemented a standard logging function based on Log4j (a minimal sketch follows this list).
- Communicated with the Adobe Flex 4.5 user interface through XML.
- Developed a new connector module to facilitate the connection (socket/HTTP) between the Middle Tier, Result Manager, and Warehouse Manager.
- Built the script for running the application in the Cloud environment.
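A minimal sketch of a standard logging function of the kind described above, based on Log4j 1.x; the wrapper class name is hypothetical:

import org.apache.log4j.Logger;

public final class AppLogger {
    private AppLogger() { }

    // One logger per calling class so the output carries the class name.
    public static Logger get(Class<?> caller) {
        return Logger.getLogger(caller);
    }

    // Uniform error entry point so every module reports failures the same way.
    public static void error(Class<?> caller, String message, Throwable cause) {
        Logger.getLogger(caller).error(message, cause);
    }
}

Callers would then write, for example, AppLogger.get(ConnectorModule.class).info("connected"), where ConnectorModule is any caller class.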
Environment: Java/J2EE, CDT, Cloud (RESTful), JUnit, JDBC, Oracle 10g, Windows.
Confidential, OR
Java/J2EE Developer
Responsibilities:
- Requirements analysis of new enhancements for SCARF Phase II.
- Fixed code defects on or before the stipulated time, including on-the-fly fixes.
- Coded based on the Model-View-Controller pattern to validate data against business rules and retrieve data from the DB.
- Mentored new joiners and conducted technical sessions and knowledge transitions.
- Interacted with the onsite/offshore counterparts to understand the requirements and issues.
- Generated reports in PDF and Excel formats.
- Customized the application as per client requirements.
Environment: Java/J2EE, Spring 2.5, Hibernate 2.5, Struts 1.2, JMS, JavaScript, Apache Maven, Rational ClearCase, Eclipse, Apache Tomcat, XML, WSDL, SOAP, Oracle, Windows.
Confidential
Java/J2EE Developer
Responsibilities:
- Understood the business requirements and the characteristics of the intended solution.
- Responsible for analysis of change requests, problem identification and fixing of root causes, bug fixing, debugging, and code review.
- Parametric modelling of solids such as Solid Slab, Solid Loft, and Solid Sweep.
- Implemented RMI concepts to access server-side objects from remote clients (a minimal RMI sketch follows this list).
- Used JNI to call into the native application (Open Cascade) and libraries, which are written in C++ (see the JNI sketch after this list).
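A minimal RMI sketch of the server-side object access described above; the service interface and names are hypothetical:

import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// Remote interface that clients look up over RMI.
interface ModelService extends Remote {
    String describeSolid(String name) throws RemoteException;
}

// Server-side implementation exported as a remote object.
class ModelServiceImpl extends UnicastRemoteObject implements ModelService {
    ModelServiceImpl() throws RemoteException { super(); }
    public String describeSolid(String name) throws RemoteException {
        return "Solid: " + name;
    }
}

public class RmiServer {
    public static void main(String[] args) throws Exception {
        Registry registry = LocateRegistry.createRegistry(1099); // default RMI registry port
        registry.rebind("modelService", new ModelServiceImpl());
        System.out.println("ModelService bound; remote clients can now call describeSolid().");
    }
}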
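And a minimal JNI sketch showing how a Java class can declare a native method backed by a C++ library such as Open Cascade; the native library and method names are hypothetical:

public class OpenCascadeBridge {
    static {
        // Loads the hypothetical native wrapper library (libocbridge.so / ocbridge.dll).
        System.loadLibrary("ocbridge");
    }

    // Declared in Java, implemented in C++ against the native geometry kernel.
    public native double solidVolume(String shapeId);

    public static void main(String[] args) {
        double volume = new OpenCascadeBridge().solidVolume("slab-01");
        System.out.println("Volume = " + volume);
    }
}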
Environment: Java/J2EE, Core Java, Swing, RMI, JNI, JDBC, XML, CVS, vi editor, Linux, PostgreSQL, Open Cascade, MySQL.