
Technical Lead / Hadoop Developer Resume


New York

SUMMARY

  • 14 years of experience in the end-to-end product development life cycle at Confidential.
  • 2 issued patents, 4 patent applications, 3 published disclosures
  • 3 years of experience in Big Data ecosystem technologies and over 5 years of experience in Java-related technologies.
  • Cloud computing technical evangelist promoting Confidential technologies through articles, conference talks, etc.
  • Excellent understanding/knowledge of Hadoop architecture and the MapReduce programming paradigm, as well as Pig, Hive, Oozie, and YARN.
  • Proficient in installation and configuration of Hadoop, HDFS, and Sqoop.
  • Experience importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience with Hadoop/HDFS commands, writing Java MapReduce programs, and verifying, managing, and reviewing Hadoop log files.
  • Experience with major Hadoop distributions from Cloudera and Hortonworks.
  • Good experience in design and development of various enterprise applications using J2EE technologies such as JDBC, JMS, JNDI, XML, and Web Services.
  • Experienced in building OSGi plug-in based applications.
  • Experience in creating RESTful (JAX-RS) web services and in using XML, SOAP, WSDL, and UDDI in Java.
  • Experienced in scripting languages such as Perl, Python, shell, and MS-DOS batch.
  • Good knowledge in Configuration Management tools like Rational ClearCase, Confidential CMVC.
  • Experience in programming languages like C, C++, Java, Visual C++ and Windows SDK.
  • Experience using Oracle and Sybase databases and writing complex SQL queries.
  • Comfortable troubleshooting UNIX (AIX and Linux) and Windows systems.
  • Strong team player with the ability to work both independently and in a team, adaptability to rapidly changing environments, and a commitment to continuous learning.

TECHNICAL SKILLS

PROGRAMMING LANGUAGE: Java, C, C++

OPERATING SYSTEMS: Windows, AIX

RDBMS: Oracle, Sybase, DB2

SCRIPTING LANGUAGES: Perl, shell scripts, Python

GUI DEVELOPMENT: Win32 SDK, VC++, Eclipse, .Net, Java

TECHNOLOGY: Hadoop, Sqoop, Hive, XML, J2EE/Web Services (JAX-WS/JAX-RS (REST)), systems management, virtualization, cloud computing, host integration, Microsoft Windows systems programming

APPLICATION SERVERS: WebSphere, BEA WebLogic

METHODOLOGY: Plug-in development, Structured, Procedural and Object Oriented Methodologies

IDE TOOLS: Eclipse, MS Visual Studio, Rational Development platform

PROFESSIONAL EXPERIENCE

Technical Lead / Hadoop Developer

Confidential, New York

Responsibilities:

  • National Grid is an electricity and gas utility company serving the Northeast region of the US. As part of its smart grid strategy, National Grid rolled out smart meters to some of its customers. Smart meters relay large volumes of meter data from individual customer locations back to the hub. This data can be used for purposes such as dynamic pricing, quicker outage notification, and better-informed customer decisions. Hadoop and other big data tools were evaluated for this purpose.
  • Data from smart meters was loaded into HDFS using Sqoop. MapReduce programs processed this data to derive the meter metrics used in power-usage calculations. The processed results were consumed by Hive, scheduling applications, and various other BI reports through multi-dimensional data warehouse models.
  • Imported and exported data between HDFS and relational database systems using Sqoop.
  • Developed Java MapReduce programs for custom processing (a sketch follows this list).
  • Created Hive tables and wrote queries in HiveQL (a hedged example follows the Environment line below).
  • Developed Oozie workflows to schedule the various Hadoop jobs.
  • Used Pig (Pig Latin) scripts for ad hoc data retrieval.
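
A minimal sketch of the kind of custom Java MapReduce job described above, totaling kWh per meter. The input layout (comma-separated meter ID, timestamp, reading), class names, and output are assumptions for illustration, not the original code.

// Sketch only: assumes each HDFS input line is "<meterId>,<timestamp>,<kWh>".
// Field positions, class names, and the output (total kWh per meter) are illustrative.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MeterUsageJob {

    // Emits (meterId, kWh) for every well-formed reading.
    public static class UsageMapper extends Mapper<LongWritable, Text, Text, DoubleWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length < 3) {
                return; // skip malformed records
            }
            try {
                double kwh = Double.parseDouble(fields[2]);
                context.write(new Text(fields[0]), new DoubleWritable(kwh));
            } catch (NumberFormatException ignored) {
                // skip records with a non-numeric reading
            }
        }
    }

    // Sums readings per meter; the same pattern can drive other usage calculations.
    public static class UsageReducer extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text meterId, Iterable<DoubleWritable> readings, Context context)
                throws IOException, InterruptedException {
            double total = 0.0;
            for (DoubleWritable reading : readings) {
                total += reading.get();
            }
            context.write(meterId, new DoubleWritable(total));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "meter-usage-totals");
        job.setJarByClass(MeterUsageJob.class);
        job.setMapperClass(UsageMapper.class);
        job.setCombinerClass(UsageReducer.class);
        job.setReducerClass(UsageReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}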

Environment: Hadoop, Sqoop, Hive, Pig, Oozie, Oracle
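
For the Hive step, a hedged example of the table definition and query pattern, issued through Hive's JDBC driver to keep the example in Java. The connection URL, table, and column names are illustrative assumptions, not the original schema.

// Sketch only: connection URL, table, and column names are assumptions.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MeterHiveQueries {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver (hive-jdbc jar must be on the classpath).
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection("jdbc:hive2://hive-server:10000/default", "", "");
             Statement stmt = conn.createStatement()) {

            // External table over the MapReduce output directory on HDFS (tab-delimited key/value).
            stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS meter_usage ("
                    + " meter_id STRING, total_kwh DOUBLE)"
                    + " ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'"
                    + " LOCATION '/data/meter_usage'");

            // Example HiveQL aggregation of the kind consumed by downstream BI reports.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT meter_id, total_kwh FROM meter_usage ORDER BY total_kwh DESC LIMIT 10")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getDouble(2));
                }
            }
        }
    }
}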

Team Leader

Confidential, New York

Responsibilities:

  • Worked on scheduling software using Java/J2EE and Oracle in UNIX and Windows environments. These applications are used at an enterprise level for managing electric and gas leak work orders, which are dispatched to field crews in real time, with completion information passed back to the originating systems. This enterprise solution involves handling huge volumes of customer work orders and interfacing with different corporate and mobile units using multiple middleware technologies.
  • Implementation involved Java, scripts in the proprietary ASA framework, TopLink, and MQ for communication between components (a hedged messaging sketch follows this list).
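
A minimal sketch of the MQ-based component communication mentioned above, using the standard JMS API. The JNDI names and the XML payload are assumptions, and the proprietary ASA framework pieces are not shown.

// Sketch only: JNDI names and the message payload are assumptions.
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class WorkOrderPublisher {

    // Sends a completed-work-order notification back to the originating system via MQ.
    public void publishCompletion(String workOrderXml) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/WorkOrderConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/WorkOrderCompletionQueue");

        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            TextMessage message = session.createTextMessage(workOrderXml);
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}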

Environment: Java/J2EE, Oracle, WebLogic and iPlanet application servers, middleware technologies

Confidential

Component Leader

Responsibilities:

  • As component lead, I was responsible for analysis, design, development, testing, and implementation of all Confidential POWER system endpoints in the appliance. Confidential Smart Analytics System (ISAS) is an appliance that provides broad analytics capabilities on a POWER systems warehouse foundation and storage. Deeply integrated and optimized, it provides a single point of support for end-to-end analytics solutions at enterprise scale. These systems and the analytic applications are managed by Confidential Systems Director. I designed and developed the interfaces for the POWER system endpoints in ISAS, including the AIX OS, the HMC, and the physical server within the appliance. Customers use the ISAS system to develop analytic applications with Confidential Cognos and Confidential SPSS.

Environment: Rational Application Developer, AIX, Python scripting for automation

Component Leader

Confidential

Responsibilities:

  • This project showcases cloud computing capabilities using Confidential POWER systems and storage, and was done in collaboration with Confidential Research. It involved automated provisioning, resource allocation, and de-allocation of pre-built images that include the AIX operating system along with other software such as DB2 and WebSphere. I developed the interfaces that interact with the HMC, NIM, and VIOS; these scripts were used by Tivoli Provisioning Manager workflows to drive provisioning and de-provisioning.

Environment: Perl scripting, cloud-type infrastructure on POWER systems, AIX, Rational Team Concert, J2EE

Manager

Component Leader

Responsibilities:

  • WPAR Manager is a tool that allows an administrator to create, clone, and remove WPAR (workload partitions within a logical partition) definitions, or start and stop them, from an easy-to-use management interface. It also provides the checkpoint/restart enablement needed to relocate a WPAR from one system to another using Live Application Mobility. A set of APIs accomplishes the core functionality described above; my work involved exposing these services as REST services (a hedged sketch follows this list).
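
A hedged sketch of how existing WPAR operations could be exposed as JAX-RS REST services. The resource paths and the WparManager interface standing in for the underlying WPAR APIs are illustrative assumptions, not the product's actual code.

// Sketch only: paths and the WparManager abstraction are assumptions.
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/wpars")
public class WparResource {

    // Hypothetical wrapper over the core WPAR APIs; wired up by the hosting application.
    private final WparManager manager;

    public WparResource(WparManager manager) {
        this.manager = manager;
    }

    // List all workload partitions known to the manager as an XML document.
    @GET
    @Produces(MediaType.APPLICATION_XML)
    public Response listWpars() {
        return Response.ok(manager.listAllAsXml()).build();
    }

    // Start a WPAR by name; maps the existing start API onto a REST call.
    @POST
    @Path("/{name}/start")
    public Response start(@PathParam("name") String name) {
        manager.start(name);
        return Response.noContent().build();
    }

    // Stop a WPAR by name.
    @POST
    @Path("/{name}/stop")
    public Response stop(@PathParam("name") String name) {
        manager.stop(name);
        return Response.noContent().build();
    }
}

// Hypothetical interface representing the core WPAR management functions.
interface WparManager {
    String listAllAsXml();
    void start(String name);
    void stop(String name);
}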

Environment: OSGi, Java, JAX-RS (REST), Rational Application Developer, AIX, Eclipse programming, Rational Software Architect

Lead Developer

Confidential

Responsibilities:

  • Confidential Systems Director is a multi-platform management solution. It helps automate data center operations and provides unified management of Confidential servers, storage, and network devices, as well as physical and virtual resources, with a consistent look and feel. USMi is the underlying API layer that provides the management capabilities. I was involved in exposing the USMi management layer through a command-line interface and a web services interface (JSR-172).

Environment: OSGi, J2EE, JSR-172 web services, Eclipse plug-in development, Rational Application Developer, AIX

Developer

Confidential

Responsibilities:

  • Confidential Personal Communications provides traditional access to data and applications on host systems such as AS/400, mainframe, and VT terminals. I was the component lead, designing and developing features in the logical terminal, which represents the host screen and its keyboard functions. My responsibilities included product development and product support for Confidential Personal Communications, covering scope management, planning, tracking, and change control for my components.
  • GITAM is an operating environment with a total Tamil interface, consisting of three modules: a kernel layer, GUI subsystems, and applications and utilities. I worked in the applications module of GITAM and developed the GITAM virus scanner (a signature-based scanner), GITAM clean disk (which cleans a specified drive by deleting unwanted files, which can also be specified by the user), and the GITAM backup utility (which compresses files and backs them up to a specified location). This was awarded the best student project by the Council for Scientific and Industrial Research (CSIR), India, in 2000.

Environment: C/C++, DJGPP (32 Bit C/C++ Compiler)
