
Hadoop Architect/Developer Resume


SUMMARY:

  • 12 years of experience in enterprise application development, web applications, client-server technologies, and web programming.
  • More than 5 years of experience in the design, development, analysis, maintenance, and support of Big Data analytics using enterprise Hadoop distributions.
  • Worked as a Big Data Architect designing Hadoop solutions, validating compatibility with third-party vendor products, and designing business use cases.
  • Involved in designing, building, and establishing an enterprise Big Data hub, organized into five container database channels, to preserve decades of historical and valuable data for the long term.
  • Extensively involved in building enterprise-level cluster setups for both Hortonworks and Cloudera distributions.
  • Extensively involved in planning and executing POCs to determine how Hadoop can relieve pain points and performance issues in processing, storing, analyzing, and securing sensitive data for applications struggling with the velocity, variety, veracity, and value of their data.
  • Involved in technical POCs evaluating data integration between Tableau, SAS, Spotfire, Informatica, and Hadoop clusters, processing data in either direction.
  • Experienced with distributed systems leveraging CDH 5.2.0, HDFS, and MapReduce; analyzed performance bottlenecks and recommended tuning optimizations.
  • Proficient in Apache Hadoop ecosystem components Pig, Flume, HBase, ZooKeeper, Hive, Impala, Sqoop, Solr, Spark, Kafka, and Apache Tika, with a strong understanding of HDFS architecture.
  • Strong working experience with the integration, warehousing, querying, processing, and analysis of large datasets, including sensitive and raw data.
  • Experience with Hadoop architecture and the Hadoop daemons: MapReduce, HDFS, JobTracker, TaskTracker, NameNode, and DataNode.
  • Implemented proofs of concept on the Hadoop stack and various big data analytics tools, including migrations from databases such as Teradata, Oracle, SAS data, and MySQL to Hadoop.
  • Hands-on experience in developing MapReduce jobs within the Hadoop ecosystem.
  • Experienced in job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Experience in configuring Hadoop clusters and HDFS.
  • Successfully loaded files into Hive and HDFS from MongoDB and HBase, and loaded datasets into Hive for ETL operations.
  • In-depth understanding of data structures and algorithms.
  • Experience in developing and deploying applications using WebLogic, Tomcat, and JBoss.
  • Experience with backend databases such as Oracle, DB2, MySQL, mainframe databases, and SQL Server.
  • Team player, quick learner, and self-starter with effective communication, motivation, and organizational skills, combined with attention to detail and a focus on business process improvement.
  • Hands-on experience in setting up, configuring, and using Apache Hadoop ecosystem components such as HDFS, MapReduce, ZooKeeper, Oozie, Hive, Sqoop, Pig, and Flume.
  • Expertise in writing Hadoop jobs for analyzing data using Hive and Pig.
  • Experience in writing MapReduce programs with Apache Hadoop for working with Big Data.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database management systems (RDBMS).
  • Broad experience with SQL, PL/SQL, and database concepts.
  • Experience in optimizing MapReduce jobs with combiners and partitioners to get the best results (see the sketch after this list).
  • Excellent programming skills in Java, JSP, JDBC, XML, HTML, XHTML, and JavaScript, and in developing client-server, web, and distributed applications.
  • Experience in database design, entity relationships, database analysis, programming, SQL, packages, and triggers in Oracle and SQL Server on Windows and UNIX.
  • Excellent written and oral communication, interpersonal, and presentation skills. Proven ability to work efficiently in both independent and team environments.
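
For illustration, here is a minimal sketch of the combiner/partitioner optimization mentioned above, written against the standard Hadoop MapReduce Java API. The word-count use case and the names WordCountDriver and FirstLetterPartitioner are hypothetical placeholders, not taken from any specific project; the reducer is reused as the combiner because summation is associative and commutative.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Partitioner;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Hypothetical driver illustrating combiner + custom partitioner tuning.
    public class WordCountDriver {

        // Mapper: emits (word, 1) for every token in the input split.
        public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: sums the counts; also reused as the combiner, which
        // pre-aggregates on the map side and cuts shuffle traffic.
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        // Custom partitioner: routes keys by first letter so related keys
        // land on the same reducer.
        public static class FirstLetterPartitioner extends Partitioner<Text, IntWritable> {
            @Override
            public int getPartition(Text key, IntWritable value, int numPartitions) {
                if (key.getLength() == 0) {
                    return 0;
                }
                return (Character.toLowerCase(key.charAt(0)) & Integer.MAX_VALUE) % numPartitions;
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCountDriver.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setPartitionerClass(FirstLetterPartitioner.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }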

TECHNICAL SKILLS:

Big Data Ecosystem: CDH 5.2.0, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Oozie, and Flume

Programming Languages: Java, SQL, PL/SQL, UNIX/Linux shell scripts

Web Technologies: HTML, XML, JavaScript, JSON

Framework: JUnit, log4j, Spring, Hibernate

Databases: Oracle, DB2, MySQL, HBase, MongoDB

Application Server & BI: Apache Tomcat 5.5.0, Tableau

IDEs, Utilities & Web: Eclipse, HTML, CSS, JavaScript

Operating Systems: Linux, Windows 7, UNIX

Methodologies: Agile, UML, OOP

Protocols: TCP/IP, HTTP, SOAP and HTTPS

PROFESSIONAL EXPERIENCE:

Confidential

Hadoop Architect/Developer

Responsibilities:

  • Determine feasibility, compatibility with the current system, and the system capabilities needed to integrate new acquisitions and new business functionality.
  • Confidential’s current data constituent base can be grouped into three principal categories:
  • Research, exploratory modeling, and predictive analysis.
  • Industrialized analyses such as risk modeling and analytics platforms like PRIMA and PRISM.
  • Established system-of-record-based consumption patterns (RDW marts and formal reporting functions, including financial, regulatory, and others).
  • Independently formulate detailed program specifications using structured data analysis and design methodology. Prepare project documentation when needed.
  • Independently code new programs and design tables to load and test them effectively for the given POCs on Big Data/Hadoop, using Hive, HDFS, Impala, Hue, Solr, JSON scripts, Spark, Cloudera Manager, and Cloudera Navigator to deliver complex system fixes and changes.
  • Develop detailed application designs and specifications for computer applications. Assist in a technical lead capacity during the POC phase of project development.
  • Write documentation describing program modifications, logic, and corrections. Oversee development of user manuals and operating procedures. Provide technical assistance to resolve operating issues.
  • Extract, transform, and load data from different sources to build the right solutions for Hadoop projects.
  • Designed and analyzed business use cases to provide the right solutions for all POCs used in Hadoop projects.
  • The above responsibilities are complex and involve the theoretical and practical application of highly specialized knowledge.
  • Replaced Hive’s default Derby metastore with MySQL. Executed queries using Hive and developed MapReduce jobs to analyze data.
  • Developed Pig Latin scripts to pull information from web server output files and load it into HDFS. Built Pig UDFs to preprocess the data for analysis (see the sketch after this list).
  • Developed Hive queries for the analysts. Involved in loading data from Linux and UNIX file systems into HDFS.
  • Supported setting up the QA environment and updating configurations for implementing Pig scripts.
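
As a hedged illustration of the Pig UDF work referenced above, here is a minimal Java EvalFunc that normalizes a raw field from web-server log records before analysis. The class name NormalizeField and the trim/lower-case rule are assumptions for this sketch, not the project’s actual UDF; it would be packaged into a jar, registered in Pig with REGISTER, and invoked inside a FOREACH ... GENERATE statement.

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical preprocessing UDF: trims and lower-cases a field so
    // downstream grouping keys are consistent.
    public class NormalizeField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            // Guard against empty tuples and null fields from dirty log lines.
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return input.get(0).toString().trim().toLowerCase();
        }
    }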

Environment: CDH 5.2.0, HDFS, HBase, Flume 1.4, Sqoop 1.4.3, Hive 0.13.0, Spark, Impala, Solr 1.4, Apache Tika, Python, Ubuntu (Linux), Eclipse Juno, XML, JSON, Enterprise Data Hub.

Confidential

Hadoop Developer

Responsibilities:

  • Convert the existing relational database model to the Hadoop ecosystem.
  • Generate datasets and load them into the Hadoop ecosystem.
  • Design the technical architecture workflow.
  • Optimize Hive queries with proper algorithms and build customer attributes using Hive.
  • Integrate Hive queries with Oozie.
  • Compare Hive query outputs against the existing data model outputs.
  • Run POCs on data ingestion with different tools.
  • Follow agile methodology for the entire project.
  • Design and develop MapReduce programs in Java.
  • Develop UDFs and UDAFs in Java (see the sketch after this list).
  • Orchestrate hundreds of Hive queries using Oozie workflows.
  • Analyze customer patterns based on the attributes.
  • Conduct daily scrum calls.
  • Prepare technical design documents and detailed design documents.
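
To illustrate the UDF work referenced above, here is a minimal sketch of a Java Hive UDF, assuming (given the Hive-centric workflow in this role) that Hive UDFs were the target. The masking use case and the class name MaskAccount are hypothetical, chosen only to show the technique.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical Hive UDF: masks all but the last four characters of a
    // value. In Hive it would be registered with
    //   CREATE TEMPORARY FUNCTION mask_tail AS 'MaskAccount';
    // and then called like any built-in function in a SELECT.
    public final class MaskAccount extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;      // Hive passes NULL straight through
            }
            String s = input.toString();
            if (s.length() <= 4) {
                return input;     // too short, nothing to mask
            }
            StringBuilder masked = new StringBuilder();
            for (int i = 0; i < s.length() - 4; i++) {
                masked.append('*');
            }
            masked.append(s.substring(s.length() - 4));
            return new Text(masked.toString());
        }
    }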

Environment: Hadoop 0.20 (MR1), HDFS, HBase, Flume 1.4, Sqoop 1.4.3, Hive 0.7.1, Java 1.6, Linux, Spring 3.2.3, Eclipse Juno, XML, JSON.

Confidential

Technical Lead

Responsibilities:

  • Contributed to the blueprint of the Technical Business Design Documents based on the Conceptual System Design.
  • Met with the main office to manage the critical application during the knowledge transition, shadowing, and reverse-shadowing periods.
  • Responsible for detailed planning and for preparing the system appreciation documents for several applications and all their subsets.
  • Performed business area analysis, including review of business plans, to apply performance improvement principles and practices at customer offices.
  • Identified significant problems and opportunities in clients’ operations and provided insight into clients’ systems and procedures, overall business operations, and current industry conditions.
  • Responsible for applying technical analysis and design principles to formulate detailed application plans and procedures to implement clients’ requests for new or modified functionality.
  • Analyzed clients’ requests for new or modified applications through interviews and design sessions.
  • Designed, built, and tested proposed enhancements with client interaction for verification.
  • Developed software programs using JCL, COBOL, and DB2.

Environment: Z/OS, DB2, COBOL, JCL, CICS, VSAM, SPUFI, QMF, SQL, ACF2, SHAREPOINT, XPEDITOR

Confidential

Senior Technical Analyst

Responsibilities:

  • Contributed to the blueprint of the Technical Business Design Documents based on the Conceptual System Design. Coded complex COBOL/DB2 modules that perform dynamic file allocation.
  • Provided escalated support by investigating and resolving complicated business application issues with the associated organizations.
  • Developed an expert understanding of the technical and business process flow of applications and provided recommendations for improvement.
  • Provided appropriate communication, facilitated bringing other parties together, and completed post-resolution reporting for high-severity incidents.
  • Participated in development efforts, provided input on future requirements, and informed the Service Desk and others of release notes and any known issues.
  • Worked extensively on problem investigation, analysis, and certification development for existing and new modules.
  • Set up, configured, maintained, and monitored assigned business applications and related systems, and worked with mainframe transaction facility protocols.
  • Worked with clients to gather business requirements and enhancements for the organization.
  • Involved in production batch support, including scheduling jobs, restarting them, fixing abends, and bypassing problem cases. Tracked bugs and ensured timely, defect-free delivery.
  • Coordinated with interface teams to resolve technical and business questions.

Environment: OS/390, TSO/ISPF, VSAM, COBOL, JCL, DB2, PLATINUM, CMTS, SPUFI, Toad, JIRA, SQL Explorer, MS-Office, MS Project.

Confidential

Program Analyst

Responsibilities:

  • Responsible for implementing, customizing, and integrating components of the client application.
  • Design, produce, and test proposed enhancements with client interaction for verification.
  • Monitor operation and functionality throughout the execution process by testing applications to ensure optimum user benefit, and design and configure application modifications and enhancements as necessary.
  • Plan and create web front-end applications that integrate with host-side operations.
  • Implement the integration and customization of customer-specific system packages.
  • Provide first-level production support for post-go-live operations.
  • Integrate and program middleware components.

Environment: Z/OS, DB2, COBOL, JCL, VSAM, SPUFI, QMF, SQL, ACF2, SHAREPOINT, MICROSOFT OFFICE (WORD, EXCEL, ACCESS).

Confidential

Mainframe Analyst

Responsibilities:

  • Research, update, and maintain quality testing measures and routines.
  • Assist in the planning, creation, and control of the test environment(s).
  • Identify, collect, and create test data.
  • Facilitate and participate in structured walk-throughs and peer reviews.
  • Take part in the coordination and implementation of system and assembly testing.
  • Inform the Test Architect and Team Lead of any issues that may bear upon the schedule, budget, or quality of the product and the testing process.
  • Validate fixes, execute test scripts, and record problems and issues in accordance with the project’s problem and issue management plans.
  • Handle project management, testing, and reporting of outcomes.
  • Document all testing results and maintain ABEND logs.
  • Assist and coordinate with new resources in understanding the nature of the work and the testing procedures.
  • Created conceptual designs, test approaches, test plans, and test scripts, both manually and in Mercury Quality Center and CMTS. Set up and managed test environments and test data based on the test requirements. Coordinated onshore and client communication.

Environment: Z/OS, DB2, COBOL, JCL, VSAM, SPUFI, QMF, SQL.
