
Hadoop Administrator Resume

Atlanta, GA

PROFESSIONAL SUMMARY:

  • 8+ years of experience in Big Data analytics and software development.
  • 3 years of hands-on experience with the Hadoop framework and its ecosystem, including MapReduce programming, Spark, Hive, Impala, Pig, Sqoop, HBase, Oozie, Scala, and Python.
  • Experience with Amazon, Hortonworks, and Cloudera Hadoop distributions.
  • Experience in storing and analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.
  • Experience in importing and exporting data using Sqoop from HDFS to relational database systems and vice versa.
  • Extended Hive and Pig core functionality by writing custom UDFs (a minimal sketch follows this list).
  • Architected, designed, and maintained high-performing ELT/ETL processes.
  • Tuned and monitored Hadoop jobs and clusters in a production environment.
  • Managed and reviewed Hadoop log files.
  • Good experience installing, configuring, and testing Hadoop ecosystem components.
  • Good work experience with Java, JDBC, Servlets, and JSP.
  • Proficient in Java, J2EE, JDBC, Collections, Servlets, JSP, Struts, Spring, Hibernate, JAXB, JSON, XML, XSLT, XSD, JMS, WSDL, WADL, REST, SOAP web services, CXF, Groovy, Grails, Jersey, Gradle, and EclipseLink.
  • Hands-on experience implementing MVC architecture using the Struts, Spring, Jersey, and Grails frameworks.
  • Participated in an Agile SDLC to deliver new cloud platform services and components.
  • Developed and maintained web applications on the Tomcat web server.
  • Exceptional ability to learn new technologies and to deliver under short deadlines.
  • Experience with UNIX commands and deployment of applications to servers.
  • Managed Hadoop services such as NameNode, DataNode, JobTracker, and TaskTracker.
  • Configured NameNode high availability to minimize downtime.
  • Developed backup and recovery strategies.
  • Wrote Pig scripts for data processing.
  • Took regular backups of the FSImage and edit logs.
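
A minimal sketch of the kind of custom Hive UDF referenced above, assuming the pre-Hive-2.x simple UDF API (the Hive 0.13 era listed in the skills section); the class name and normalization rule are hypothetical:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF: normalizes free-text codes before joins.
    public final class NormalizeCode extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // preserve SQL NULL semantics
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Such a UDF would be packaged in a JAR, registered with ADD JAR and CREATE TEMPORARY FUNCTION, and then called from HiveQL like any built-in function.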

TECHNICAL SKILLS:

Hadoop: Hadoop 2.2, HDFS, MapReduce, Pig 0.8, Hive 0.13, HBase 0.94, Sqoop 1.4.4, ZooKeeper 3.4.5, YARN, Solr, Elasticsearch, Kibana, Spark, Scala, Impala

Hadoop Management & Security: Hortonworks Ambari, Cloudera Manager, Apache Knox, XA Secure

Web Technologies: DHTML, HTML, XHTML, XML, XSL (XSLT, XPATH), XSD, CSS, JavaScript

Server-Side Scripting: UNIX Shell Scripting

Database: Oracle 10g, Microsoft SQL Server, MySQL, DB2, Optima, Teradata SQL

Programming Languages: Java, J2EE, JSTL, JDBC 3.0/2.1, JSP 1.2/1.1, Java Servlets, JMS, JUnit, Log4j

Web Servers: Apache Tomcat 5.x, BEA WebLogic 8.x, IBM WebSphere 6.0/5.1.1

IDE: WSAD 5.0, IRAD 6.0, Eclipse 3.5, Dreamweaver 13.2.1

OS/Platforms: Mac OS X 10.9.5, Windows 2008/Vista/2003/XP/2000/NT, Linux (all major distributions), UNIX

Client Side: JavaScript, CSS, HTML, jQuery

XML: XML, HTML, DTD, XML Schema, XPath

Methodologies: Agile, UML, Design Patterns, SDLC

PROFESSIONAL EXPERIENCE:

Hadoop Administrator

Confidential, Atlanta, GA

Environment: MapReduce, Pig, Hive, Elasticsearch, Kibana, Python, R, Scala, Sqoop

Responsibilities:

  • Responsible for gathering data from multiple sources such as Teradata, Oracle, and SQL Server.
  • Responsible for validating and cleansing the data.
  • Identified the right join logic and created valuable data sets for further analysis.
  • Worked extensively on Pig and Hive.
  • Responsible for developing custom UDFs in Pig and Hive.
  • Used the PiggyBank and DataFu collections for data analysis on Hadoop.
  • Used the Elephant Bird open-source library for data analysis.
  • Worked on Hadoop Streaming and Pig streaming.
  • Worked with the ORC and Parquet file formats.
  • Developed complex MapReduce jobs using Apache Commons components.
  • Generated reports and created graphs using Kibana and Python.
  • Worked with the NumPy, SciPy, and matplotlib modules in Python.
  • Used Elasticsearch and Kibana to visualize the reports.
  • Used bcp and Sqoop to fetch the data.
  • Used Jumbune packages for data validation.
  • Used Scala, working with immutable data and transforming various data structures with map/flatMap/fold.
  • Installed and configured MapReduce, Hive, and HDFS; implemented a CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
  • Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios (see the sketch after this list).
  • Supported code/design analysis, strategy development, and project planning.
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Assisted with data capacity planning and node forecasting.
  • Collaborated with the infrastructure, network, database, application, and BI teams to ensure data quality and availability.
  • Administered Pig, Hive, and HBase, installing updates, patches, and upgrades.
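
A minimal sketch of creating an HBase table through the Java client, assuming the HBase 0.94-era admin API listed in the skills section; the table and column-family names are hypothetical:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.client.HBaseAdmin;

    public class CreatePortfolioTable {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml from the classpath
            HBaseAdmin admin = new HBaseAdmin(conf);
            HTableDescriptor table = new HTableDescriptor("portfolio_events"); // hypothetical table name
            table.addFamily(new HColumnDescriptor("d")); // one short-named column family
            if (!admin.tableExists("portfolio_events")) {
                admin.createTable(table);
            }
            admin.close();
        }
    }

Downstream loads (Sqoop jobs or Pig scripts) would then write Puts into the table's column family.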

Hadoop Admin

Confidential, Atlanta GA

Responsibilities:

  • Worked with the Teradata analysis team to gather the business requirements.
  • Worked extensively on importing data using Sqoop and Flume.
  • Responsible for creating complex tables using Hive.
  • Created partitioned tables in Hive for better performance and faster querying.
  • Transported data to HBase using Pig.
  • Developed workflows in Oozie to automate the tasks of loading the data into HDFS and pre-processing it with Pig.
  • Experience with professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, and build processes.
  • Worked collaboratively with all levels of business stakeholders to architect, implement, and test a Big Data analytical solution drawing on disparate sources.
  • Involved in source system analysis, data analysis, and data modeling for ETL (Extract, Transform, and Load).
  • Wrote multiple MapReduce procedures to extract, transform, and aggregate data from multiple file formats, including XML, JSON, CSV, and other compressed formats.
  • Handled structured and unstructured data and applied ETL processes.
  • Developed Pig Latin scripts to extract the data from the web server output files and load it into HDFS.
  • Developed Pig UDFs to pre-process the data for analysis (a minimal sketch follows this list).
  • Developed Hive queries for the analysts.
  • Prepared developer (unit) test cases and executed developer testing.
  • Created/modified shell scripts for scheduling various data-cleansing scripts and the ETL loading process.
  • Supported and assisted QA engineers in understanding, testing, and troubleshooting.
  • Wrote build scripts using Ant and participated in the deployment of one or more production systems.
  • Provided production rollout support, which included monitoring the solution post go-live and resolving any issues discovered by the client and client services teams.
  • Designed and documented operational problems, following standards and procedures, using the JIRA software reporting tool.
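
A minimal sketch of the kind of Pig eval UDF used for pre-processing; the class name and cleansing rule are hypothetical:

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical eval function: trims and lower-cases a chararray field.
    public class CleanField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null; // pass nulls through unchanged
            }
            return ((String) input.get(0)).trim().toLowerCase();
        }
    }

In a Pig script, the packaged JAR would be loaded with REGISTER and the function invoked inside a FOREACH ... GENERATE statement.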

Environment: Apache Hadoop, Java (JDK 1.6), flat files, Oracle 11g/10g, MySQL, Windows NT, UNIX, Sqoop, Hive, Impala, Oozie.

Hadoop Admin

Confidential, Madison - WI

Responsibilities:

  • Analyzed large data sets by running Hive queries and Pig scripts.
  • Implemented a Hadoop cluster using the Cloudera distribution (CDH4).
  • Managed Hadoop services such as NameNode, DataNode, JobTracker, and TaskTracker.
  • Configured NameNode high availability to minimize downtime.
  • Developed backup and recovery strategies.
  • Wrote Pig scripts for data processing.
  • Took regular backups of the FSImage and edit logs.
  • Provided user support through meetings, presentations, and preparation of documentation.
  • Performed data migration from RDBMS to HBase using Sqoop.
  • Involved in creating Hive tables and in loading and analyzing data using Hive queries.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Developed workflows in Oozie to automate the tasks of loading the data into HDFS and pre-processing it with Pig.
  • Mentored the analyst and test teams on writing Hive queries.
  • Developed and maintained complex outbound notification applications that run on custom architectures, using diverse technologies including Core Java, J2EE, SOAP, XML, JMS, JBoss, and web services.
  • Involved in running Hadoop jobs to process millions of records of text data.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a minimal sketch follows this list).
  • Involved in loading data from the Linux file system into HDFS.
  • Responsible for managing data coming from multiple sources.
  • Experienced in running Hadoop Streaming jobs to process terabytes of XML-format data.
  • Assisted in exporting analyzed data to relational databases using Sqoop.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
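
A minimal sketch of a map-only cleaning job of the kind described above, assuming the Hadoop 2.x MapReduce API; the field layout and filter rule are hypothetical:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanRecordsJob {

        // Drops malformed CSV rows; the expected column count is a made-up example.
        public static class CleanMapper
                extends Mapper<LongWritable, Text, NullWritable, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",", -1);
                if (fields.length == 12 && !fields[0].isEmpty()) {
                    context.write(NullWritable.get(), value); // keep well-formed rows only
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "clean-records");
            job.setJarByClass(CleanRecordsJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0); // map-only cleaning pass
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }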

Environment: Hadoop, Java, JBoss, HDFS, Pig, Hive, MapReduce, Sqoop, Linux.

Sr. J2EE Consultant

Confidential, Kansas City, MO

Responsibilities:

  • Involved in designing the project structure and system design, and in every phase of the project.
  • Responsible for developing platform-related logic and resource classes, and controller classes to access the domain and service classes.
  • Involved in the development of interfaces and services for the App Builder functionality.
  • Involved in technical discussions, design, and workflow.
  • Participated in requirement gathering and analysis.
  • Used JAXB to unmarshal XML into Java objects (a minimal sketch follows this list).
  • Developed unit test cases using the JUnit framework.
  • Implemented data access using Hibernate and wrote the domain classes to generate the database tables.
  • Involved in implementing view pages based on XML attributes using plain Java classes.
  • Involved in integrating the App Builder and UI modules with the platform.
  • Used GitHub as the code repository.
  • Used Gradle as the build tool.
  • Implemented jQuery and Ajax for form submissions and design.
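
A minimal JAXB unmarshalling sketch of the pattern referenced above; the AppDefinition type, its field, and the XML file name are hypothetical:

    import java.io.File;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Unmarshaller;
    import javax.xml.bind.annotation.XmlAttribute;
    import javax.xml.bind.annotation.XmlRootElement;

    public class JaxbExample {

        @XmlRootElement(name = "app")
        public static class AppDefinition { // hypothetical domain class
            @XmlAttribute
            public String name;
        }

        public static void main(String[] args) throws Exception {
            JAXBContext ctx = JAXBContext.newInstance(AppDefinition.class);
            Unmarshaller u = ctx.createUnmarshaller();
            AppDefinition app = (AppDefinition) u.unmarshal(new File("app.xml"));
            System.out.println(app.name); // field populated from the XML attribute
        }
    }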

Environment: JAXB, JUnit, XML, GitHub, Gradle, jQuery, Ajax.

Java Developer

Confidential - Houston, TX

Responsibilities:

  • Involved in the design, coding, and testing phases of the software development cycle.
  • Designed use-case, sequence, and class diagrams (UML).
  • Developed rich web user interfaces using JavaScript (pre-developed library).
  • Created modules in Java, C++, and Python.
  • Developed JSP pages with the Struts framework, custom tags, and JSTL.
  • Developed Servlets, JSP pages, beans, and JavaScript, and worked on integration.
  • Developed a SOAP/WSDL interface to exchange usage, image, and terrain information from Geomaps.
  • Developed unit test cases for the classes using JUnit.
  • Developed stored procedures to extract data from the Oracle database.
  • Developed and maintained Ant scripts for builds in the testing and production environments.
  • Involved in building and parsing XML documents using a SAX parser (a minimal sketch follows this list).
  • Developed the application with strict adherence to J2EE best practices.
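
A minimal SAX parsing sketch of the pattern referenced above, using the standard javax.xml.parsers API; the file name, element name, and attribute are hypothetical:

    import java.io.File;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    public class UsageSaxParser {
        public static void main(String[] args) throws Exception {
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            // "usage.xml" and the "record" element are placeholders for illustration.
            parser.parse(new File("usage.xml"), new DefaultHandler() {
                @Override
                public void startElement(String uri, String localName,
                                         String qName, Attributes attrs) {
                    if ("record".equals(qName)) {
                        System.out.println("record id=" + attrs.getValue("id"));
                    }
                }
            });
        }
    }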

Environment: Java, C++, Python, JavaScript, Struts, Spring, Hibernate, SQL/PLSQL, Web Services, WSDL, Linux, UNIX.

Java and Salesforce Consultant

Confidential, Chicago, IL

Responsibilities:

  • Analyzed requirements and was involved in the development of all modules.
  • Interacted with the client to gather requirements.
  • Created custom controllers and standard controllers in Visualforce.
  • Customized different page layouts and assigned them to different profile users.
  • Customized tabs for different business user groups and centers.
  • Created workflow rules, tasks, email alerts, and components to suit the needs of the application.
  • Scheduled the reports and dashboards for management and all the department heads by email.
  • Conducted all data migration using the salesforce.com import tool. Migrated data from MS Excel/CSV files to SFDC using Apex Data Loader.
  • Designed and developed Apex programs and Apex triggers for various functional needs in the application.
  • Designed various web pages in Visualforce for functional needs within Salesforce.
  • Involved in unit testing and test coverage for triggers.

Java Developer

Confidential - Chicago

Responsibilities:

  • Involved in gathering and analyzing system requirements.
  • Designed the application using the Front Controller, Service Controller, MVC, Factory, Data Access Object, and Service Locator patterns.
  • Developed the web application using the Struts framework.
  • Developed the entire application based on the Struts framework and configured struts-config.xml and web.xml (a minimal action sketch follows this list).
  • Created Tiles definitions, struts-config files, and resource bundles using the Struts framework.
  • Implemented the validation framework, creating validation.xml and using validation-rules.xml.
  • Developed classes in Eclipse for Java, using the class specifications provided in Rational Rose.
  • Designed, developed, and deployed the necessary stored procedures, functions, and views in Oracle using TOAD.
  • Developed JUnit test cases.
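
A minimal sketch of a Struts 1 action class of the kind wired up in struts-config.xml; the class name, request parameter, and forward names are hypothetical:

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    // Hypothetical action; the "success"/"failure" forwards would be
    // defined as <forward> entries in struts-config.xml.
    public class LoginAction extends Action {
        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request,
                                     HttpServletResponse response) throws Exception {
            boolean authenticated = request.getParameter("user") != null; // placeholder check
            return mapping.findForward(authenticated ? "success" : "failure");
        }
    }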

Environment: UNIX shell scripting, Core Java, Struts, Eclipse, J2EE, JBoss application server, Oracle, JSP, JavaScript, JDBC, Servlets, Unified Modeling Language, TOAD, JUnit.
