Hadoop Consultant Resume

Ohio

SUMMARY

  • 7 years of experience in the IT industry, including almost 2 years of experience with Hadoop: HDFS, MapReduce, and Hadoop ecosystem components such as Pig, Sqoop, Hive, and Avro.
  • Experience in designing Big Data solutions by selecting the Hadoop ecosystem component best suited to the given business requirement.
  • Good experience in writing Java MapReduce programs, Pig Latin scripts, Hive queries, and Sqoop jobs for various business use cases.
  • Good knowledge of Avro schemas, Omniture data, and the ingestion, transformation, and publication process in Hadoop.
  • Well recognized in the organization for analytical and technical skills.
  • Good team player; flexible to learn new tools and software as required.
  • Involved in all phases of development across assignments, including requirements analysis, detailed design, coding, testing, implementation, and troubleshooting.
  • Good experience in Java development; worked extensively with core Java, the Swing framework, and Struts.
  • Strong experience in RDBMS technologies: SQL, stored procedures, triggers, and functions.
  • Worked on development of Java Axis web services with XML, WSDL, and SOAP.

TECHNICAL SKILLS

Hadoop ecosystem: HDFS, Java MapReduce, Pig, Hive, Sqoop, Avro, basics of HBase, Flume

Hadoop Distribution: Hortonworks

Programming Languages: Java, PL/SQL, web services, JavaScript, XML

Databases: Oracle 9i/10g

Operating Systems: Linux, Unix, Windows XP/7/8

PROFESSIONAL EXPERIENCE

Hadoop Consultant

Confidential, Ohio

Responsibilities:

  • Working in an Agile environment, successfully completed stories related to ingestion, transformation, and publication of data on time.
  • Worked on MapReduce code for Omniture (a hit-data capture tool) and other business scenarios; used Maven to build all the jars.
  • Ingested data sets from different databases and servers using the Sqoop import tool and the MFT (Managed File Transfer) inbound process.
  • Wrote Pig Latin scripts and Hive queries using Avro schemas to transform data sets in HDFS.
  • Created Datameer links, or used the Sqoop export tool and the MFT outbound process, to deliver the transformed data to clients.
  • Used the XPath SerDe to process XML data files.
  • Wrote a custom RecordReader for a MapReduce program.
  • Used Maven to build the jars for MapReduce, Pig, and Hive UDFs.
  • Created SVN usage guidelines for the team to maintain the code repository across trunk, branches, and tags.
  • Created a migration process to productionize developed code, and standardized the Build-to-Run document used to hand the code over to the run team for production support.
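The Omniture mapper work above came down to pulling named fields out of tab-separated hit records. A minimal, self-contained sketch of that per-record parsing (shown outside the MapReduce harness; the column positions and field names are illustrative, not Omniture's actual layout):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the per-record parsing a mapper for Omniture hit data might do.
// Omniture exports are tab-separated; column order here is an assumed example.
public class HitParser {
    // Extracts (visitorId, pageUrl) from one tab-separated hit record.
    public static Map<String, String> parseHit(String line) {
        String[] cols = line.split("\t", -1); // -1 keeps trailing empty columns
        Map<String, String> out = new HashMap<>();
        out.put("visitorId", cols[0]);
        out.put("pageUrl", cols[1]);
        return out;
    }
}
```

In a real job this logic would sit inside `Mapper.map()`, emitting the extracted fields as key/value pairs.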

Confidential

Hadoop Consultant

Responsibilities:

  • Gathered the business requirements from the Business Partners and Subject Matter Experts.
  • Involved in loading the HDFS with log files, structured and unstructured data.
  • Wrote MapReduce utilities in Java, such as a Sampler (creates a sample of a given data set) and a Substituter (replaces a particular string with a given string throughout a data set).
  • Scheduled Sqoop, Pig, and Hive jobs using Oozie.
  • Imported data to HDFS using Sqoop on a regular basis.
  • Wrote Pig Latin scripts, created Hive tables, and wrote Hive queries for data analysis.
  • Actively participated in code review sessions and documented the design of all the programs developed, for future reference.
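The Substituter utility mentioned above reduces to a simple per-record replacement. A minimal sketch of that core logic (the real utility would apply this inside a MapReduce mapper; names are illustrative):

```java
// Core logic of a "Substituter"-style utility: replaces every occurrence of a
// particular string with a given string in one record of a data set.
public class Substituter {
    public static String substitute(String record, String target, String replacement) {
        // String.replace substitutes all literal occurrences, not just the first
        return record.replace(target, replacement);
    }
}
```

Running this as a map-only job (no reducer) would rewrite each line of the input data set independently.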

Confidential

Developer

Responsibilities:

  • This was a pilot project at AbbVie; management was looking for solid analysis of server usage and potential cost savings, so we gathered the requirements from Subject Matter Experts.
  • Involved in installing Hadoop Ecosystem components.
  • Involved in loading the HDFS with log files.
  • Wrote MapReduce programs in Java to convert the log file information to a structured format.
  • Installed and configured Pig, and wrote Pig Latin scripts for ad hoc requirements.
  • Installed and configured Hive; created Hive tables and wrote Hive queries for data analysis on a regular basis.
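Converting log files to a structured format, as described above, typically means matching each line against a pattern and re-emitting the captured fields as delimited columns. A self-contained sketch of that mapper-side logic (the `[timestamp] LEVEL message` layout is an assumed example, not the project's actual log format):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of turning one unstructured log line into a structured,
// tab-separated record, as a MapReduce mapper might do per input line.
public class LogStructurer {
    // Assumed layout: "[timestamp] LEVEL message"
    private static final Pattern LOG = Pattern.compile("^\\[(.+?)\\]\\s+(\\w+)\\s+(.*)$");

    public static String toTsv(String line) {
        Matcher m = LOG.matcher(line);
        if (!m.matches()) return null; // unparseable lines would be counted and skipped
        return m.group(1) + "\t" + m.group(2) + "\t" + m.group(3);
    }
}
```

The resulting tab-separated output can then be queried directly from an external Hive table.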

Confidential

Developer

Responsibilities:

  • Involved in Requirement Analysis, estimates, design, coding and unit testing.
  • Handled 3 major modules in the application while guiding 2 junior associates.
  • Used Java Swing, EJB, and core Java.
  • Wrote SQL queries, stored procedures, triggers, and functions.
  • Gave demos to the customer.
  • Delivered to the client on time.
  • Involved in production deployments.
  • Involved in project java code reviews, Quality audits.

Confidential

Developer

Responsibilities:

  • Involved in the development of replica of the actual project using Swing.
  • Extensively used Core Java concepts.
  • Extensively used the layout managers like BorderLayout, GridBagLayout.
  • Wrote stored procedures for storing the data in the database.
  • Involved in the testing and bug fixing.
  • Wrote Apache Axis-based web services, including a Java implementation of the SOAP server with a keystore-certificate authentication mechanism.
  • Created and modified keystore certificates using the Java keytool.
  • Created a presentation of the actual flow using PowerPoint.
  • Used WebSphere Application Server to deploy these applications.

Confidential

Application Developer J2EE

Responsibilities:

  • Developed JavaScript behavior code for user interaction.
  • Created database program in SQL server to manipulate data accumulated by internet transactions.
  • Wrote Servlet classes to generate dynamic HTML pages.
  • Developed an API to write XML documents from a database.
  • Maintenance of a Java GUI application using Swing.
  • Created complex SQL queries and used JDBC connectivity to access the database.
  • Part of the team that designed, customized and implemented metadata search and database synchronization.
  • Used Oracle as the database and Toad for query execution; also involved in writing SQL scripts and PL/SQL code for procedures and functions.
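The API for writing XML documents from a database, mentioned above, can be sketched with the JDK's built-in DOM and Transformer classes. Here the rows arrive as string arrays (in the real code they would come from a JDBC ResultSet), and the `rows`/`row` element names are illustrative:

```java
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Sketch of an API that serializes tabular data to XML using only JDK classes.
public class XmlWriter {
    public static String rowsToXml(String[][] rows, String[] columns) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = doc.createElement("rows");
            doc.appendChild(root);
            for (String[] row : rows) {
                Element rowEl = doc.createElement("row");
                for (int i = 0; i < columns.length; i++) {
                    Element col = doc.createElement(columns[i]);
                    col.setTextContent(row[i]);
                    rowEl.appendChild(col);
                }
                root.appendChild(rowEl);
            }
            StringWriter out = new StringWriter();
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
            t.transform(new DOMSource(doc), new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Wrapping the checked JAXP exceptions keeps the caller's API simple; a production version would expose a more specific exception type.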
