
Hadoop Consultant Resume

San Antonio, TX

SUMMARY

  • Overall 7+ years of experience in the Software Development Lifecycle (SDLC), including 2+ years of Hadoop development, covering analysis, design, implementation, testing and deployment of web-based, distributed and enterprise applications with Java/J2EE technologies.
  • Knowledge of Hadoop clusters on the Cloudera and Hortonworks distributions.
  • Understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
  • Knowledge of installation, configuration, data migration and upgrades across Hadoop MapReduce, Hive, HDFS, HBase, Sqoop, Oozie, Pig, Cloudera, YARN, ZooKeeper and Flume.
  • Experienced in writing MapReduce jobs in Java and Pig (a minimal Java sketch follows this list).
  • Well versed in importing and exporting data into and out of HDFS using Sqoop, then processing both schema-oriented and schema-less data using Pig.
  • Experienced with the Oozie Workflow Engine, developing and running workflow jobs with actions that execute Hadoop MapReduce and Pig jobs.
  • Familiar with Spark features such as RDD transformations and Spark SQL.
  • Experienced in application design using Sequence diagrams, Use Case diagrams, Entity Relationship Diagrams (ERD) and Data Flow Diagrams (DFD).
  • Expertise in the design and development of web and enterprise applications using technologies such as JSP, Servlets, Struts, Hibernate, Spring, JDBC, JSF, XML, AJAX, SOAP and Web Services.
  • Proficient with Java IDEs such as Eclipse, JDeveloper and NetBeans.
  • Experience using web/application servers such as Apache Tomcat, WebLogic and WebSphere.
  • Experience working with databases such as Oracle and MySQL, with exposure to Hibernate and JDBC for mapping an object-oriented domain model to a traditional relational database.
  • Experience with web languages such as HTML, CSS and XML, and knowledge of Web Services and SOAP.
  • Experience in supporting production tasks, data enhancements and code fixes.
  • Familiar with the SCRUM methodology, Test Driven Development, Pair Programming and Continuous Integration.
  • Documented design procedures, operating instructions, test procedures and troubleshooting procedures.
  • Self-motivated, collaborative team player, adaptable to any environment, with strong business acumen, creative problem-solving skills, technical competency and leadership, along with excellent interpersonal and communication skills and good experience interacting with clients.
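
A minimal sketch of a Java MapReduce job of the kind referenced above: a word count over text files in HDFS, written against the org.apache.hadoop.mapreduce API. The class name and input/output paths are illustrative only, not taken from any project below.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {
      // Mapper: emits (word, 1) for every whitespace-separated token in a line.
      public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
            throws IOException, InterruptedException {
          for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
              word.set(token);
              ctx.write(word, ONE);
            }
          }
        }
      }

      // Reducer (also usable as a combiner): sums the counts for each word.
      public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable v : values) sum += v.get();
          ctx.write(key, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // must not already exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }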

TECHNICAL SKILLS

Big Data: Hadoop, MapReduce, HDFS, YARN, Hive, Pig, Sqoop, Spark, Zookeeper and HBase

Languages: Java, SQL, HTML, CSS, JavaScript, XML, C/C++.

Java/J2EE Technologies: JSP, Servlets, JavaBeans, JDBC, JNDI, JTA, JPA, EJB 3.0

Design Patterns: MVC, Data Access Object, Data Transfer Object / Value Object, Business Delegate

Web Design Tools: HTML, CSS, AJAX and JavaScript

Frameworks: Struts 1.1/2.0, Spring 2.5, Hibernate 3.0

Servers: WebLogic Server 10.3, Tomcat 5.5/6.0

IDEs: NetBeans, Eclipse, JDeveloper, SQL Developer.

Databases: Oracle 9i/10g/11g, MS-SQL Server.

Operating systems: Windows XP/NT, Linux, UNIX, DOS

PROFESSIONAL EXPERIENCE

Confidential, San Antonio, TX

Hadoop Consultant

Responsibilities:

  • Worked on importing and exporting data between Oracle, DB2 and MySQL databases and HDFS/Hive using Sqoop.
  • Developed a data pipeline using Flume, Sqoop, Pig and Java MapReduce to ingest customer behavioral data and financial histories into HDFS for analysis.
  • Developed Pig Latin scripts to extract data from web server log files.
  • Created Hive external tables, loaded data into them and queried the data using HQL (a minimal Hive JDBC sketch follows this list).
  • Collected log data from web servers and integrated it into HDFS using Flume.
  • Developed Oozie workflows to run multiple Hive scripts.
  • Worked on Hive to expose data for further analysis, transforming files and storing them in different file formats.
  • Worked on Hive partitioned tables.
  • Developed Pig scripts using various transformations.
  • Created data definitions for new database files/tables and for changes to existing ones, as needed for analysis.
  • Developed Pig Latin scripts to read data from flat files and load it into HDFS.
  • Worked on tuning the performance of Hive and Pig queries.
  • Utilized Agile Scrum Methodology to help manage and organize a team of developers with regular code review sessions.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
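
As a concrete illustration of the Hive work above, here is a minimal Java sketch that creates an external table over files in HDFS and queries it through the HiveServer2 JDBC interface. The host, table name and HDFS path are placeholders, and the hive-jdbc driver is assumed to be on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveExternalTableDemo {
      public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");  // hive-jdbc jar on classpath
        // HiveServer2 endpoint; host, port, table and path are hypothetical.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hiveserver:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {
          // An external table leaves the files in place on HDFS;
          // dropping the table later does not delete the underlying data.
          stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS web_logs ("
              + " ip STRING, ts STRING, url STRING, status INT)"
              + " ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'"
              + " LOCATION '/data/raw/web_logs'");
          try (ResultSet rs = stmt.executeQuery(
                   "SELECT status, COUNT(*) FROM web_logs GROUP BY status")) {
            while (rs.next()) {
              System.out.println(rs.getInt(1) + "\t" + rs.getLong(2));
            }
          }
        }
      }
    }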

Environment: Hadoop, HDFS, Hive, Pig, Sqoop, MapReduce, Oozie, Cloudera, Oracle 11g and Red Hat Linux.

Confidential, Charlotte, NC

Hadoop/Big Data Developer

Responsibilities:

  • Responsible for loading unstructured data into the Hadoop Distributed File System (HDFS).
  • Loaded data into the cluster from dynamically generated files using Flume.
  • Loaded data into HDFS from relational database management systems using Sqoop.
  • Implemented complex business logic using Pig Latin scripts.
  • Designed Hive partitioned and bucketed tables.
  • Worked with the Avro file format (a minimal Java sketch follows this list).
  • Developed applications using Hive queries and MapReduce jobs.
  • Collected data from Teradata and pushed it into Hadoop using Sqoop.
  • Pulled data from an FTP server and loaded it into HDFS using Oozie workflows.
  • Maintained documentation for corporate Data Dictionary with attributes, table names and constraints.
  • Worked extensively with SQL scripts to validate data before and after loads.
  • Responsible for post-production support and served as SME for the project.
  • Involved in the System and User Acceptance Testing.
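
A minimal Java sketch of writing an Avro data file, illustrating the Avro work above; the record schema and field names are hypothetical.

    import java.io.File;
    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;

    public class AvroWriteDemo {
      public static void main(String[] args) throws Exception {
        // Hypothetical two-field record schema, defined inline for brevity.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Customer\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"long\"},"
            + "{\"name\":\"name\",\"type\":\"string\"}]}");
        GenericRecord rec = new GenericData.Record(schema);
        rec.put("id", 42L);
        rec.put("name", "Alice");
        // The writer embeds the schema in the file header, so the file
        // is self-describing and splittable as MapReduce input.
        try (DataFileWriter<GenericRecord> writer =
                 new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
          writer.create(schema, new File("customers.avro"));
          writer.append(rec);
        }
      }
    }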

Environment: Hadoop, HDFS, Pig, Hive, Sqoop, HBase, Flume, MapReduce, MongoDB, NoSQL, Hortonworks

Confidential, Mountain View, CA

Java/J2EE Developer

Responsibilities:

  • Designed and developed backend Java Components residing on different machines to exchange information and data using JMS.
  • Designed and developed the application user interface using Core Java, Servlets and JSP.
  • Designed and implemented MVC architecture using the Struts framework; coding involved writing Action classes, Forms, custom tag libraries and JSPs.
  • Involved in coding JSPs and Servlets.
  • Designed and developed GUI using JSP, HTML and CSS.
  • Wrote Servlets to fetch and manipulate data from the database.
  • Utilized Servlets to handle requests from the client browser and send responses (a minimal sketch follows this list).
  • Used JavaScript for client-side validations.
  • Used Hibernate for handling database transactions and persisting objects.
  • Deployed the application on WebLogic Application Server.
  • Used JDBC for database connectivity with Oracle.
  • Created stored procedures and triggers using PL/SQL; created tables, views and other database objects in the Oracle database.
  • Involved in debugging and testing of the application. Provided production support to the end users and performance tuning.
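
A minimal sketch of the Servlet-plus-JDBC pattern described above: a doGet handler that looks up a row in Oracle and writes the result to the response. The connection URL, credentials, table and column names are placeholders; a real deployment would typically use a container-managed DataSource rather than DriverManager.

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class CustomerServlet extends HttpServlet {
      @Override
      protected void doGet(HttpServletRequest req, HttpServletResponse resp)
          throws ServletException, IOException {
        String id = req.getParameter("id");
        resp.setContentType("text/plain");
        // URL, credentials and table are placeholders; the Oracle JDBC
        // driver is assumed to be on the server classpath.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@dbhost:1521:orcl", "app", "secret");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT name FROM customers WHERE id = ?")) {
          ps.setString(1, id);
          try (ResultSet rs = ps.executeQuery()) {
            resp.getWriter().println(rs.next() ? rs.getString("name") : "not found");
          }
        } catch (SQLException e) {
          throw new ServletException(e);
        }
      }
    }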

Environment: Java, Servlets, JSP, HTML, Hibernate, WebLogic, Eclipse, JavaScript, XML, CSS, SQL, PL/SQL, Oracle and Windows.

Confidential 

Java Developer

Responsibilities:

  • Developed the application using JDeveloper as the Integrated Development Environment.
  • Communicated project needs to client and internal employer personnel to ensure quality and timely delivery.
  • Performed database operations using JDBC, including creating, updating and inserting data (a minimal sketch follows this list).
  • Developed the user interface using HTML, CSS and JavaScript.
  • Developed the web application using Java/J2EE technologies, including Servlets and JSPs.
  • Created and maintained project documentation.
  • Generated reports using Jasper reporting tool.
  • Wrote SQL queries for creating, inserting and modifying data.
  • Participated in GUI validation using JavaScript and worked on the Struts framework.
  • Developed the application using a framework that follows the MVC architecture.
  • Developed web pages using JSP, HTML, JavaScript and CSS.
  • Used the Apache Axis tool to generate stubs from WSDL files to access web services.
  • Tested the application and verified its correctness.
  • Documented the project as per the specifications.
  • Performed bug fixing and maintenance.
  • Used Microsoft Visual SourceSafe for code check-in; made changes to the code and checked them in.
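
A minimal sketch of the JDBC data-manipulation work mentioned above, using PreparedStatement for an insert and an update grouped in one transaction; the connection details and table are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class OrderUpdateDemo {
      public static void main(String[] args) throws Exception {
        // URL, credentials and table are placeholders.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@dbhost:1521:orcl", "app", "secret")) {
          conn.setAutoCommit(false);  // group both statements in one transaction
          try (PreparedStatement ins = conn.prepareStatement(
                   "INSERT INTO orders (id, status) VALUES (?, ?)");
               PreparedStatement upd = conn.prepareStatement(
                   "UPDATE orders SET status = ? WHERE id = ?")) {
            ins.setLong(1, 1001L);
            ins.setString(2, "NEW");
            ins.executeUpdate();
            upd.setString(1, "SHIPPED");
            upd.setLong(2, 1001L);
            upd.executeUpdate();
            conn.commit();
          } catch (Exception e) {
            conn.rollback();  // undo both statements on any failure
            throw e;
          }
        }
      }
    }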

Environment: JDeveloper, Windows XP, WebLogic portlets, Oracle, JSP, Servlets, Toad, JavaScript, HTML, WebLogic Server, Jasper Reports, Struts, Microsoft Visual SourceSafe.
