
Hadoop Developer/Data Analyst Resume


Tampa, Florida

SUMMARY

  • 11+ years of technical experience in core product design, development and testing, in roles including developer, technology lead, SOA architect and Big Data developer on very large projects.
  • 3 years of experience in the Hadoop ecosystem, with hands-on Big Data experience in Apache Hadoop, Apache Pig, MapReduce, Hive, ZooKeeper, Impala, SOLR, Oozie and Sqoop.
  • Expert in applying the SDLC, with working knowledge of both Waterfall and Agile methodologies.
  • Developed MapReduce programs to parse raw data, populate staging tables and store the refined data in partitioned tables in the EDW (a sketch of such a job follows this list).
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Experience working with both structured and unstructured data.
  • Worked on billions of records of aggregated data, transferring them to HDFS using Flume.
  • Expertise in dealing with various file formats including text, LZO, SequenceFile, JSON and Avro.
  • Extensive experience writing and implementing Pig and Hive scripts, UDFs and UDAFs.
  • Implemented Hadoop-based data warehouses, integrated Hadoop with Enterprise Data Warehouse systems and built real-time Big Data solutions.
  • Taught courses and seminars on Big Data.
  • Good object-oriented design and development skills. 4 years of experience as a Java/J2EE developer in software analysis, design, development and implementation of business applications that use Java/J2EE technologies.
  • Extensive experience developing Java-based applications using J2SE and Spring Batch, and J2EE components such as Web Services, Spring, Hibernate, Struts, JSP, JSTL, Servlets and Swing.
  • Experience developing and deploying applications on servers such as Apache Tomcat, Oracle WebLogic, IBM WebSphere and JBoss; using tools such as TOAD and SQL Developer for database development and interaction; and using IDEs such as Eclipse, NetBeans, WSAD and JBuilder.
  • Working knowledge of XML-related technologies such as XML, DTD, XSD, XQuery, XSLT, XPath, and XML parsing using SAX and DOM.
  • Excellent problem identification skills using JUnit, Log4j and Ant.
  • Expertise in DB2, Oracle and Sybase database servers, SQL and PL/SQL.
  • Experience in Configuration Management tools like ClearCase, CVS, SVN and MS VSS.
  • Strong interpersonal and organizational skills, with the ability to manage and lead multiple projects to completion, and good analytical and problem-solving skills.
  • Excellent coordination and teamwork with diverse, offshore teams.
  • Exceptional interpersonal, verbal and written communication skills that motivate all members of the team to deliver outstanding bottom-line results.
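
As an illustration of the MapReduce parsing work described above, below is a minimal sketch of a map-only cleansing job. The class name, pipe-delimited input layout and column choices are hypothetical placeholders, not code from the actual project.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Map-only job that parses raw pipe-delimited records and emits cleaned,
    // tab-separated rows suitable for loading into a staging table.
    // The record layout (3+ pipe-delimited fields) is a hypothetical example.
    public class RawRecordParser extends Mapper<Object, Text, NullWritable, Text> {

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|");
            if (fields.length < 3) {
                return; // drop malformed records
            }
            String cleaned = fields[0].trim() + "\t"
                    + fields[1].trim() + "\t" + fields[2].trim();
            context.write(NullWritable.get(), new Text(cleaned));
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "raw record parser");
            job.setJarByClass(RawRecordParser.class);
            job.setMapperClass(RawRecordParser.class);
            job.setNumReduceTasks(0); // map-only: no shuffle or reduce needed
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The refined output of a job like this can then be loaded into partitioned EDW tables, as described above.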

TECHNICAL SKILLS

Big Data: Hadoop, HDFS, MapReduce, Apache Pig, Sqoop, Hive, Flume, ZooKeeper, Oozie, MPP, Cloudera’s CDH4, HBase & MongoDB

Application Development Tools: TOAD 9.7/9.0, SQL Developer, SQL*Plus, Eclipse Kepler IDE

Programming Languages: Java, SQL, PL/SQL, C, Pig, HQL, VBScript, Groovy

Data Modelling Tools: ETL, Visio, Microsoft Office Suite.

Relational Databases: Oracle 11g/10g, SQL Server 2008, MySQL 5.6.2

NoSQL: HBase, MongoDB

Operating Systems: Windows Vista/XP/2003/2000/NT 4.0/9x, MS-DOS

Industries: Financial, Healthcare, Telecommunication

Areas/Applications: Java- and .NET-based desktop and web applications.

PROFESSIONAL EXPERIENCE

Hadoop Developer/Data Analyst

Confidential, Tampa, Florida

Responsibilities:

  • Joined log data by creating MapReduce programs to analyze which properties are most talked about, the competitors for those leases, and what lessees are talking about.
  • Created Hive tables to facilitate joins and used Pig to store the results.
  • Transferred results from Hive to Oracle using Sqoop, thereby allowing downstream consumers to use consistent data.
  • Transferred data from Databases to HDFS using Sqoop.
  • Used Flume to stream log data from various sources.
  • Stored the data in the tabular formats using the Hive tables.
  • Wrote data ingesters and MapReduce programs.
  • Involved in moving all log files generated from various sources to HDFS for further processing.
  • Processed HDFS data by creating and running Apache Pig scripts.
  • Monitored the Hadoop scripts that take input from HDFS and load the data into Hive.
  • Retrieved the booking data from the data loaded in HDFS using Sqoop.
  • Used Hive SerDes to analyze JSON data.
  • Developed MapReduce programs to clean and parse the data in HDFS obtained from various data sources.
  • Created Hive tables per requirements, internal or external, defined with proper static and dynamic partitions for query efficiency.
  • Performed big data analysis using Hive, Pig and user-defined functions (UDFs).
  • Coordinated with the Hadoop administrator in implementing Hadoop security using Kerberos authentication.
  • Performed map-side joins and other operations using MapReduce (see the sketch after this list).
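
The map-side joins mentioned above avoid a reduce-phase shuffle by loading a small reference table into each mapper's memory. Below is a minimal sketch assuming the reference file is shipped via Hadoop's distributed cache; the file, field and class names are hypothetical.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Map-side join: a small reference table is shipped to every mapper via
    // the distributed cache and held in memory, so large fact records can be
    // joined without a reduce-phase shuffle. Assumes the driver registered the
    // file with job.addCacheFile(new URI("/refdata/properties.txt#properties.txt")),
    // which creates a "properties.txt" symlink in the task working directory.
    public class MapSideJoinMapper extends Mapper<Object, Text, Text, Text> {

        private final Map<String, String> propertyLookup = new HashMap<>();

        @Override
        protected void setup(Context context) throws IOException {
            try (BufferedReader reader =
                     new BufferedReader(new FileReader("properties.txt"))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    String[] parts = line.split("\t", 2);
                    if (parts.length == 2) {
                        propertyLookup.put(parts[0], parts[1]); // id -> name
                    }
                }
            }
        }

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            String propertyName = propertyLookup.get(fields[0]);
            if (propertyName != null) { // inner join on the property id
                context.write(new Text(propertyName), value);
            }
        }
    }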

Hadoop Developer

Confidential

Responsibilities:

  • Established 8-node clusters and explored how specific workloads could benefit from the native parallelism of the Hadoop environment.
  • Created a proof of concept showing how these technologies can be integrated into the enterprise's existing architecture.
  • Integrated log data and enterprise data sources such as MySQL, Oracle, Netezza and Teradata with Hadoop.
  • Used Sqoop to load existing data warehouse data from an Oracle database into HDFS (a sketch of such an import follows this list).
  • Piloted new analytical capabilities and use cases to prove business value and inform a long-term roadmap to compete on analytics.
  • Actively interacted with cross-functional teams such as the Web, UNIX and DBA teams for a successful Hadoop implementation.
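
A Sqoop import of the kind described above is typically driven from the command line; the sketch below simply wraps such an invocation in Java. The connection string, credentials path, table name and target directory are placeholders, not the actual environment's values.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    // Wraps a Sqoop 1.x command-line import that copies an Oracle warehouse
    // table into HDFS. Connection string, credentials file, table name and
    // target directory are placeholders.
    public class OracleToHdfsImport {

        public static void main(String[] args) throws Exception {
            ProcessBuilder pb = new ProcessBuilder(
                "sqoop", "import",
                "--connect", "jdbc:oracle:thin:@//dbhost:1521/EDW",
                "--username", "etl_user",
                "--password-file", "/user/etl/.oracle.pwd",
                "--table", "BOOKINGS",
                "--target-dir", "/data/staging/bookings",
                "--num-mappers", "4",
                "--as-textfile");
            pb.redirectErrorStream(true);
            Process proc = pb.start();
            try (BufferedReader out = new BufferedReader(
                    new InputStreamReader(proc.getInputStream()))) {
                String line;
                while ((line = out.readLine()) != null) {
                    System.out.println(line); // surface Sqoop's progress log
                }
            }
            System.exit(proc.waitFor());
        }
    }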

Tech lead

Confidential

Responsibilities:

  • Designed and developed functional/technical specifications, and provided integration and functional test support and deployment for Spring Batch application modules.
  • Participated in the entire Software Development Life Cycle (SDLC) of projects using Object-Oriented Analysis and Design (OOAD).
  • Used PL/SQL extensively to write stored procedures and functions for use with Java (a call sketch follows this list).
  • Wrote technical specifications and designed and developed a major Java batch job for the client using Core Java and Spring.
  • Proactively managed systems development, implementation and risk via the use of the SDLC.
  • Responsible for offshore coordination, technical support and maintenance of the daily batch process; daily feeds are sourced from upstream mainframe systems.
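
Calling PL/SQL stored procedures from Java, as described above, is normally done through JDBC's CallableStatement. Below is a minimal sketch; the procedure name, parameters and connection details are illustrative only.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    // Invokes a PL/SQL stored procedure through JDBC's CallableStatement.
    // The procedure name, parameters and connection details are illustrative;
    // an Oracle JDBC driver must be on the classpath.
    public class BatchSettlementCall {

        public static void main(String[] args) throws Exception {
            String url = "jdbc:oracle:thin:@//dbhost:1521/BATCHDB";
            try (Connection conn =
                     DriverManager.getConnection(url, "batch_user", "secret");
                 CallableStatement call =
                     conn.prepareCall("{call settle_daily_feed(?, ?)}")) {
                call.setString(1, "2015-01-31");              // IN: business date
                call.registerOutParameter(2, Types.INTEGER);  // OUT: rows processed
                call.execute();
                System.out.println("Rows settled: " + call.getInt(2));
            }
        }
    }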

Confidential, Wilmington, Delaware

Responsibilities:

  • Proactively managed systems development, implementation and risk via the use of SDLC.
  • Managed 24 technical resources.
  • Understood project requirements and submitted the project execution plan.
  • Attended meetings and worked with various business groups to determine customer requirements and set timelines.
  • Reviewed and submitted solutions to the client based on high-level gap analysis documentation.
  • Communicated with business and technical users.
  • Performed estimation, planning, monitoring and scheduling.
  • Coached, mentored and led personnel within a technical team environment.
  • Responsible and accountable for the coordinated management of projects directed toward strategic business objectives.

Analyst Programmer

Confidential

Responsibilities:

  • Designed and developed forms using HTML and JSP.
  • Developed server-side code using servlets (a sketch follows this list).
  • Performed client-side validation using JavaScript and unit testing with JUnit.
  • Developed SQL queries to fetch data from the backend system.
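
Server-side form handling of the kind described above typically pairs a servlet with a parameterized JDBC query. Below is a minimal sketch assuming a container-managed DataSource; the JNDI name, table and columns are hypothetical.

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.naming.InitialContext;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.sql.DataSource;

    // Handles a form POST: reads a request parameter, runs a parameterized
    // SQL query through a container-managed connection pool, and writes the
    // result. The JNDI name, table and columns are hypothetical.
    public class CustomerLookupServlet extends HttpServlet {

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String customerId = req.getParameter("customerId");
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            try {
                DataSource ds = (DataSource) new InitialContext()
                        .lookup("java:comp/env/jdbc/AppDS");
                try (Connection conn = ds.getConnection();
                     PreparedStatement ps = conn.prepareStatement(
                         "SELECT name FROM customers WHERE id = ?")) {
                    ps.setString(1, customerId);
                    try (ResultSet rs = ps.executeQuery()) {
                        out.println(rs.next() ? rs.getString("name") : "Not found");
                    }
                }
            } catch (Exception e) {
                throw new ServletException(e);
            }
        }
    }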
