
Hadoop Developer/Data Analyst Resume


Charlotte, NC

SUMMARY

  • 10 years of total IT experience spanning Java/J2EE and Hadoop big data development.
  • 3 years of big data experience with Apache Hadoop, Apache Pig, MapReduce, Hive, ZooKeeper, Impala, Oozie, and Sqoop.
  • 7 years of experience as a Java/J2EE developer.
  • Experience working with 2+ petabytes of structured and unstructured data.
  • Worked on billions of records of aggregated data and transferred them to HDFS using Flume.
  • Worked on Hadoop, HDFS, HBase, Data Warehousing, RDBMS & NoSQL.
  • Expertise in dealing with various file formats including text, LZO, SequenceFile, JSON, and Avro.
  • Worked on Data Warehousing, ETL, BI, Data Marts, and Big Data Analytics.
  • Experience in working with multiple clusters and nodes (including NameNodes & DataNodes).
  • Experience in data analytics on big data using Hive and Pig.
  • Expertise in writing and testing MapReduce programs to structure the data.
  • Implemented Mappers and Reducers and configured JobTracker/TaskTracker-driven workflows.
  • Extensive experience writing and implementing Hive/Pig scripts and UDFs (a minimal UDF sketch follows this list).
  • 7 years of experience as a Java/J2EE Developer in software analysis, design, development, and implementation of various business applications that use Java/J2EE technologies.
  • Expertise in design and development of various web applications with N-Tier architecture using MVC and J2EE architecture techniques.
  • Good experience with a complete life cycle development including migration issues.
  • Extensive experience in developing web-based applications using J2EE components like Web Services, Spring, Hibernate, iBatis, JAXB, JSF, Struts, EJB, RMI, JNDI, JMS, JPA, JTA, Tiles, JSP, JSTL, Servlets, and Swing.
  • Experience in developing and deploying applications on servers such as Apache Tomcat, Oracle WebLogic, IBM WebSphere, and JBoss; in using tools such as TOAD and SQL Developer for database development and interaction; and in using IDEs such as Eclipse, NetBeans, WSAD, and JBuilder.
  • Working knowledge of XML-related technologies like XML, DTD, XSD, XQuery, XSLT, XPath, and XML parsing using SAX and DOM.
  • Excellent problem identification skills using JUnit, Log4j, and Ant.
  • Expertise in DB2, Oracle, and Sybase database servers, and in SQL and PL/SQL.
  • Extensive scripting experience with UNIX shell and Perl.
  • Experience in configuration management tools like ClearCase, CVS, SVN, PVCS, and MS VSS.
  • Expertise in web-based GUI development using ICEfaces, JSF, JSP, HTML, DHTML, and CSS.
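
A minimal sketch of the kind of Hive UDF mentioned above, written against the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name, behavior, and registration statement are illustrative assumptions rather than code from any engagement below.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF that normalizes a string column; after packaging into a
// jar it would be registered in Hive with something like:
//   CREATE TEMPORARY FUNCTION normalize AS 'NormalizeUDF';
public class NormalizeUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // Hive passes NULL columns as null
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```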

TECHNICAL SKILLS

Big Data: Hadoop, HDFS, MapReduce, Apache Pig, Sqoop, Hive, Flume, ZooKeeper, Clusters, Oozie, Ambari, Big Data Analytics, MPP, Cloudera CDH4, Hortonworks, HBase.

Application Development Tools: TOAD 9.7/9.0, SQL Developer, SQL*Plus, Eclipse Kepler IDE

Languages: Java, SQL, PL/SQL, C, C++

Data Modeling Tools: ERWIN, ETL, Visio, Microsoft Office Suite.

Relational Databases: Oracle 11g/10g, SQL Server 2008, MySQL 5.6.2

Operating Systems: Windows Vista/XP/2003/2000/NT 4.0/9x, MS-DOS

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Hadoop Developer/Data Analyst

Responsibilities:

  • Joined data from different files by creating MapReduce programs that generate Avro files against a hand-written Avro schema.
  • Developed MapReduce programs to process the Avro files and compute results from the data, including map-side joins and other operations (a join sketch follows this job entry).
  • Supported the team in validating the results of MapReduce code.
  • Created an Oozie workflow to transfer data from Netezza databases to HDFS using the Sqoop action.
  • Transferred results from Hive to PostgreSQL databases using Sqoop, allowing downstream consumers to work with consistent data.
  • Developed an Oozie workflow to run MapReduce jobs in parallel using fork and join nodes.
  • Coordinated with Hadoop Administrator in implementing Hadoop security using Kerberos authentication.
  • Stored the data in tabular format using Hive tables.
  • Performed big data analysis using Hive, Pig, and user-defined functions (UDFs).
  • Created Hive tables to facilitate joins and used Pig to store the results.
  • Wrote JUnit test cases to test and debug MapReduce programs on a local machine.
  • Scheduled the Oozie workflow engine to run multiple MapReduce, Hive, and Pig jobs, which run independently based on data availability.
  • Created internal and external Hive tables as required, defined with appropriate static and dynamic partitions for efficiency.
  • Monitored the Hadoop scripts that take input from HDFS and load the data into Hive.

Environment: Hadoop, Hive, Pig, Sqoop, Flume, Oozie, Yarn, Hue, ZooKeeper, UNIX, Shell Scripting, XML, XSLT, HDFS, MapReduce, HBase, AWS.
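
A minimal sketch of the map-side join pattern referenced above: the small side of the join is loaded into memory in setup() from a file shipped with the job (for example via the distributed cache), and each record of the large input is matched in map(). The file name, delimiters, and key layout are assumptions.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MapSideJoinMapper extends Mapper<LongWritable, Text, Text, Text> {

    private final Map<String, String> lookup = new HashMap<String, String>();

    @Override
    protected void setup(Context context) throws IOException {
        // Load the small side of the join; "lookup.txt" is a placeholder for
        // a file localized into the task's working directory by the job.
        BufferedReader reader = new BufferedReader(new FileReader("lookup.txt"));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split("\t", 2);
                if (parts.length == 2) {
                    lookup.put(parts[0], parts[1]);
                }
            }
        } finally {
            reader.close();
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Join each large-side record against the in-memory table; records
        // with no match are simply dropped (an inner join).
        String[] fields = value.toString().split("\t", 2);
        String matched = lookup.get(fields[0]);
        if (fields.length == 2 && matched != null) {
            context.write(new Text(fields[0]), new Text(fields[1] + "\t" + matched));
        }
    }
}
```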

Confidential, Houston, TX

Hadoop Developer/Data Analyst

Responsibilities:

  • Joined data from logs by creating MapReduce programs to analyze which properties are most talked about, who the competitors for these leases are, and what the lease discussions cover.
  • Created Hive tables to facilitate joins and used Pig to store the results.
  • Transferred results from Hive to Oracle using Sqoop, allowing downstream consumers to work with consistent data.
  • Transferred data from databases to HDFS using Sqoop.
  • Used Flume to stream through the log data from various sources.
  • Stored the data in tabular format using Hive tables.
  • Wrote data ingesters and MapReduce programs.
  • Involved in moving all log files generated from various sources to HDFS for further processing.
  • Processed the HDFS data creating and using the Apache Pig scripts.
  • Monitored the Hadoop scripts that take input from HDFS and load the data into Hive.
  • Retrieved the booking data from the data loaded in HDFS using Sqoop.
  • Used Hive SerDes to analyze JSON data.
  • Developed MapReduce programs to clean and parse the data in HDFS obtained from various data sources (a parsing sketch follows this job entry).
  • Created internal and external Hive tables as required, defined with appropriate static and dynamic partitions for efficiency.
  • Performed big data analysis using Hive, Pig, and user-defined functions (UDFs).
  • Coordinated with Hadoop Administrator in implementing Hadoop security using Kerberos authentication.
  • Performed Map side joins and other operations using MapReduce.

Environment: Hadoop, Hive, Pig, Sqoop, Flume, Oozie, Yarn, Hue, ZooKeeper, UNIX, Shell Scripting, XML, XSLT, HDFS, MapReduce, HBase.
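
A minimal sketch of the kind of log-parsing MapReduce program described above, counting how often each property is mentioned; the log convention (tokens like property=12345) and class names are assumptions, and a standard summing reducer would total the counts.

```java
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class PropertyMentionMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    // Assumed log convention: lines carry tokens like "property=12345".
    private static final Pattern PROPERTY = Pattern.compile("property=(\\d+)");
    private static final IntWritable ONE = new IntWritable(1);
    private final Text propertyId = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        Matcher matcher = PROPERTY.matcher(value.toString());
        while (matcher.find()) {
            propertyId.set(matcher.group(1));
            context.write(propertyId, ONE); // malformed lines emit nothing
        }
    }
}
```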

Confidential, McLean, VA

Hadoop Developer

Responsibilities:

  • Involved in the implementation of Hadoop Cluster and Hive for Development and Test Environment.
  • Used Sqoop to load existing data warehouse data from an Oracle database into HDFS.
  • Developed MapReduce programs in Java for searching the production logs and web analytics logs, for use cases such as application issues and measuring page download performance.
  • Provided SQL-like access to Hadoop data by loading the data into Hive tables from HDFS.
  • Used Hive/HQL queries to provide ad hoc reports on data in Hive tables in HDFS (a JDBC query sketch follows this job entry).
  • Involved in admin-related issues of HBase and other NoSQL databases.
  • Applied core Java concepts when writing MapReduce programs.
  • Good knowledge of Tableau and the Tidal Enterprise Scheduler.
  • Experience with Apache tarball installations.
  • Experience in creating Linux users and groups.
  • Involved and actively interacted with cross-functional teams like Web Team, Unix and DBA teams for successful Hadoop implementation.
  • Extensively involved in the User Training of Hadoop system for cross-functional teams.
  • Assigned work load to the offshore team members and made them understand the requirements to ensure that the work is delivered as per schedules to clients.

Environment: Crystal Reports XI R2, SQL Server 2005 and 2008, Visual Studio 2005 and 2008, SSIS, SSRS (ad hoc reporting).
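
A minimal sketch of running an ad hoc HQL query over JDBC, as in the reporting bullet above; the HiveServer2 host, port, credentials, and table/column names are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveAdhocReport {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver; host, port, and database are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-host:10000/default", "user", "");
        Statement stmt = conn.createStatement();
        // Hypothetical table tracking page download times.
        ResultSet rs = stmt.executeQuery(
                "SELECT page, AVG(load_ms) FROM web_analytics GROUP BY page");
        while (rs.next()) {
            System.out.println(rs.getString(1) + "\t" + rs.getDouble(2));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}
```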

Confidential - Minneapolis, MN

Java/J2EE Developer

Responsibilities:

  • Used Object Oriented Design (OOD).
  • Developed the application using J2EE Design Patterns like Delegate, Singleton, Service Locator, DAO, Composite entity, Service activator and command pattern.
  • Used Struts MVC at the presentation layer.
  • Integrated the Spring and Ext JS layers.
  • Implemented Transaction management using Aspect Oriented Programming (AOP), Advice and Advisor APIs.
  • Developed front-end content using JSP, Servlets, DHTML, JavaScript, CSS, HTML and JSTL.
  • Used Hibernate in the persistence layer to implement DAOs for database access (a DAO sketch follows this job entry).
  • Involved in writing HQL queries to implement persistence through Hibernate.
  • Created a data source for interaction with the database.
  • Developed SQL stored procedures and prepared statements for updating and accessing data in the database.
  • Implemented internationalization (i18n) in web applications with Struts.
  • Involved in integration of layers (UI, Business & DB access layers).
  • Involved in developing technical specification documents and UML diagrams using UML and Rational Rose.
  • Implemented the entire financial functionality in the O2 application.
  • Communicated with the Capacity Analysis system using JMS messaging through MQSeries.
  • Developed, implemented, and maintained an asynchronous, AJAX based rich client for improved customer experience.
  • Monitored the error logs using Log4J and fixed the problems.
  • Used Struts Validation Framework to validate the UI components.
  • Developed the Action classes and Form Beans.
  • Used Agile Methodology.
  • Managed the Struts Configuration file for the Navigation.
  • Developed XML converter classes based on JDOM, XPath, and XML technologies to obtain and persist data.
  • Associated with the full life cycle of the web-based application implementation.

Environment: Spring, Struts 2.0, JPA (Hibernate), Servlets, JSP, DHTML, JavaScript, UML, jQuery, Web Services, HTML, CSS, JSTL, WebSphere App Server, NetBeans 5.5, Java 1.5, J2EE, Ext JS (Ajax component library), MQ, ClearCase, SQL, JUnit, Ant, Ajax, Log4J, Oracle 9i.
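
A minimal sketch of the Hibernate DAO/HQL pattern described above; the Account entity, its fields, and the query are illustrative assumptions (mapping metadata omitted).

```java
import java.util.List;

import org.hibernate.Query;
import org.hibernate.Session;
import org.hibernate.SessionFactory;

// Minimal stand-in for whatever mapped entity the application actually used.
class Account {
    private Long id;
    private String owner;
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getOwner() { return owner; }
    public void setOwner(String owner) { this.owner = owner; }
}

public class AccountDao {

    private final SessionFactory sessionFactory;

    public AccountDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    @SuppressWarnings("unchecked")
    public List<Account> findByOwner(String owner) {
        // getCurrentSession() assumes the caller opened a transaction, e.g.
        // via the AOP-driven transaction management noted above.
        Session session = sessionFactory.getCurrentSession();
        Query query = session.createQuery(
                "from Account a where a.owner = :owner");
        query.setParameter("owner", owner);
        return query.list();
    }
}
```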

Confidential, Miami, FL

Java/J2EE Developer

Responsibilities:

  • Involved in the system analysis and design based on the technical specifications.
  • Took responsibility for implementing the Excel file uploads and downloads.
  • Involved in developing the presentation layer using Struts, HTML/JS, AJAX, and Dojo, as well as GWT Window Manager.
  • Developed the Struts components, such as Actions and ActionForms, required for the application, and was responsible for code review (an upload Action sketch follows this job entry).
  • Developed, tested, and deployed successfully on Sun Solaris 9 UNIX servers.
  • Prepared test cases, unit test plans, and functional and technical documents.
  • Performed the test cases in the development environment and fixed bugs.
  • Involved in managing and creating relevant tables in the database.
  • Implemented logging using Log4j for debugging and testing purposes.
  • Also worked with the JVM logs.
  • Developed ANT build scripts to compile, package and deploy the application.
  • Supported unit and functional testing of modules using JUnit, and fixed bugs.
  • Coordinated with Quality control team to fix the system and performance issues identified.
  • Supported QA team to perform functional and system testing in SIT and UAT environments.
  • Responsible for maintaining the code coverage factor throughout the development phase.
  • Took ownership of aligning application code to PQM standards.
  • Participated actively as a Defect Prevention team member.
  • Involved in conducting the technical trainings on the project to QA.
  • Responsible for demonstrating the tool to the users and preparing the user documentation.

Environment: Java, J2EE, WebSphere MQ, ClearCase, SQL, JUnit, Ant, Ajax, Log4J, Oracle 9i.
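
A minimal sketch of the Struts Action/ActionForm pair behind an Excel upload, as in the bullets above; the class names, form property, and forward name are assumptions, and the struts-config.xml wiring is omitted.

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;
import org.apache.struts.upload.FormFile;

// Hypothetical form bean backing an Excel upload page.
class UploadForm extends ActionForm {
    private FormFile file;
    public FormFile getFile() { return file; }
    public void setFile(FormFile file) { this.file = file; }
}

// Hypothetical action that receives the uploaded spreadsheet; the forward
// name "success" would be defined in struts-config.xml.
public class UploadAction extends Action {
    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
            HttpServletRequest request, HttpServletResponse response)
            throws Exception {
        UploadForm uploadForm = (UploadForm) form;
        byte[] data = uploadForm.getFile().getFileData(); // raw spreadsheet bytes
        // ... parse and persist the rows here ...
        return mapping.findForward("success");
    }
}
```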

Confidential

Software Engineer

Responsibilities:

  • Developed the design solutions by gathering the requirements.
  • Created sequence diagrams, collaboration diagrams, class diagrams, use cases and activity diagrams using Rational Rose for the Configuration, Cache & logging Services.
  • Designed and developed the project using MVC design pattern.
  • Developed the front end using the Struts framework and business components using EJBs.
  • Used JSP, JavaScript, JSTL, EL, custom tag libraries, Tiles, and validations provided by the Struts framework.
  • Implemented a Struts/Tiles-based framework to present the data to the user.
  • Created the web UI using Struts, JSP, Servlets and Custom tags.
  • Deployed the application in the JBoss server environment using the ANT tool, following a Struts architecture with a JSP client.
  • Configured Struts, DynaActionForm, MessageResources, ActionMessages, ActionErrors, validation.xml, and validator-rules.xml.
  • Used the Singleton pattern and Log4j for designing and developing the caching and logging services (a sketch follows this job entry).
  • Used XML to configure the different Struts action classes, and maintained deployment descriptors such as struts-config.xml, ejb-jar.xml, and web.xml.
  • Developed and deployed Session Beans and Entity Beans for database updates.
  • Used shell, Perl, and ANT scripts for builds and deployments.

Environment: Java, JSP, Servlets, Struts, XML, EJB 2.0, Web Services, SOAP, JNDI, J2EE, Eclipse, JBoss, CVS, Oracle 8i, JUnit, ANT, JavaScript, DHTML.
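
A minimal sketch of the Singleton-plus-Log4j caching/logging service mentioned above; the class name and cache policy (an unbounded in-memory map) are assumptions.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.log4j.Logger;

// Hypothetical singleton caching service that logs its activity via Log4j.
public final class CacheService {

    private static final CacheService INSTANCE = new CacheService();
    private static final Logger LOG = Logger.getLogger(CacheService.class);

    private final Map<String, Object> cache = new ConcurrentHashMap<String, Object>();

    private CacheService() {
        // private constructor enforces the Singleton pattern
    }

    public static CacheService getInstance() {
        return INSTANCE;
    }

    public void put(String key, Object value) {
        LOG.debug("Caching entry for key " + key);
        cache.put(key, value);
    }

    public Object get(String key) {
        Object value = cache.get(key);
        if (value == null) {
            LOG.debug("Cache miss for key " + key);
        }
        return value;
    }
}
```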
