
Sr. Hadoop Developer Resume

Cary, NC

PROFESSIONAL SUMMARY:

  • Hadoop Developer with over 7 years of professional IT experience, including experience with Big Data ecosystem technologies.
  • Excellent knowledge of Big Data infrastructure: distributed file systems (HDFS), parallel processing (the MapReduce framework) and the broader Hadoop ecosystem (Hive, Pig, Sqoop, Oozie and Flume).
  • Experience in installing, configuring, supporting and managing Cloudera’s Hadoop platform, including CDH3 and CDH4 clusters.
  • Hands-on experience writing MapReduce jobs on the Hadoop ecosystem, including Hive and Pig.
  • Experience in importing and exporting data with Sqoop between HDFS and relational/non-relational database systems.
  • Knowledge of job/workflow scheduling and monitoring tools such as Oozie and Zookeeper.
  • Worked on the Apache Hadoop open-source distribution.
  • Experience managing scalable Hadoop clusters, including cluster design, provisioning, custom configuration, monitoring and maintenance, using different Hadoop distributions: Cloudera CDH, Apache Hadoop.
  • Experience in developing ETL processes using the MapReduce framework in Java.
  • Excellent understanding of Apache Crunch for developing data pipelines that ingest data from multiple sources and process it.
  • Expertise in implementing Solr index cron jobs.
  • Experience with NoSQL databases like HBase.
  • Collaborated with business users, product owners and developers to contribute to the analysis of functional requirements.
  • Working knowledge of architecting Hadoop solutions, including hardware recommendations, network topology design, storage configuration, benchmarking, performance tuning, administration and support.
  • Proficient in designing and developing dashboards and reports using Tableau visualizations (dual axis, bar graphs, scatter plots, pie charts, heat maps, bubble charts, tree maps, funnel charts, box plots, waterfall charts, geographic visualizations and others), making use of actions and local and global filters according to end-user requirements.
  • Expertise in designing and creating analytical reports and automated dashboards that help users identify critical KPIs and facilitate strategic planning in the organization.
  • Strong understanding of data warehouse concepts and ETL, with data modeling experience in normalization, business process analysis, reengineering, dimensional data modeling, and physical and logical data modeling.
  • Experience in working with different relational databases like MySQL, MS SQL and Oracle.
  • Experience in database design, entity relationships, database analysis, SQL programming, and PL/SQL stored procedures, packages and triggers in Oracle and SQL Server on Windows and Linux.
  • Detailed knowledge of and experience in designing, developing and testing software solutions using Java and J2EE technologies.
  • Expertise in various phases of software development, including analysis, design, development and deployment of applications using Servlets, JSP, JavaBeans, EJB, JSTL, JMS, Struts, Spring Framework, JSF, JDBC and Hibernate.
  • Expertise in developing and maintaining web applications using the Tomcat web server.
  • Experience with front-end technologies like HTML, CSS and JavaScript.
  • Strong analytical skills with the ability to quickly understand clients’ business needs.
  • Involved in meetings to gather information and requirements from clients; led the team and handled onsite/offshore coordination.
  • Active participant in Hadoop user groups and Strata Big Data conferences.

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Zookeeper, Cloudera

NoSQL Databases: HBase

BI Tools: Tableau, Base SAS, SAS Enterprise Guide, SAS Enterprise Miner, IBM Cognos

Programming Languages: Java, C, C++, Python

Web Technologies: HTML, J2EE, CSS, JavaScript, Servlets, JSP, DOM, XML, XSLT, XPATH

Java Frameworks: Struts, Spring, Hibernate

Databases: MySQL, SQL, Oracle, SQL Server, Microsoft Excel

Software Engineering: UML, Object Oriented Methodologies, Scrum and Agile methodologies

Operating Systems: Linux, Windows 7, Windows 8, Windows XP, Windows Vista

Work Environments: Eclipse, Visual Studio .NET, JUnit, Log4j, Putty

PROFESSIONAL EXPERIENCE:

Confidential, Cary, NC

Sr. Hadoop Developer

Responsibilities:

  • Installed, configured and maintained Apache Hadoop clusters for application development and Hadoop tools like Hive, Pig, HBase, Zookeeper and Sqoop.
  • Installed and configured Hadoop, MapReduce and HDFS (Hadoop Distributed File System); developed multiple MapReduce jobs in Java for data cleaning.
  • Worked on installing cluster, commissioning & decommissioning of DataNodes, NameNode recovery, capacity planning, and slots configuration.
  • Implemented NameNode backup using NFS for High availability.
  • Used Pig as an ETL tool to perform transformations, event joins and some pre-aggregations before storing the data in HDFS.
  • Responsible for developing data pipelines using Flume, Sqoop and Pig to extract data from weblogs and store it in HDFS.
  • Installed Oozie workflow engine to run multiple Hive and Pig Jobs.
  • Used Sqoop to import and export data from HDFS to RDBMS and vice-versa.
  • Created Hive tables and was involved in data loading and writing Hive UDFs (a minimal UDF sketch follows this list).
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Worked on HBase.
  • Automated workflows using shell scripts to pull data from various databases into Hadoop.
  • Responsible for creating a Solr schema from the indexer settings.
  • Wrote Solr queries for various search documents.
  • Deployed Hadoop Cluster in Fully Distributed and Pseudo-distributed modes.
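
For illustration, a minimal sketch of the kind of Hive UDF described above, written in Java against the Hive 0.x-era UDF base class that shipped with CDH3/CDH4; the class name and normalization logic are hypothetical, not the actual production code:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF: normalizes free-text fields so GROUP BY keys compare consistently.
    public class NormalizeText extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            // Trim surrounding whitespace and lower-case the value.
            return new Text(input.toString().trim().toLowerCase());
        }
    }

Packaged into a JAR, a UDF like this is registered in a Hive session with ADD JAR and CREATE TEMPORARY FUNCTION before being called in queries.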

Environment: Hadoop, MapReduce, Hive, HDFS, Pig, Sqoop, Oozie, Solr, Cloudera, Flume, HBase, ZooKeeper, Oracle, NoSQL and Unix/Linux.

Confidential, Kansas City, MO

Hadoop Developer

Responsibilities:

  • Worked on Big Data Hadoop cluster implementation and data integration in developing large-scale system software.
  • Installed and configured MapReduce, Hive and HDFS; implemented a CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
  • Assessed existing EDW technologies and methods to ensure the EDW/BI architecture met the needs of the business and enterprise and allowed for business growth.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Captured data from existing databases that provide SQL interfaces using Sqoop.
  • Worked extensively with Sqoop to import and export data between HDFS and relational database systems/mainframes, loading data into HDFS.
  • Developed and maintained complex outbound notification applications that run on custom architectures, using diverse technologies including Core Java, J2EE, SOAP, XML, JMS and Web Services.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Shared responsibility for administration of Hadoop, Hive and Pig.
  • Developed Hive queries for the analysts.
  • Helped business processes by developing, installing and configuring Hadoop ecosystem components that moved data from individual servers to HDFS.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios (see the HBase sketch after this list).
  • Supported code/design analysis, strategy development and project planning.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a mapper sketch follows this list).
  • Assisted with data capacity planning and node forecasting.
  • Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
  • Administered Pig and Hive, installing updates, patches and upgrades.
  • Handled structured and unstructured data and applied ETL processes.
  • Developed workflow in Oozie to automate the tasks of loading the data into HDFS and pre-processing with Pig.
  • Coded complex Oracle stored procedures, functions, packages and cursors for client-specific applications.
  • Provided production rollout support, resolving issues discovered by the client and client services teams.
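
As a sketch of the data-cleaning MapReduce work mentioned above (the delimiter, field count and class name are assumptions, not the actual job):

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Map-only cleaning step: drops malformed pipe-delimited records and
    // re-emits the rest tab-delimited, ready for partitioned staging tables.
    public class CleanRecordsMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private static final int EXPECTED_FIELDS = 8; // assumed record width

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            if (line.split("\\|", -1).length != EXPECTED_FIELDS) {
                context.getCounter("clean", "malformed").increment(1);
                return; // skip records that do not match the expected layout
            }
            context.write(NullWritable.get(), new Text(line.replace('|', '\t')));
        }
    }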
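
And a minimal, hypothetical example of writing one row into an HBase table through the Java client API of that era (the table, column family and row key are illustrative):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class PortfolioLoader {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "portfolio_data"); // illustrative table name
            try {
                Put put = new Put(Bytes.toBytes("acct42#20120701")); // composite row key
                put.add(Bytes.toBytes("d"), Bytes.toBytes("balance"),
                        Bytes.toBytes("1024.50"));
                table.put(put);
            } finally {
                table.close();
            }
        }
    }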

Environment: Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Hadoop distributions from Hortonworks, Cloudera, MapR and DataStax, IBM DataStage 8.1 (Designer, Director, Administrator), PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting.

Confidential, Costa Mesa, CA

Senior J2EE & Hadoop Developer

Responsibilities:

  • Involved in designing and developing Hadoop MapReduce jobs in Java for batch processing to search and match the scores.
  • Used Rational Rose for developing Use case diagrams, Activity flow diagrams, Class diagrams and Object diagrams in the design phase.
  • Used Struts with Tiles in the MVC framework for the application.
  • Extensively worked on Servlets, JSPs, Struts 1.3 and Tiles, JavaScript, Expression language, JSTL, JSP custom tags.
  • Involved in using a SolrCloud implementation to provide real-time search capabilities on the repository with terabytes of data.
  • Involved in developing Hadoop MapReduce jobs for merging and appending the repository data.
  • Hands-on experience setting up an HBase column-based storage repository for archiving and retro data.
  • Developed XML schemas and a DOM parser for the XML documents used for data transfers, and developed XSLT code for them.
  • Configured Struts-config.xml, ejb-jar.xml and web.xml on the application.
  • Used Apache CXF web service stack for developing web services and SOAP UI and XML-SPY for testing web services.
  • Used MyEclipse 6.0.1 IDE for application development.
  • Used Hibernate 3.0 in data access layer to access and update information in the database.
  • Used Java Message Service (JMS 1.1) for reliable and asynchronous exchange of important messages.
  • Involved in agile SCRUM methodology implementation.
  • Involved in various performance projects to increase the response time of the application.
  • Involved in integration of Legacy Scoring and Analytical Models like SMG3 into the new application using Web Services.
  • Involved in developing a batch processing application using multi-threaded executor pools for faster processing (see the executor sketch after this list).
  • Responsible for writing Pig UDFs and Hive UDFs (a Pig UDF sketch also follows this list).
  • Handled importing of data from various data sources and performed transformations using Hive.
  • Optimized MapReduce jobs using combiners and partitioners to deliver the best results, and worked on application performance optimization for an HDFS cluster.
  • Created various calculated fields, visualizations and dashboards using Tableau Desktop.
  • Published the dashboards created in Tableau Desktop to Tableau Server.
  • Used Log4j for logging and debugging and used JUnit extensively for testing.
  • Handled the scalability tooling for the framework.
  • Refactored code to optimize calls to various system components.
  • Worked with offshore teams, communicating daily status on issues and roadblocks.
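
A minimal sketch of the executor-pool batch pattern referenced above; the pool size, task shape and class name are assumptions:

    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Runs independent scoring tasks in parallel on a fixed-size thread pool.
    public class BatchScorer {
        public List<Future<Integer>> scoreAll(List<Callable<Integer>> tasks)
                throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(8); // assumed size
            try {
                // invokeAll blocks until every submitted task has completed.
                return pool.invokeAll(tasks);
            } finally {
                pool.shutdown();
            }
        }
    }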
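
And a hypothetical example of a simple Pig UDF of the sort described, extending EvalFunc; the cleaning rule is illustrative:

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical Pig UDF: strips non-digit characters from an ID field.
    public class DigitsOnly extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return input.get(0).toString().replaceAll("[^0-9]", "");
        }
    }

In a Pig script such a UDF is made available with REGISTER and then invoked inside a FOREACH ... GENERATE expression.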

Environment: Java, J2EE, Tableau Desktop, Tableau Server, Hadoop, HBase, Kettle, Zookeeper, SolrCloud, Pig Latin, Oozie scheduler, JavaBeans, Agile SCRUM, IBM DataPower, JProfiler, Spring, Struts 1.3, Hibernate 3.0, JBoss Application Server, Eclipse, Rational ClearCase, CXF 2.2.4, JNDI, JavaScript, Servlet 2.3, JUnit, Maven, SVN, XML Web Services, HTML, DB2, JDBC, ANT, UML, Unix, Windows NT/2000.

Confidential, Webster, MA

JAVA Developer

Responsibilities:

  • Involved in the analysis, design, development and testing phases of the Software Development Life Cycle (SDLC).
  • Used Rational Rose for developing Use case diagrams, Activity flow diagrams, Class diagrams and Object diagrams in the design phase.
  • Analyzed, designed and developed the application based on J2EE using Struts with Tiles, Spring 2.0 and Hibernate 3.0.
  • Involved in interacting with the Business Analyst and Architect during the Sprint Planning Sessions.
  • Used XML Web Services for transferring data between different applications.
  • Used Apache CXF web service stack for developing web services and SOAP UI and XML-SPY for testing web services.
  • Used JAXB for binding XML to Java; used SAX and DOM parsers to parse XML data and XPath to query XML documents.
  • Hibernate was used for Object Relational mapping with Oracle database.
  • Worked with Spring IOC to inject beans and reduce coupling between classes.
  • Involved in developing the user interface using Struts.
  • Implemented Spring IOC (Inversion of Control)/DI (Dependency Injection) to wire object dependencies across the application.
  • Integrated Spring and the Hibernate ORM framework for persistence, using HibernateDaoSupport with HibernateTemplate to access data (see the DAO sketch after this list).
  • Implemented Spring transaction management for the application.
  • Implemented the Service Locator design pattern.
  • Performed unit testing using JUnit 3 and the EasyMock testing framework.
  • Worked on PL/SQL stored procedures using PL/SQL Developer.
  • Involved in fixing production defects for the application.
  • Used Eclipse as IDE for application development.
  • Used ANT as build-tool for building J2EE applications.
  • Used Tomcat 5.5 for application deployment.
  • Participated in SCRUM software development process as part of agile software development methodology.
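
A minimal sketch of the HibernateDaoSupport/HibernateTemplate data-access pattern mentioned above, assuming Spring 2.x with Hibernate 3; the Customer entity and DAO are hypothetical, and the session factory would be injected via the Spring configuration:

    import java.io.Serializable;

    import org.springframework.orm.hibernate3.support.HibernateDaoSupport;

    // Illustrative entity, assumed to be mapped in Customer.hbm.xml.
    class Customer implements Serializable {
        private Long id;
        private String name;
        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    // DAO built on HibernateDaoSupport; Spring wires in the sessionFactory.
    public class CustomerDaoImpl extends HibernateDaoSupport {
        public Customer findById(Long id) {
            // HibernateTemplate opens and releases the session for us.
            return (Customer) getHibernateTemplate().get(Customer.class, id);
        }
        public void save(Customer customer) {
            getHibernateTemplate().saveOrUpdate(customer);
        }
    }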

Environment: Java 1.6, Struts 1.2.3, PL/SQL, Spring IOC, Spring Transaction Management, Hibernate 3.0, Spring 2.0, JSP 2.0, Oracle 11g, Eclipse, JUnit 3, PL/SQL Developer, Application Server, JDBC, Maven, CVS, Harvest, UML, XML Web Services.

Confidential, Stamford, CT

Software Engineer

Responsibilities:

  • Involved in the Design, Coding, Testing and Implementation of the web application.
  • Developed JSPs (JavaServer Pages) starting from HTML and detailed technical design specification documents. Pages included HTML, CSS, JavaScript, Hibernate and JSTL.
  • Developed SOAP based requests for communicating with Web Services.
  • Used agile strategies to provide quick, feasible solutions to the organization.
  • Implemented HTTP modules for different applications in the Struts framework using Servlets, JSP, ActionForm, Action classes and ActionMapping (an Action sketch follows this list).
  • Developed web applications using the MVC framework with Spring, Struts and Hibernate.
  • Involved in the creation of custom interceptors for Validation purposes.
  • Analyzed and fixed defects in the Login application.
  • Involved in dynamically creating error elements on demand when an error occurs.
  • Involved in building Ajax-based rich browser user interfaces.
  • Ensured design consistency with client’s development standards and guidelines.
  • Improved user experience by designing and creating new web components and features.
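
For illustration, a skeletal Struts 1 Action of the kind described above; the parameter and forward names are assumptions and would be defined in struts-config.xml:

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    // Hypothetical login Action; "success" and "failure" map to forwards
    // declared in struts-config.xml.
    public class LoginAction extends Action {
        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                HttpServletRequest request, HttpServletResponse response) {
            String user = request.getParameter("username");
            if (user == null || user.trim().length() == 0) {
                return mapping.findForward("failure");
            }
            request.getSession().setAttribute("user", user);
            return mapping.findForward("success");
        }
    }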

Environment: Java, J2EE, Struts, SOAP web services, SOA, Spring, Hibernate, JavaScript, jQuery, Oracle, AJAX, JSP, Servlets, Eclipse, CVS Source control, Linux.
