Sr. Big Data/Hadoop Developer Resume

Farmington Hills, MI

SUMMARY

  • Overall 8 years of experience in Information Technology, including Big Data, the Hadoop ecosystem, and Java development, with strong skills in design, software processes, requirements gathering, analysis, and development of software applications in the roles of Programmer Analyst, Big Data Developer, and ETL Expert.
  • Excellent hands-on experience in developing Hadoop architecture on Windows and Linux platforms.
  • Experience in building Big Data solutions on the Lambda Architecture using the Cloudera distribution of Hadoop, Twitter Storm, Trident, MapReduce, Cascading, Hive, Pig, and Sqoop.
  • Participated in building CDH4 test cluster for implementing Kerberos authentication.
  • Experience in building the Cloudera distribution of Hadoop with Knox Gateway and Apache Ranger.
  • Experience in analyzing and recommending Big Data solutions.
  • Experience in analyzing marketing requirements and translating them into technical specifications.
  • Experience in capacity planning, hardware recommendations, performance tuning and benchmarking.
  • Experience working both independently and collaboratively to solve problems and deliver high-quality results in a fast-paced, unstructured environment.
  • Good experience in optimizing MapReduce algorithms using mappers, reducers, combiners, and partitioners to deliver the best results for large datasets (a minimal sketch follows this list).
  • Set up standards and processes for Hadoop based application design and implementation.
  • Performed data analytics using Pig and applied data warehousing concepts using Hive within the team. Extended Hive and Pig core functionality with custom UDFs.
  • Performed data copies from legacy RDBMS systems using Sqoop.
  • Maintained list of source systems and data copies, tools used in data ingestion, landing location in Hadoop.
  • Good knowledge of data capacity and node forecasting and planning.
  • Good knowledge of data compression and serialization formats such as Snappy and Avro.
  • Implemented various Pentaho Data Integration steps to cleanse and load data per business needs.
  • Good experience in configuring the Pentaho Data Integration server to run jobs in local, remote-server, and cluster modes.
  • Performed platform and component level application updates, patches and upgrades.
  • Strong work experience in enterprise financial transactions and money movement activities.
  • Dealt with huge transaction volumes while interfacing front-end applications written in Java, JSP, Struts, WebWork, Spring, JSF, Hibernate, web services, and EJB with WebSphere Application Server and JBoss.
  • Experience with NoSQL database concepts, including HBase and MongoDB.
  • Working knowledge of Node.js and Express JavaScript Framework.
  • Exposure to Android and iPhone mobile application development.
  • Over 5 years of experience in design and development of various web applications with n-tier Architecture using MVC and J2EE Architecture techniques.
  • Strong working experience using Agile methodologies including Scrum and Kanban.
  • Experience in Test Driven Development (TDD), mocking frameworks, and Continuous Integration (Hudson & Jenkins).
  • Delivered zero-defect code for three large projects that involved changes to both the front end (Java, presentation services) and back end (DB2).
  • Proactively suggested a tactical solution for the Quiet Time Alerts project utilizing the message broker architecture, saving 400 man-hours from the initial project estimate.
  • Strong experience in designing message flows, writing complex ESQL scripts, and invoking web services through message flows.
  • Designed and developed a Batch Framework similar to Spring Batch framework.
  • Expertise in using Design and Architectural patterns.
  • Exposure to PCI and HIPAA compliance policies and HL7 standards using Mirth Connect.
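
For illustration, here is a minimal sketch of the combiner-and-partitioner pattern referenced above, written in Java against the Hadoop MapReduce API. The job name, class names (EventCountJob, EventMapper, and so on), and the comma-delimited input layout are assumptions for this example, not details taken from the projects in this resume.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Partitioner;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class EventCountJob {

  // Emits (eventType, 1) per input line; assumes the event type is the first comma-separated field.
  public static class EventMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text eventType = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      eventType.set(value.toString().split(",")[0]);
      context.write(eventType, ONE);
    }
  }

  // Sums the counts; reused as the combiner so map output is pre-aggregated before the shuffle.
  public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      context.write(key, new IntWritable(sum));
    }
  }

  // Custom partitioner: here it mirrors hash partitioning, but this is where key-routing logic goes.
  public static class EventPartitioner extends Partitioner<Text, IntWritable> {
    @Override
    public int getPartition(Text key, IntWritable value, int numPartitions) {
      return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "event count");
    job.setJarByClass(EventCountJob.class);
    job.setMapperClass(EventMapper.class);
    job.setCombinerClass(SumReducer.class);      // combiner cuts shuffle volume
    job.setReducerClass(SumReducer.class);
    job.setPartitionerClass(EventPartitioner.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Reusing the reducer as the combiner is the usual shortcut when the reduce function is associative and commutative, which is what makes the shuffle-volume savings safe.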

TECHNICAL SKILLS

Big Data: Hadoop, Storm, Trident, HBase, Hive, Flume, Cassandra, Sqoop, Oozie, Pig, MapReduce, ZooKeeper, YARN.

Operating Systems: UNIX, Mac, Linux, Windows 2000 / NT / XP / Vista, Android

Programming Languages: Java (JDK 1.4/1.5/1.6), C/C++, MATLAB, R, HTML, SQL, PL/SQL.

Frameworks: Hibernate 2.x/3.x, Spring 2.x/3.x, Struts 1.x/2.x, and JPA.

Web Services: WSDL, SOAP, Apache CXF/XFire, Apache Axis, REST, Jersey.

Databases/technologies: Oracle 8i/9i/10g, Microsoft SQL Server, DB2 & MySQL 4.x/5.x.

Middleware Technologies: WebSphere MQ, WebSphere Message Broker, XML Gateway, JMS.

Web Technologies: J2EE, SOAP & REST web services, JSP, Servlets, EJB, JavaScript, Struts, Spring, WebWork, Direct Web Remoting, HTML, XML, JMS, JSF, Ajax.

Testing Frameworks: Mockito, PowerMock, EasyMock

Web/Application Servers: IBM WebSphere Application Server, JBoss, Apache Tomcat.

Others: Borland StarTeam, ClearCase, JUnit, Ant, Maven, Android Platform, Microsoft Office, SQL Developer, DB2 Control Center, Microsoft Visio, Hudson, Subversion, Git, Nexus, Artifactory, and Trac.

Development Strategies: Agile, Lean Agile, Pair Programming, Waterfall, and Test Driven Development.

PROFESSIONAL EXPERIENCE

Confidential, Farmington Hills, MI

Sr. Big Data/Hadoop Developer

Responsibilities:

  • Gathered the business requirements from the Business Partners and Subject Matter Experts.
  • Worked with Data Modeler and DBAs to build the data model and table structures. Actively participated in discussion sessions to design the ETL job flow.
  • Worked with 10+ source systems and received batch files from heterogeneous systems such as Unix, Windows, Oracle, mainframe, and DB2.
  • Handled 20 TB of data volume on a 10-node cluster in the test environment.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Managed and reviewed the Hadoop log files.
  • Supported HBase architecture design with the Hadoop architect team to develop a database design in HDFS.
  • Supported MapReduce programs running on the cluster and wrote MapReduce jobs using the Java API.
  • Involved in HDFS maintenance and loading of structured and unstructured data.
  • Imported data from mainframe datasets to HDFS using Sqoop.
  • Handled importing data from various data sources (Oracle, DB2, Cassandra, and MongoDB) into Hadoop and performed transformations using Hive and MapReduce.
  • Used Hive data warehouse tool to analyze the data in HDFS and developed Hive queries.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Wrote Pig Latin scripts and developed UDFs for Pig data analysis (see the sketch after this list).
  • Involved in managing and reviewing Hadoop log files.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Utilized Agile Scrum Methodology to help manage and organize a team of 4 developers with regular code review sessions.
  • Installed and configured Pentaho for performing data integration.
  • Involved in transforming data from mainframe tables to HDFS and HBase tables using Sqoop and Pentaho Kettle.
  • Participated in building CDH4 test cluster for implementing Kerberos authentication.
  • Upgraded the Hadoop cluster from CDH4 to CDH5 and set up a high-availability cluster to integrate Hive with existing applications.
  • Analyzed the data by performing Hive queries and running Pig scripts to know user behavior.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Developed Hive queries to process the data and generate data cubes for visualization.
  • Optimized the mappings using various optimization techniques and also debugged some existing mappings using the Debugger to test and fix the mappings.
  • Updated maps, sessions, and workflows as part of ETL changes; also modified existing ETL code and documented the changes.
  • Exported the result set from Hive to MySQL using Kettle (the Pentaho Data Integration tool).
  • Generated reports using Pentaho that were consumed by the business analysts.
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs.
  • Familiar with Scala, including closures, higher-order functions, and monads.
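
A minimal sketch of a Pig UDF of the kind referenced above, written in Java against Pig's EvalFunc API. The class name (NormalizeCity) and the normalization rule are illustrative assumptions, not details of the actual project.

```java
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class NormalizeCity extends EvalFunc<String> {
  @Override
  public String exec(Tuple input) throws IOException {
    if (input == null || input.size() == 0 || input.get(0) == null) {
      return null; // Pig treats null as missing data
    }
    // Trim whitespace and upper-case the value so grouping keys stay consistent.
    return input.get(0).toString().trim().toUpperCase();
  }
}
```

In Pig Latin, the compiled jar would be registered with REGISTER and the function invoked inside a FOREACH ... GENERATE statement.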

Environment: Hadoop, Java, MapReduce, HDFS, Hive, Pig, Linux, XML, Eclipse, Cloudera CDH4/5 distribution, Pentaho, Cassandra 2.0.5/11, Kettle, Pentaho Big Data, DB2, SQL Server, Oracle 11i, MySQL.

Confidential, San Antonio, TX

Big Data/Hadoop Developer

Responsibilities:

  • Maintained System integrity of all sub-components (primarily HDFS, MR, HBase, and Hive).
  • Maintained the Hadoop cluster using Knox Gateway and Apache Ranger. Integrated the Hive warehouse with HBase.
  • Migrated the needed data from MySQL into HDFS using Sqoop and imported various formats of flat files into HDFS.
  • Loaded the data into HBase tables for the UI web application.
  • Wrote customized Hive UDFs in Java where the required functionality was too complex for built-in functions (see the sketch after this list).
  • Maintained system integrity of all Hadoop sub-components.
  • Applied data warehousing concepts such as designing and creating Hive external tables with a shared metastore (instead of Derby), using partitioning, dynamic partitioning, and bucketing.
  • Wrote HiveQL scripts to create, load, and query tables in Hive.
  • Worked with HiveQL on big data of logs to perform a trend analysis of user behavior on various online modules.
  • Supported MapReduce programs running on the cluster.
  • Monitored system health and logs and responded accordingly to any warning or failure conditions.
  • Performed advanced procedures such as text analytics and processing using the in-memory computing capabilities of Spark with Scala.
  • Worked on Big Data integration and analytics based on Hadoop, SOLR, Spark, Kafka, Storm, and webMethods technologies.
  • Used Pentaho Data Integration in the Hortonworks environment to move data from different web services and REST APIs (JSON/XML) into the Hadoop data lake.
  • Performed installation and configuration of Pentaho.
  • Implemented Spark using Scala and Spark SQL for faster testing and processing of data.
  • Streamed data in real time using Spark with Kafka.
  • Worked on migrating MapReduce programs into Spark transformations using Spark and Scala.
  • Generated final reporting data using Tableau for testing by connecting to the corresponding Hive tables using the Hive ODBC connector.
  • Recommended bringing in Elasticsearch and was responsible for its installation, configuration, and administration.
  • Developed and maintained efficient Talend ETL jobs for data ingestion.
  • Worked on the Talend RTX ETL tool; developed and scheduled jobs in Talend Integration Suite.
  • Modified reports and Talend ETL jobs based on feedback from QA testers and users in the development and staging environments.
  • Involved in migrating Hadoop jobs into higher environments such as SIT, UAT, and Prod.
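
A minimal sketch of a custom Hive UDF in Java, as referenced in the list above; the class name (MaskAccount) and the masking rule are illustrative assumptions, not project details.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MaskAccount extends UDF {
  // Keeps only the last four characters of an account number, masking the rest.
  public Text evaluate(Text account) {
    if (account == null) {
      return null;
    }
    String s = account.toString();
    int keep = Math.min(4, s.length());
    StringBuilder masked = new StringBuilder();
    for (int i = 0; i < s.length() - keep; i++) {
      masked.append('*');
    }
    masked.append(s.substring(s.length() - keep));
    return new Text(masked.toString());
  }
}
```

After packaging, such a function would typically be exposed to HiveQL via ADD JAR and CREATE TEMPORARY FUNCTION before being used in queries.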

Environment: Hortonworks Hadoop 2.3, HDFS, Hive, HQL scripts, Scala, MapReduce, Storm, Java, HBase, Pig, Sqoop, shell scripts, Oozie Coordinator, MySQL, Tableau, Pentaho Kettle, Elasticsearch, Talend, and SFTP.

Confidential, Irving, TX

Hadoop / Java Developer

Responsibilities:

  • Re-architected all the applications to utilize the latest infrastructure within a span of three months and helped the developers implement them successfully.
  • Designed the Hadoop jobs to create the product recommendation using collaborative filtering.
  • Designed the COSA pretest utility framework using JSF MVC, JSF validation, tag libraries, and JSF backing beans.
  • Integrated the Order Capture system with Sterling OMS using a JSON web service.
  • Configured the ESB to transform the Order capture XML to Sterling message.
  • Configured and Implemented Jenkins, Maven and Nexus for continuous integration.
  • Mentored and implemented the test driven development (TDD) strategies.
  • Loaded the data from Oracle to HDFS using Sqoop.
  • Developed data transformation scripts using Hive and MapReduce.
  • Designed and developed user-defined functions (UDFs) for Hive in Java.
  • Loaded data into HBase using bulk load and the HBase API (see the sketch after this list).
  • Designed and implemented the open API using Spring REST web services.
  • Proposed the integration pipeline testing strategy using Cargo.
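
A minimal sketch of writing rows to HBase through the Java client API, as referenced above. The table name, column family, qualifier, and row key are assumptions for illustration; projects of this era may equally have used the older HTable-based client API.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class RecommendationLoader {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    try (Connection connection = ConnectionFactory.createConnection(conf);
         Table table = connection.getTable(TableName.valueOf("product_recommendations"))) {
      // One Put per row key; each addColumn call writes a single cell.
      Put put = new Put(Bytes.toBytes("customer-1001"));
      put.addColumn(Bytes.toBytes("rec"), Bytes.toBytes("top_product"), Bytes.toBytes("SKU-42"));
      table.put(put);
    }
  }
}
```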

Environment: Java, JSP, Spring, JSF, REST web services, AspectJ, IntelliJ, WebLogic, Subversion, Git, Jenkins, Nexus, jQuery, Oracle, Mockito, PowerMock, Hadoop, Sqoop, HBase, Hive, Sterling OMS, TDD, and Agile.

Confidential

Java Developer

Responsibilities:

  • Designed and coded the appeals module to replicate declined credit applications so they could proceed for approval with the necessary changes, using Spring, JSF, and Hibernate.
  • Designed the comparison utility to compare credit applications and check whether repricing was required.
  • Created generic Castor mappings and applied XSLT to generate the SOAP requests used to invoke the middleware services.
  • Developed pre- and post-request interceptors to perform basic database operations.
  • Developed asset-level, contract-level, and short-funding modules and integrated them with the existing modules.
  • Designed and developed the Cost modules for Assets and Properties.
  • Used the REST Client add-on to unit test the web services.
  • Generated the contract documents using iText (see the sketch after this list).
  • Involved in writing SQL/PLSQL queries to retrieve the data from the Oracle Database.
  • Developed presentation layer (view) components using JSP and HTML to display the Order confirmation.
  • Wrote code to create a dashboard template.
  • Involved in developing code for a script editor in the DBM module.
  • Developed complex SQL on various databases such as DB2 and SQL Server.
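
A minimal sketch of generating a contract document with iText, as referenced above, using the iText 5 (com.itextpdf) API; the file name and paragraph text are placeholders, and older projects may have used the com.lowagie package instead.

```java
import java.io.FileOutputStream;

import com.itextpdf.text.Document;
import com.itextpdf.text.Paragraph;
import com.itextpdf.text.pdf.PdfWriter;

public class ContractDocument {
  public static void main(String[] args) throws Exception {
    Document document = new Document();
    // Bind the document to a PDF writer targeting the output file.
    PdfWriter.getInstance(document, new FileOutputStream("contract.pdf"));
    document.open();
    document.add(new Paragraph("Lease Contract"));
    document.add(new Paragraph("Asset and term details would be rendered here."));
    document.close();
  }
}
```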

Environment: Java, JSP, web services, Hibernate, Log4j, Eclipse, Subversion, Spring, REST, JSF, XSLT, Oracle, TOAD, Waterfall, PL/SQL, SOAP.

Confidential

Software Developer

Responsibilities:

  • Designed, developed & deployed the front-end dynamic web pages using JSP, JavaScript, HTML, CSS and AJAX.
  • Used Jakarta Struts Framework to implement Model View Controller (MVC) architecture to promote loose coupling and make the application more scalable in future.
  • Used Struts tag libraries extensively in the presentation layer.
  • Created Session Beans and Entity Beans for implementing the Business Logic.
  • Involved in performing server-side validation using the Struts Validation framework.
  • Involved in the development of REST-based web services using the Spring Web Services module.
  • Tested the web services using SoapUI and REST client tools.
  • Consumed web services from other vendors to retrieve information using the Spring REST client (see the sketch after this list).
  • Involved in the development of Tables, Stored procedures, Database Triggers and Functions to retrieve/update the data from/to the database.
  • Actively involved in testing and deployment of the application on WebLogic Application Server.
  • Used JUnit for unit testing.
  • Used CVS source configuration management to maintain the code within the organization.
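
A minimal sketch of consuming a vendor REST endpoint with Spring's RestTemplate, as referenced above; the URL and the String response type are placeholders for illustration.

```java
import org.springframework.web.client.RestTemplate;

public class VendorClient {
  public static void main(String[] args) {
    RestTemplate restTemplate = new RestTemplate();
    // getForObject issues an HTTP GET and converts the response body to the requested type.
    String response =
        restTemplate.getForObject("https://vendor.example.com/api/orders/123", String.class);
    System.out.println(response);
  }
}
```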

Environment: J2EE (JSP, Servlets, Hibernate, Struts Framework), JavaScript, WebLogic, Oracle, MySQL, XML, Ajax, MVC, REST, SoapUI, JUnit.
