
Sr. Hadoop/Scala Developer Resume


Plano, TX

PROFESSIONAL SUMMARY:

  • 8+ years of experience in IT, including analysis, design, and development of Big Data solutions using Hadoop; design and development of web applications using Java and J2EE; and database and data warehousing development using MySQL, Oracle, and Informatica.
  • 5+ years of work experience in Big Data analytics, with hands-on experience installing, configuring, and using ecosystem components such as Hadoop MapReduce, HDFS, HBase, ZooKeeper, Hive, Sqoop, Pig, Flume, Cassandra, Kafka, and Spark.
  • Good understanding of Hadoop architecture and hands-on experience with Hadoop components such as Job Tracker, Task Tracker, Name Node, Data Node, MapReduce concepts, and the HDFS framework.
  • Experience in using Cloudera Manager for installation and management of single-node and multi-node Hadoop clusters (CDH4 & CDH5).
  • Experience in data load management, importing and exporting data using Sqoop and Flume.
  • Experience in analyzing data using Hive, Pig and custom MR programs in Java.
  • Experience in scheduling and monitoring jobs using Oozie and coordinating cluster services using ZooKeeper.
  • Experienced in writing MapReduce programs and UDFs for both Pig and Hive in Java.
  • Experience in dealing with log files to extract data and copy it into HDFS using Flume.
  • Developed Hadoop test classes using MRUnit for checking input and output.
  • Experience in integrating Hive and HBase for effective operations.
  • Developed Pig UDFs to pre-process the data for analysis (a minimal sketch follows this list).
  • Experience in Impala, Solr, MongoDB, HBase and Spark.
  • Hands-on knowledge of writing code in Scala.
  • Proficient in Core Java, J2EE, JDBC, Servlets, JSP, Exception Handling, Multithreading, EJB, XML, HTML5, CSS3, JavaScript, AngularJS.
  • Used source debuggers and visual development environments.
  • Experience in Testing and documenting software for client applications.
  • Wrote code to create single-threaded, multi-threaded, and user-interface event-driven applications, both stand-alone and those that access servers or services.
  • Good experience in object-oriented design (OOP) concepts.
  • Good experience in using data modeling techniques and deriving results with SQL and PL/SQL queries.
  • Good working knowledge of the Spring Framework.
  • Strong Experience in writing SQL queries.
  • Experience working with different databases, such as Oracle, SQL Server, and MySQL, and writing stored procedures, functions, joins, and triggers for different data models.
  • Expertise in implementing Service-Oriented Architectures (SOA) with XML-based web services (SOAP/REST).
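
The Pig UDFs mentioned above were written in Java; a minimal sketch of the same idea follows, in Scala (the headline language of this resume) to keep all examples here in one language. The class name and normalization logic are illustrative assumptions, not code from an actual engagement.

    import org.apache.pig.EvalFunc
    import org.apache.pig.data.Tuple

    // Pre-processing UDF: trims and upper-cases a chararray field so that
    // downstream grouping is not split by stray whitespace or casing.
    class NormalizeField extends EvalFunc[String] {
      // Pig calls exec() once per input tuple; returning null drops the value.
      override def exec(input: Tuple): String = {
        if (input == null || input.size() == 0 || input.get(0) == null) null
        else input.get(0).toString.trim.toUpperCase
      }
    }

    // In Pig Latin (after packaging into a jar):
    //   REGISTER udfs.jar;
    //   cleaned = FOREACH raw GENERATE NormalizeField(name);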

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, Oozie, HBase, Spark, and Hadoop distributions (Cloudera CDH)

Programming Languages: Java (5, 6, 7), Python, Scala

Databases/RDBMS: MySQL, SQL/PL-SQL, MS-SQL Server 2005, Oracle 9i/10g/11g

Scripting/ Web Languages: JavaScript, HTML5, CSS3, XML, SQL, Shell

NoSQL/Search Stores: Cassandra, HBase, Elasticsearch

Operating Systems: Linux, Windows XP/7/8

Software Life Cycles: SDLC, Waterfall and Agile models

Office Tools: MS Office, MS Project, Risk Analysis tools, and Visio

Utilities/Tools: Eclipse, Tomcat, NetBeans, JUnit, SQL, SVN, Log4j, SOAP UI, ANT, Maven, Automation, and MRUnit

Cloud Platforms: Amazon EC2

PROFESSIONAL EXPERIENCE:

Confidential, Plano, TX

Sr. Hadoop/Scala Developer

Responsibilities:

  • Developed UDFs in Java for Hive and Pig and worked on reading multiple data formats on HDFS using Scala.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala (see the sketch after this list).
  • Developed multiple POCs using Scala, deployed them on the YARN cluster, and compared the performance of Spark with Hive and SQL/Teradata.
  • Analyzed the SQL scripts and designed the solution to implement them using Scala.
  • Developed an analytical component using Scala, Spark, and Spark Streaming.
  • Collected and aggregated large amounts of log data using Apache Flume and staged data in HDFS for further analysis.
  • Developed scripts and automated data management from end to end, including sync-up between all the clusters.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that invoke and run MapReduce jobs in the backend.
  • Involved in migration from Livelink to SharePoint using Scala through RESTful web services.
  • Extensively involved in developing RESTful APIs using the JSON library of the Play framework.
  • Used the Scala collection framework to store and process the complex consumer information.
  • Used Scala functional programming concepts to develop business logic.
  • Designed and implemented an Apache Spark application (Cloudera).
  • Imported and exported data into HDFS using Sqoop, Flume, and Kafka.
  • Troubleshot and debugged Hadoop ecosystem run-time issues.
  • Analyzed affected code objects and designed suitable algorithms to address problems.
  • Assisted in performing unit testing of MapReduce jobs using MRUnit.
  • Assisted in exporting data into Cassandra and writing column families to provide fast listing outputs.
  • Used the Oozie scheduler to automate the pipeline workflow and orchestrate the MapReduce extraction jobs.
  • Used ZooKeeper to provide coordination services to the cluster.
  • Worked with the Hue GUI for easy job scheduling, file browsing, job browsing, and metastore management.
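
A minimal sketch of the Hive-to-Spark conversion described above, using the Spark 1.x HiveContext API that matches the CDH era in the environment below; the table and column names are assumptions for illustration only.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object HiveToSparkSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("HiveToSparkSketch"))
        val hive = new HiveContext(sc)

        // Original HiveQL being replaced (assumed table/columns):
        //   SELECT customer_id, SUM(amount) FROM sales GROUP BY customer_id
        val viaSql = hive.sql(
          "SELECT customer_id, SUM(amount) AS total FROM sales GROUP BY customer_id")
        viaSql.show(10)

        // The same aggregation expressed as Spark RDD transformations in Scala:
        val viaRdd = hive.table("sales")
          .select("customer_id", "amount").rdd
          .map(row => (row.getString(0), row.getDouble(1)))
          .reduceByKey(_ + _)   // distributed shuffle instead of a Hive MR job

        viaRdd.take(10).foreach(println)
        sc.stop()
      }
    }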

Environment: Apache Hadoop, HDFS, Hive, Java, Sqoop, Spark, Cloudera CDH4, Oracle, MySQL, Tableau, Talend, Elasticsearch, Kibana, SFTP.

Confidential, Kansas, MO

Scala Spark Programmer

Responsibilities:

  • Worked with the BI team on Big Data Hadoop cluster implementation and data integration, developing large-scale system software.
  • Processed incoming files using the Spark native API.
  • Used the Spark Streaming and Spark SQL APIs to process the files.
  • Developed Spark scripts using Scala shell commands as per the requirements.
  • Processed schema-oriented and non-schema-oriented data using Scala and Spark.
  • Developed and designed a system to collect data from multiple portals using Kafka and then process it using Spark (see the sketch after this list).
  • Developed and designed an automated process using shell scripting for data movement and purging.
  • Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Developed Scala scripts and UDFs using both DataFrames/SQL/Datasets and RDDs/MapReduce in Spark 1.6 for data aggregation and queries, and wrote data back into the OLTP system through Sqoop.
  • Handled large datasets using partitions, Spark in-memory capabilities, broadcasts, and effective and efficient joins and transformations during the ingestion process itself.
  • Designed, developed, and maintained data integration programs in a Hadoop and RDBMS environment with both traditional and non-traditional source systems, as well as RDBMS and NoSQL data stores, for data access and analysis.
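
A minimal sketch of the Kafka-to-Spark collection path described above, using the direct-stream API from the Spark 1.6 era named in these bullets; the broker address, topic name, and output path are assumptions for illustration.

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object PortalIngestSketch {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("PortalIngestSketch"), Seconds(30))

        val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")  // assumed broker
        val topics = Set("portal_events")                                // assumed topic

        // Direct stream from Kafka (spark-streaming-kafka, Spark 1.6 API)
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics)

        // Keep only well-formed records and land each batch in HDFS
        stream.map(_._2)
          .filter(_.nonEmpty)
          .foreachRDD { rdd =>
            if (!rdd.isEmpty()) rdd.saveAsTextFile(s"/data/portal/${System.currentTimeMillis()}")
          }

        ssc.start()
        ssc.awaitTermination()
      }
    }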

Environment: Hadoop, MapReduce, HDFS, Scala, Spark, Cloudera Manager, Pig, Sqoop, ZooKeeper, Teradata, PL/SQL, MySQL, Windows, HBase.

Confidential, Louisville, KY

Bigdata Developer

Responsibilities:

  • Wrote MapReduce code to process all the log files against rules defined in HDFS (log files generated by different devices follow different XML rules).
  • Developed and designed an application to process data using Spark.
  • Developed MapReduce jobs and Hive and Pig scripts for a data warehouse migration project.
  • Developed and designed a system to collect data from multiple portals using Kafka and then process it using Spark.
  • Developed MapReduce jobs and Hive and Pig scripts for a Risk & Fraud Analytics platform.
  • Developed a data ingestion platform using Sqoop and Flume to ingest Twitter and Facebook data for a Marketing & Offers platform.
  • Developed and designed an automated process using shell scripting for data movement and purging.
  • Installed and managed the configuration of a small multi-node Hadoop cluster.
  • Installed and configured other open-source software such as Pig, Hive, Flume, and Sqoop.
  • Developed programs in Java and Scala/Spark to reformat data after extraction from HDFS for analysis.
  • Wrote Hive jobs to parse the logs and structure them in tabular format to facilitate effective querying of the log data.
  • Imported and exported data into Impala, HDFS, and Hive using Sqoop.
  • Responsible for managing data coming from different sources.
  • Implemented partitioning, dynamic partitions, and buckets in Hive for efficient data access (see the sketch after this list).
  • Developed Hive tables to transform and analyze the data in HDFS.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Involved in running Hadoop jobs to process millions of records of text data.
  • Developed the application using the Struts framework.
  • Created connections through JDBC and used JDBC statements to call stored procedures.
  • Developed Pig Latin scripts to extract the data from the web server output files and load it into HDFS.
  • Developed Pig UDFs to pre-process the data for analysis.
  • Implemented multiple MapReduce jobs in Java for data cleansing and pre-processing.
  • Moved all RDBMS data, as flat files generated from various channels, into HDFS for further processing.
  • Developed job workflows in Oozie to automate the tasks of loading the data into HDFS.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.
  • Wrote script files for processing data and loading it into HDFS.
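
A minimal sketch of the Hive partitioning described above. The statements are issued through a Spark HiveContext so the examples in this document stay in one language; in practice they would just as often run in the Hive CLI. Table and column names are assumptions, and the bucketing from the bullet would be a CLUSTERED BY ... INTO n BUCKETS clause on the same DDL.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object HivePartitionSketch {
      def main(args: Array[String]): Unit = {
        val hive = new HiveContext(
          new SparkContext(new SparkConf().setAppName("HivePartitionSketch")))

        // Target table partitioned by day so queries can prune whole directories.
        hive.sql("""CREATE TABLE IF NOT EXISTS web_logs_part (
                      ip STRING, url STRING, status INT)
                    PARTITIONED BY (log_date STRING)
                    STORED AS ORC""")

        // Dynamic-partition insert: Hive routes each row to its log_date partition.
        hive.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
        hive.sql("""INSERT OVERWRITE TABLE web_logs_part PARTITION (log_date)
                    SELECT ip, url, status, log_date FROM web_logs_raw""")
      }
    }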

Environment: Hadoop, MapReduce, HDFS, Pig, Hive, Java (jdk1.7), Flat files, Oracle 11g/10g, PL/SQL, SQL*PLUS, Windows NT, Sqoop.

Confidential, Herndon VA

Hadoop Developer

Responsibilities:

  • Developed solutions to process data into HDFS.
  • Analyzed the data using MapReduce, Pig, and Hive, and produced summary results from Hadoop for downstream systems.
  • Used Pig as an ETL tool to perform transformations, event joins, and some pre-aggregations before storing the data in HDFS.
  • Developed a data pipeline using Flume, Sqoop, and Pig to extract the data from weblogs and store it in HDFS.
  • Used Sqoop to import and export data between HDFS and RDBMS.
  • Created Hive tables and was involved in data loading and writing Hive UDFs.
  • Exported the analyzed data to the relational database MySQL using Sqoop for visualization and report generation.
  • Created HBase tables to load large sets of structured data.
  • Managed and reviewed Hadoop log files.
  • Involved in providing inputs for estimate preparation for the new proposal.
  • Worked extensively with Hive DDL and the Hive Query Language (HQL).
  • Developed UDF, UDAF, and UDTF functions and used them in Hive queries (see the sketch after this list).
  • Implemented Sqoop for large dataset transfers between Hadoop and RDBMSs.
  • Created MapReduce jobs to convert the periodic XML messages into partitioned Avro data.
  • Used Sqoop widely to import data from various systems/sources (like MySQL) into HDFS.
  • Created components like Hive UDFs for functionality missing from Hive for analytics.
  • Developed scripts and batch jobs to schedule Oozie bundles (groups of coordinators covering the various workflows).
  • Used different file formats like text files, SequenceFiles, and Avro.
  • Provided cluster coordination services through ZooKeeper.
  • Assisted in creating and maintaining technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Assisted in cluster maintenance, cluster monitoring, adding and removing cluster nodes, and troubleshooting.
  • Installed and configured Hadoop, MapReduce, and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
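
A minimal sketch of the kind of Hive UDF described above, in Scala for consistency with the other examples here (the UDFs in this role were written in Java); the function name and masking logic are illustrative assumptions.

    import org.apache.hadoop.hive.ql.exec.UDF
    import org.apache.hadoop.io.Text

    // Fills a gap in built-in Hive functions: masks an identifier,
    // keeping only the last four characters visible.
    class MaskAccount extends UDF {
      // Hive resolves evaluate() by reflection, calling it once per row.
      def evaluate(input: Text): Text = {
        if (input == null) return null
        val s = input.toString
        val masked = if (s.length <= 4) s else ("*" * (s.length - 4)) + s.takeRight(4)
        new Text(masked)
      }
    }

    // In Hive: ADD JAR udfs.jar;
    //          CREATE TEMPORARY FUNCTION mask_account AS 'MaskAccount';
    //          SELECT mask_account(acct_no) FROM accounts;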

Environment: Hadoop, HDFS, Map Reduce, Hive, Pig, Sqoop, HBase, Shell Scripting, Oozie, Oracle 11g.

Confidential

Java Developer

Responsibilities:

  • Implemented various J2EE standards and an MVC framework, using Struts, JSP, AJAX, and servlets for UI design.
  • Used SOAP/REST for data exchange between the backend and the user interface.
  • Utilized Java and MySQL day to day to debug and fix issues with client processes.
  • Developed, tested, and implemented a financial-services application to bring multiple clients into a standard database format.
  • Assisted in designing, building, and maintaining a database to analyze the life cycle of checking and debit transactions.
  • Created web service components using SOAP, XML, and WSDL to receive XML messages and apply business logic.
  • Involved in configuring WebSphere variables, queues, data sources, and servers, and in deploying EARs onto servers.
  • Involved in developing the business logic using Plain Old Java Objects (POJOs) and session EJBs.
  • Developed authentication through LDAP using JNDI (see the sketch after this list).
  • Developed and debugged the application using the Eclipse IDE.
  • Involved in Hibernate mappings, configuration property setup, creating sessions and transactions, and second-level cache setup.
  • Involved in backing up databases, creating dump files, and creating DB schemas from dump files; wrote and executed developer test cases and prepared the corresponding scope and traceability matrix.
  • Used JUnit and JAD for debugging and to develop test cases for all the modules.
  • Hands-on experience with Sun ONE Application Server, WebLogic Application Server, WebSphere Application Server, WebSphere Portal Server, and J2EE application deployment technology.
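
A minimal sketch of LDAP authentication through JNDI as described above, in Scala for consistency with the other examples (this role's code was Java); the directory URL and DN layout are assumptions for illustration.

    import java.util.Hashtable
    import javax.naming.{Context, NamingException}
    import javax.naming.directory.InitialDirContext

    object LdapAuth {
      // Returns true when the directory accepts a simple bind for the user.
      def authenticate(uid: String, password: String): Boolean = {
        val env = new Hashtable[String, String]()
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory")
        env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389")       // assumed host
        env.put(Context.SECURITY_AUTHENTICATION, "simple")
        env.put(Context.SECURITY_PRINCIPAL, s"uid=$uid,ou=people,dc=example,dc=com")
        env.put(Context.SECURITY_CREDENTIALS, password)
        try {
          new InitialDirContext(env).close()  // successful bind == valid credentials
          true
        } catch {
          case _: NamingException => false    // bad credentials or directory error
        }
      }
    }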

Environment: Java (multithreading), JDBC, Hibernate, Struts, Collections, Maven, Subversion, JUnit, SQL, JSP, SOAP, Servlets, Spring, Oracle, XML, PuTTY, and Eclipse.

Confidential

Java Developer

Responsibilities:

  • Involved in the analysis and design phases of the Software Development Life Cycle (SDLC).
  • Used JMS to pass messages as payloads to track statuses, milestones, and states in the workflows.
  • Involved in reading and generating PDF documents using iText, and in merging PDFs dynamically.
  • Involved in the software development life cycle: coding, testing, and implementation.
  • Worked in the health-care domain.
  • Involved in using the Java Message Service (JMS) for loosely coupled, reliable, and asynchronous exchange of patient treatment information among J2EE components and legacy systems.
  • Developed MDBs using JMS to exchange messages between different applications using MQ Series (see the sketch after this list).
  • Involved in working with J2EE design patterns (Singleton, Factory, DAO, and Business Delegate) and Model-View-Controller architecture with JSF and Spring DI.
  • Involved in content management using XML.
  • Developed a standalone module transforming XML 837 messages to the database using a SAX parser.
  • Installed, configured, and administered WebSphere ESB v6.x.
  • Worked on performance tuning of WebSphere ESB in different environments on different platforms.
  • Configured and implemented web services specifications in collaboration with the offshore team.
  • Involved in creating dashboard charts (business charts) using FusionCharts.
  • Involved in creating reports for most of the business criteria.
  • Involved in the configuration of WebLogic servers, data sources, and JMS queues, and in deployment.
  • Involved in creating queues, MDBs, and workers to accommodate the messaging that tracks the workflows.
  • Created Hibernate mapping files, sessions, transactions, and Query and Criteria objects to fetch the data from the DB.
  • Enhanced the design of an application by utilizing SOA.
  • Generated unit test cases with the help of internal tools.
  • Used JNDI for connection pooling.
  • Developed Ant scripts to build and deploy projects onto the application server.
  • Involved in implementing CruiseControl as a continuous build tool using Ant.
  • Used StarTeam for version control.
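
A minimal sketch of the JMS workflow-status messaging described above, in Scala for consistency with the other examples (this role's code was Java). It uses the standard JMS 1.1 API; the JNDI names are assumptions for illustration.

    import javax.jms.{ConnectionFactory, Queue, Session}
    import javax.naming.InitialContext

    object WorkflowStatusSender {
      // Publishes a status payload to the tracking queue.
      def send(status: String): Unit = {
        val jndi = new InitialContext()
        val cf = jndi.lookup("jms/ConnectionFactory").asInstanceOf[ConnectionFactory]
        val queue = jndi.lookup("jms/WorkflowStatusQueue").asInstanceOf[Queue]

        val conn = cf.createConnection()
        try {
          val session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE)
          val producer = session.createProducer(queue)
          producer.send(session.createTextMessage(status))  // payload carries milestone/state
        } finally {
          conn.close()  // closing the connection closes its sessions and producers
        }
      }
    }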

Environment: Java/J2EE, HTML, JS, AJAX, Servlets, JSP, XML, XSLT, XPath, XQuery, WSDL, SOAP, REST, JAX-RS, Jersey, JAX-WS, WebLogic Server 10.3.3, JMS, iText, Eclipse, JUnit, StarTeam, JNDI, Spring framework (DI, AOP, Batch), Hibernate.

Confidential

Jr. Java Developer

Responsibilities:

  • Involved in the requirement analysis, design, and development of the new NCP project.
  • Involved in the design and estimation of the various templates and components, which were developed using Day CMS (Communique).
  • The CMS and server-side interaction was developed using web services and exposed to the CMS using JSON and jQuery.
  • Designed and developed a Struts-like MVC 2 web framework using the front-controller design pattern, which is used successfully in a number of production systems.
  • Worked on the JavaMail API; involved in developing a utility class to consume messages from the message queue and send emails to customers.
  • Normalized the Oracle database, conforming to design concepts and best practices.
  • Used the JUnit framework for unit testing and Log4j to capture runtime exception logs.
  • Performed dependency injection using the Spring framework, integrated with the Hibernate and Struts frameworks.
  • Hands-on experience creating shell and Perl scripts for project maintenance and software migration; developed custom tags to simplify JSP applications.
  • Applied design patterns and OO design concepts to improve the existing Java/JEE code base.
  • Identified and fixed transactional issues due to incorrect exception handling and concurrency issues due to unsynchronized blocks of code.
  • Used the Struts Validator framework for client-side and server-side validation.
  • The UI was designed using JSP, Velocity templates, JavaScript, CSS, jQuery, and JSON.
  • Enhanced the FAS system using Struts MVC and iBATIS.
  • Involved in developing web services using Apache XFire, integrated with action mappings.
  • Developed Velocity templates for the various interactive user forms that trigger email to an alias; such forms greatly reduced the amount of manual work involved and were highly appreciated.
  • Used internationalization, localization, Tiles, and tag libraries to accommodate different locales.
  • Used JAXP for parsing and JAXB for binding (see the sketch after this list).
  • Coordinated application testing with the help of the testing team.
  • Involved in writing services implementing the core logic for business processes.
  • Involved in writing database queries, stored procedures, functions, etc.
  • Deployed EJB components on WebLogic and used the JDBC API to interact with the Oracle DB.
  • Involved in transformations using XSLT to prepare HTML pages from XML files.
  • Enhanced Ant scripts to build and deploy applications.
  • Involved in unit testing and code review for the various enhancements.
  • Followed coding guidelines while developing workflows.
  • Effectively managed quality deliverables to meet deadlines.
  • Involved in end-to-end implementation of the application.
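
A minimal sketch of JAXP-based SAX parsing as described above, in Scala for consistency with the other examples (this role's code was Java); the element name and file path are assumptions for illustration.

    import javax.xml.parsers.SAXParserFactory
    import org.xml.sax.Attributes
    import org.xml.sax.helpers.DefaultHandler

    // Streams the document and counts elements of interest; SAX never
    // builds a full DOM, so memory stays flat on large files.
    class ClaimHandler extends DefaultHandler {
      var claims = 0
      override def startElement(uri: String, localName: String,
                                qName: String, attrs: Attributes): Unit =
        if (qName == "claim") claims += 1
    }

    object ParseWithJaxp {
      def main(args: Array[String]): Unit = {
        val parser = SAXParserFactory.newInstance().newSAXParser()
        val handler = new ClaimHandler
        parser.parse(new java.io.File("claims.xml"), handler)
        println(s"claims seen: ${handler.claims}")
      }
    }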

Environment: Java 1.4, J2EE (EJB, JSP/Servlets, JDBC, XML), Day CMS, XML, MyEclipse, Tomcat, Resin, Struts, iBATIS, WebLogic App Server, DTD, XSD, XSLT, Ant, SVN.
