
Hadoop/MongoDB Developer Resume


Plano, TX

SUMMARY

  • 7+ years of experience in IT, including 3 years of experience with the Hadoop ecosystem and 4+ years of experience as a Java/J2EE developer across different client environments.
  • Expertise in client/server and application development using Java/J2EE technologies.
  • Good experience in creating and consuming REST web services.
  • Hands-on experience in enterprise data warehouse and data mart development and optimization using ETL tools and PL/SQL, providing a fast and efficient database platform for reporting.
  • Involved in extraction, transformation and loading of data directly from different source systems like flat files, Excel, Oracle and SQL Server.
  • Hands-on experience with Hadoop ecosystem components like Hadoop MapReduce, HDFS, HBase, Hive, Sqoop, Pig, Oozie, Spark, Impala, and Flume.
  • Good knowledge of Amazon AWS concepts like EMR and EC2 web services, which provide fast and efficient processing of Big Data.
  • Well versed with core Java concepts like collections, multithreading and serialization.
  • Experienced in developing Spark scripts for data analysis in both Python and Scala.
  • Hands-on experience with Hortonworks and Cloudera Hadoop environments.
  • Rich work experience in open-source frameworks like Struts 1.2, Struts 2.0, and Spring.
  • Good hands-on experience with NoSQL databases such as HBase, Cassandra, and MongoDB.
  • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
  • Experience with Oracle-supplied packages, dynamic SQL, records, and PL/SQL tables; loaded data into Oracle tables using SQL*Loader.
  • Hands-on experience influencing and guiding customers in implementing Big Data solutions.
  • Strong understanding of Hadoop components such as MapReduce, HDFS, Pig, Hive, Sqoop, Flume, Knox, Storm, Kafka, Oozie, and HBase.
  • Used Oozie to orchestrate MapReduce jobs that extract data in a timely manner.
  • Wrote MapReduce Java programs to analyze log data for large-scale data sets (a minimal sketch follows this list).
  • Good experience in shell scripting, SQL Server, UNIX and Linux, and OpenStack; configured SQL Server on clusters.
  • Implemented different machine learning techniques in Scala using the Spark machine learning library.
  • Experience in design and code reviews of C and C++ code.
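
The following is a minimal sketch of the kind of MapReduce log-analysis program mentioned above: it counts log lines per severity level. The input layout, class names, and paths are illustrative assumptions, not code from the engagements below.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class LogLevelCount {

        // Emits (logLevel, 1) per line; assumes the severity is the first token.
        public static class LevelMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
            private static final LongWritable ONE = new LongWritable(1);
            private final Text level = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\\s+");
                if (fields.length > 0 && !fields[0].isEmpty()) {
                    level.set(fields[0]);
                    ctx.write(level, ONE);
                }
            }
        }

        // Sums the per-level counts; also reused as a combiner, since addition is associative.
        public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
            @Override
            protected void reduce(Text key, Iterable<LongWritable> values, Context ctx)
                    throws IOException, InterruptedException {
                long sum = 0;
                for (LongWritable v : values) {
                    sum += v.get();
                }
                ctx.write(key, new LongWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "log level count");
            job.setJarByClass(LogLevelCount.class);
            job.setMapperClass(LevelMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));    // e.g. an HDFS input directory
            FileOutputFormat.setOutputPath(job, new Path(args[1]));  // e.g. an HDFS output directory
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }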

TECHNICAL SKILLS

Hadoop/Big Data Technologies: HDFS, MapReduce, YARN, HBase, Pig, Hive, Sqoop, Scala, Spark, Kafka, Impala, Ambari, Oozie, ETL, Accumulo, Cassandra, DataStax, Cloudera, Hadoop distributions.

Programming / Scripting Languages: Java, Python, SQL, PL/SQL, Shell Scripting, Pig Latin, JSP & Servlets, JavaScript, XML, HTML

Frameworks: MVC, Spring, Struts, Hibernate, JSF, EJB, JMS

Web Technologies: HTML, XML, Ajax, SOAP, JavaScript, CSS, JSP

Databases: SQL Server, MySQL, Oracle 8i/9i/11g

Database Tools: MS SQL Server, Oracle, MySQL, Splunk, Eclipse, Oracle SQL Developer

Operating Systems: Linux, UNIX, CentOS, Windows XP/Vista/7/8

NoSQL Databases: HBase, Cassandra and MongoDB

Application Servers: Apache Tomcat, JBoss, WebLogic, WebSphere

Methodologies: Scrum, Agile, Waterfall

PROFESSIONAL EXPERIENCE

Hadoop/MongoDB Developer

Confidential, Plano, TX

Responsibilities:

  • Collected metrics for Hadoop clusters using Ambari and used ZooKeeper to provide coordination services to the cluster.
  • Participated in choosing the tech stack (Scala, Akka, Cassandra) for the new microservices.
  • Deep and extensive knowledge of HDFS, Spark, Kafka, MapReduce, Pig, Hive, HBase, Sqoop, Storm, Flume, Oozie, Zookeeper, Cassandra, MongoDB, etc.
  • Used a variety of AWS computing and networking services to meet the needs of applications.
  • Proficient in Spark with Scala: loaded data from local file systems, HDFS, Amazon S3, and relational and NoSQL databases using Spark SQL; imported data into RDDs; and ingested data from a range of sources using Spark Streaming.
  • Implemented a microservices gateway to authenticate requests and generate authorization tokens, using an OAuth 2.0-based JWT authentication and authorization approach.
  • Developed and optimized enterprise data warehouses and data marts using ETL tools and PL/SQL to provide a fast, efficient database platform for reporting.
  • Worked on migrating MapReduce programs into Spark transformations using Scala.
  • Regularly commissioned and decommissioned nodes as disk failures occurred, using Cloudera Manager.
  • Built RESTful data services as microservices for the SVOC Edge Timeline with the Spring Boot and Spring Data frameworks; created Docker images and ran them as containers.
  • Used Spark Streaming in Scala to receive real-time data from Kafka and store the stream to HDFS and to databases such as HBase (see the sketch after this list).
  • Extensive ETL work consisting of data transformation, data sourcing, mapping, conversion, and loading using Informatica.
  • Experience understanding clients' Big Data business requirements and translating them into Hadoop-centric solutions.
  • Experience configuring Hadoop clusters using Cloudera Manager and Apache Ambari.
  • Implemented MapReduce jobs using the Java API, Pig Latin, and HQL.
  • Launched and set up Hadoop-related tools on AWS, including configuring the different Hadoop components.
  • Performed both major and minor upgrades to the existing Cloudera Hadoop cluster.
  • Good knowledge of NoSQL databases and document-oriented database concepts, including MongoDB and HBase.
  • Excellent data analysis skills and ability to translate business logic into mappings using complex transformation logic for ETL processes.
  • Experience with distributed Big Data stores such as HDFS, HBase, Accumulo, MongoDB, and Cassandra.
  • Experience managing RDBMS and NoSQL (MongoDB) databases, as well as Hadoop.
  • Successfully loaded files into Hive and HDFS from MongoDB, Cassandra, and HBase.
  • Built and maintained scalable data pipelines using the Hadoop ecosystem and other open-source tools.
  • Experience in Big Data technologies and Hadoop ecosystem projects like HDFS, Map Reduce, Spark, Hive, NoSQL databases, HBase, Oozie, Sqoop, Pig, Storm, Kafka, Impala, HCatalog, Zookeeper, Flume, and Amazon Web Services.
  • Used Kafka to load data into HDFS and move it into NoSQL databases; populated HDFS and Cassandra with huge volumes of data using Apache Kafka.
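
As referenced above, here is a minimal sketch of a Spark Streaming job that reads from Kafka and lands each micro-batch in HDFS, written against the Java API and the spark-streaming-kafka-0-10 integration. The broker address, topic, group id, and output path are hypothetical.

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    public class KafkaToHdfs {
        public static void main(String[] args) throws InterruptedException {
            SparkConf conf = new SparkConf().setAppName("kafka-to-hdfs");
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(30));

            Map<String, Object> kafkaParams = new HashMap<>();
            kafkaParams.put("bootstrap.servers", "broker1:9092");   // hypothetical broker
            kafkaParams.put("key.deserializer", StringDeserializer.class);
            kafkaParams.put("value.deserializer", StringDeserializer.class);
            kafkaParams.put("group.id", "hdfs-sink");               // illustrative group id

            JavaInputDStream<ConsumerRecord<String, String>> stream =
                KafkaUtils.createDirectStream(
                    jssc,
                    LocationStrategies.PreferConsistent(),
                    ConsumerStrategies.<String, String>Subscribe(
                        Collections.singletonList("events"), kafkaParams));

            // Persist each micro-batch to HDFS as text files; the path is an assumption.
            stream.map(ConsumerRecord::value)
                  .dstream()
                  .saveAsTextFiles("hdfs:///data/raw/events", "txt");

            jssc.start();
            jssc.awaitTermination();
        }
    }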

Environment: Hive, Pig, HDFS, Sqoop, Oozie, NoSQL databases, RDBMS, data warehousing, Big Data, ETL, Impala, HBase, Storm, Flume, Spark, Scala, Kafka, Agile, AWS, Accumulo, Cassandra, MongoDB, Cloudera.

Hadoop Developer/Scala Developer

Confidential, Dallas, TX

Responsibilities:

  • Developed parser and loader MapReduce applications to retrieve data from HDFS and store it in HBase and Hive.
  • Excellent understanding and knowledge of NoSQL databases like HBase and Cassandra.
  • Applied Amazon AWS services like EMR and EC2, which provide fast and efficient processing of Big Data.
  • Performed advanced procedures like text analytics and processing using the in-memory computing capabilities of Spark, in Scala and Python.
  • Experience in performance tuning of mappings, ETL procedures, and processes.
  • Developed and supported the extraction, transformation, and loading (ETL) process for data.
  • Implemented real-time applications using Apache Storm, Trident, Kafka, the Apache Ignite in-memory grid, and Accumulo.
  • Developed Spark code using Scala and Spark-SQL/Streaming for faster testing and processing of data.
  • Excellent data analysis skills; translated business logic into mappings using complex transformation logic for ETL processes.
  • Developed MongoDB embedded documents from Java code using Spring Data MongoDB.
  • Developed MapReduce programs to extract and transform data sets; results were exported back to the RDBMS using Sqoop.
  • Stored and managed data coming from users in a MongoDB database.
  • Extensive exposure to NoSQL database design and development using HBase and Cassandra.
  • Experience with configuration of Hadoop Ecosystem components: Hive, HBase, Pig, Sqoop, Zookeeper, Flume, Storm, and Spark.
  • Hands on NoSQL database experience with HBase, MongoDB and Cassandra.
  • Used Amazon Web Services (AWS) to handle small data sets and migrated data from the cluster into the AWS environment.
  • Extensive experience in working with HDFS, Pig, Hive, Sqoop, Flume, Oozie, MapReduce, Zookeeper, Kafka, Spark and HBase.
  • Experience with Datameer as well as Big Data Hadoop; experienced in NoSQL databases such as HBase and MongoDB.
  • Experience working with Big Data and real-time/near-real-time analytics on Big Data platforms like Hadoop and Spark, using programming languages like Scala and Java.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs, Scala, and Python (see the sketch after this list).
  • Experience importing and exporting data between databases like MySQL and Oracle and HDFS/Hive using Sqoop.
  • Processed HDFS data and created external tables using Hive and developed scripts to ingest data into the tables across the Cluster.
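
As mentioned above, a minimal sketch of running a Hive-style query through Spark SQL, using the Java API with Hive support enabled. The database, table, and column names and the output path are illustrative assumptions.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class HiveQueryToSpark {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                .appName("hive-to-spark")
                .enableHiveSupport()          // reads the existing Hive metastore
                .getOrCreate();

            // A Hive-style aggregation expressed through Spark SQL; the
            // database, table, and column names are hypothetical.
            Dataset<Row> daily = spark.sql(
                "SELECT event_date, COUNT(*) AS events "
                + "FROM logs.web_events GROUP BY event_date");

            // Write the result back to HDFS; the path is an assumption.
            daily.write().mode("overwrite").parquet("hdfs:///data/agg/daily_events");
            spark.stop();
        }
    }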

Environment: Apache Hadoop, HDFS, MapReduce, Hive, Pig, Kafka, Impala, Spark, Scala, Storm, AWS, Big Data, NoSQL databases, Zookeeper, Sqoop, data warehousing, SQL Server, MySQL, RDBMS, Accumulo, Cassandra, MongoDB, Hortonworks.

Java Developer

Confidential

Responsibilities:

  • Involved in development of the Care Plans module, which provides a comprehensive library of problems, goals, and approaches.
  • The module supports tailoring these libraries (adding, deleting, or editing problems, goals, and approaches) and choosing the disciplines used for care plans.
  • Expertise in Core Java concepts like OOP, multithreading/synchronization, the Collections framework, exception handling, etc.
  • Strong experience in application development using various frameworks such as the Jakarta Struts Framework, JavaServer Faces, the Spring Framework, Spring MVC, Hibernate ORM, and jQuery.
  • Hands-on experience with related/complementary open source software platforms and languages (e.g. Java, Linux, UNIX).
  • Experience in developing web services in REST and SOAP on both provider and consumer side.
  • Developed custom web (Internet/Intranet) and distributed applications using Java, HTML, DHTML, CSS, XML, JavaScript, J2EE, and Java EE 5.
  • Worked on Spring modules like Dependency Injection (DI), Aspect-Oriented Programming (AOP), Spring Mail, and Spring JMS.
  • Implemented Service Oriented Architecture (SOA) using JMS for sending and receiving messages while creating web services.
  • Extensively used Hibernate named queries, criteria queries, Hibernate Query Language (HQL), optimistic locking, and caching to process data from the database.
  • Developed a microservices platform to support a multitude of enterprise-level applications; mentored developers on JavaScript best practices and server-side development.
  • Designed HTML pages per the requirements and integrated them with AngularJS controllers.
  • Experience developing JavaScript and microservices applications.
  • Extensively used Core Java, Servlets, JSP, and XML; used Struts 1.2 in the presentation tier and was involved in writing JSP and JSF components.
  • The application was based on MVC, with JSP serving as the presentation layer, Servlets as the controller, and Hibernate in the business layer for access to the Oracle database.
  • Developed the DAO layer for the application using Spring's Hibernate template support; specialized in server-side Java and J2EE applications (a minimal DAO sketch follows this list).
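
A minimal sketch of the Spring HibernateTemplate DAO style mentioned above, using the Spring 3 / Hibernate 3 APIs of that era. The CarePlan entity, its fields, and the query are hypothetical; the DAO assumes a SessionFactory is injected via Spring configuration.

    import java.util.List;
    import org.springframework.orm.hibernate3.support.HibernateDaoSupport;

    // Hypothetical entity, assumed to be mapped via Hibernate annotations or XML.
    class CarePlan {
        private Long id;
        private String discipline;
        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getDiscipline() { return discipline; }
        public void setDiscipline(String discipline) { this.discipline = discipline; }
    }

    public class CarePlanDao extends HibernateDaoSupport {

        public void save(CarePlan plan) {
            getHibernateTemplate().saveOrUpdate(plan);
        }

        @SuppressWarnings("unchecked")
        public List<CarePlan> findByDiscipline(String discipline) {
            // Named-parameter HQL executed through the Spring-injected SessionFactory.
            return (List<CarePlan>) getHibernateTemplate().findByNamedParam(
                "from CarePlan p where p.discipline = :disc", "disc", discipline);
        }
    }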

Environment: J2EE, Java/JDK, JDBC, JSP, Servlets, JavaScript, EJB, JNDI, JavaBeans, XML, XSLT, Hibernate, Spring, Oracle 9i, Eclipse, HTML/DHTML, SVN.

Java Developer

Confidential

Responsibilities:

  • Participated in project planning sessions with business analysts and team members to analyze business IT requirements and translate them into a working model.
  • Expertise in upgrading Oracle databases and applying Oracle PSU and CPU patches.
  • Expertise in Oracle Enterprise Manager (OEM) and setup of Oracle Grid Control.
  • Experience in developing Web Services using SOAP and XML.
  • Experience with Python object-oriented programming and multithreading.
  • Involved in Analysis, Design, and Implementation of software applications using Java, J2EE, XML and XSLT, and Web Service (REST, WSDL).
  • Developed business layer components using Spring and Hibernate, and the GUI using JSP and jQuery.
  • Experienced in developing web services with the Python programming language.
  • Built code using the PPM tool and tested web services through SOAP UI.
  • Coordinated all the interface teams to deliver the project on time.
  • Implemented application-level persistence using Hibernate and Spring.
  • Worked on integrating Python with web development tools and web services.
  • Designed and developed a microservice that reduced the burden of redundant jobs across three products.
  • Used collections in Python for manipulating and looping through different user-defined objects.
  • Experience implementing and upgrading Oracle Applications, Fusion Applications, SOA, WebLogic, and Oracle databases on the Linux platform.
  • Extensive C/C++ programming skills, including multithreading, network programming, RESTful web services, the STL, pthreads, and other libraries.
  • Developed Servlets and JSPs based on the MVC pattern using the Spring Framework (a minimal controller sketch follows this list).
  • Worked in collaboration with different vendors and other application teams. Prepared code review documents and test scripts.
  • Developed and maintained complex outbound notification applications that run on custom architectures, using diverse technologies including Core Java, J2EE, SOAP, XML, JMS, JBoss, and web services.
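
A minimal sketch of a Spring MVC controller of the kind described above; the mapping, request parameter, controller name, and view name are hypothetical.

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RequestParam;

    @Controller
    public class NotificationController {

        @RequestMapping(value = "/notifications", method = RequestMethod.GET)
        public String list(@RequestParam("customerId") String customerId, Model model) {
            // A real implementation would call a service layer; this simply
            // hands the id to the JSP view.
            model.addAttribute("customerId", customerId);
            return "notificationList";   // resolved to e.g. /WEB-INF/jsp/notificationList.jsp
        }
    }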

Environment: Core Java, J2EE, Servlets, JSP, Struts 1.3, Hibernate, web services, SQL, PL/SQL, MySQL, SQL scripting, Linux shell scripting, Eclipse.

Java Developer

Confidential

Responsibilities:

  • Involved in various phases of Software Development Life Cycle (SDLC) of the application like Requirement gathering, Design, Analysis and Code development.
  • Developed a prototype of the application and demonstrated to business users to verify the application functionality.
  • Integrated Spring with the Hibernate framework and created Hibernate annotations for mapping an object-oriented domain model to a traditional relational database.
  • Built a data synchronization application using Java multithreading to help the client maximize profits.
  • Responsible for implementing RESTful web services.
  • Developed Object-Relational (O/R) mapping using Hibernate 3.0 and a Data Access Object (DAO) persistence layer using Hibernate 3.0.
  • Conversant with tools like Eclipse, RAD/WSAD, RSA, Spring Tool Suite, JDeveloper, and IntelliJ.
  • Designed asynchronous messaging using JMS to exchange critical business data and events among J2EE components and legacy systems.
  • Implemented Apache Axis2 web services, including the SOAP and WSDL engine; coded a REST API for the product service using Spring Boot.
  • Implemented the producer/consumer sides of a REST-based web service using JAX-RS annotations, the Jersey implementation, and an HTTP client (a minimal sketch follows this list).
  • Involved in writing the Spring configuration XML file that contains bean declarations and other dependent object declarations.
  • Involved in the design and implementation of various business scenarios in the trading flow using Spring.
  • Used Hibernate as the persistence framework, mapping ORM objects to tables; developed HQL and SQL queries.
  • Developed Enterprise JavaBeans components in Java/J2EE.
  • Developed C++ SOAP web services on UNIX and Linux platforms using gSOAP; experienced in using the C++ Standard Template Library (STL).
  • Experience working with Python, Java, C++, HTML, XML, CSS, JavaScript, jQuery, Bootstrap, JSON, AngularJS, and Node.js.
  • Developed batch jobs for reading and parsing flat and XML files and storing the data in a database.
  • Created and deployed web pages using HTML, JSP, JavaScript and CSS.
  • Expert in creating various PL/SQL stored procedures, views, functions, and temporary tables for data input to Crystal Reports.
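
As noted above, a minimal sketch of both sides of a JAX-RS web service: a resource class on the producer side and the JAX-RS 2.0 client API (as implemented by Jersey) on the consumer side. The paths, base URL, and payload are hypothetical.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.client.Client;
    import javax.ws.rs.client.ClientBuilder;
    import javax.ws.rs.core.MediaType;

    // Producer side: a JAX-RS resource published through Jersey.
    @Path("/products")
    public class ProductResource {

        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_JSON)
        public String getProduct(@PathParam("id") String id) {
            // A real implementation would look the product up; this returns a stub.
            return "{\"id\": \"" + id + "\", \"name\": \"sample\"}";
        }

        // Consumer side: the JAX-RS 2.0 client API.
        public static void main(String[] args) {
            Client client = ClientBuilder.newClient();
            String json = client.target("http://localhost:8080/api")  // hypothetical base URL
                .path("products").path("42")
                .request(MediaType.APPLICATION_JSON)
                .get(String.class);
            System.out.println(json);
            client.close();
        }
    }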

Environment: Core Java, J2EE, Servlets, JSP, Struts 1.3, PVCS, PPM, RAD, DB2, Oracle, HTML, DHTML, CSS, JavaScript, XML, Hibernate, EJB, WebLogic 8.1, SQL Server 2008 R2, CentOS, UNIX, Linux, Windows 7/Vista/XP.
