
Sr. Java/Hadoop Developer Resume


Hartford, CT

SUMMARY:

  • 9+ years of experience in the SDLC, with key emphasis on Big Data and Java technologies: Spark, Scala, Spark MLlib, Hadoop, Tableau, Cassandra, Java, and J2EE.
  • Architected, designed, and developed a Big Data solutions practice, including setting up the Big Data roadmap and building the supporting infrastructure and team.
  • Architected, designed, and modeled DI (Data Integrity) platforms using Sqoop, Flume, Kafka, Spark Streaming, Spark MLlib, and Cassandra.
  • Strong experience in migrating data warehouses and databases into Hadoop/NoSQL platforms.
  • Strong expertise in Amazon AWS services, including EC2, DynamoDB, S3, and Kinesis.
  • Expertise in data analysis, design, and modeling using tools like ERwin.
  • Expertise in Big Data architectures, including Hadoop distributions (Azure, Hortonworks, Cloudera), MongoDB, and NoSQL.
  • Expertise in Service-Oriented Architectures (SOA, web services) using Apache Axis, WebLogic, JBoss, and the EJB web service framework.
  • Used Mule ESB in application design as middleware between third-party systems and the customer-side system.
  • Hands-on experience with Hadoop/Big Data technologies for storage, querying, processing, and analysis of data.
  • Expertise in architecting Big Data solutions covering data ingestion and data storage.
  • Strong experience in front-end technologies like JSP, HTML5, jQuery, JavaScript, and CSS3.
  • Worked on Windows Server Active Directory (AD) configuration and the Kerberos protocol.
  • Experienced with Perl, shell scripting, and test automation tools like Selenium RC, WebDriver, and Selenium Grid.
  • Developed Python mapper and reducer scripts and implemented them using Hadoop Streaming.
  • Experienced in customizing the Selenium API to suit the testing environment.
  • Integrated Mule ESB systems utilizing MQ Series, HTTP, file system, and SFTP transports.
  • Solid knowledge of MySQL and Oracle databases and of writing SQL queries.
  • Proficient in developing applications using JSF, Hibernate, Core Java, JDBC, and Groovy/Grails, with presentation-layer components built using JSPs, JavaScript, XML, and HTML; also worked with Cassandra, Cucumber, OLE, continuous deployment, AngularJS, web services, REST, GemFire, RabbitMQ, and Spring Boot.
  • Experience in back-end development, including web services and data service layers, with service desk experience.
  • Designed and coded Hibernate and Struts mappings, configurations, and HQL for enhancements and new-module development of a Transport Optimization, Planning, and Scheduling web app.
  • Used Groovy and Grails with Spring, Java, and J2EE for the user interface.
  • Initiated the automation framework using Selenium WebDriver to run test cases in multiple browsers and platforms (a minimal sketch follows this summary).
  • Highly motivated software engineer with experience in developing web applications using JavaScript, Backbone.js, and CoffeeScript.
  • Utilized integration patterns, integration tools, EAI, transformations, XML Schemas, and XSLT.
  • Used the Quartz connector to schedule batch jobs.
  • Architected integrations using MuleSoft ESB for both on-premise and CloudHub environments.
  • Experience in developing interfaces between Salesforce and Oracle ERP using Informatica Cloud/Mule ESB technologies.
  • Implemented flows for Salesforce outbound/inbound calls and business processes.
  • Experience with the MuleSoft Anypoint API platform for designing and implementing Mule APIs.
  • Good knowledge of the SoapUI tool for unit testing SOA-based applications.
  • Ability to understand and use design patterns in application development.
  • Very good knowledge in different development methodologies like SDLC and Agile.
  • Experienced in developing applications using Hibernate (an object/relational mapping framework), including work on a service desk client.
  • Experienced in developing web services using JAX-RPC, JAXP, SOAP, and WSDL; also knowledgeable in the WSIF (Web Services Invocation Framework) API.
  • Experience in writing database objects like stored procedures, triggers, SQL, PL/SQL packages, and cursors for Oracle, SQL Server, DB2, and Sybase.
  • Java 8, J2EE, Spring (MVC, Data-JPA, Security), Hibernate, Jenkins or Bamboo, HTML5, JSP, JavaScript, jQuery, Ajax, AngularJS.
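
The Selenium WebDriver framework work mentioned above can be illustrated with a minimal, hypothetical sketch; the URL, element locators, and expected page title are placeholders, not details from any actual engagement.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class LoginSmokeTest {
    public static void main(String[] args) {
        // Placeholder URL and locators; a real framework would read these from config.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com/login");
            driver.findElement(By.id("username")).sendKeys("testuser");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("submit")).click();
            // Simple assertion on the landing-page title.
            if (!driver.getTitle().contains("Dashboard")) {
                throw new AssertionError("Login did not reach the dashboard");
            }
        } finally {
            driver.quit();
        }
    }
}
```

The same test class would run against other browsers or a Selenium Grid simply by swapping the WebDriver implementation.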

TECHNICAL SKILLS:

Big Data: Hadoop, HDFS, MapReduce, Hive, Pig, Spark, Oozie, Sqoop, Flume, Storm, Kafka, Elasticsearch, ZooKeeper.

J2EE Technologies: Servlets, JSP, JDBC, JNDI, OSGi, EJB, RMI, ASP.

Programming Languages: Java 8, C, C++, Pig Latin, HQL, R, Python, XPath, Spark.

Frameworks: Jakarta Struts, Spring, Spring MVC, JSF (JavaServer Faces), Hibernate, Tiles, iBatis, Validator, Cucumber, OLE and continuous deployment, microservices, Groovy.

Web Technologies: HTML, DHTML, Cassandra, API, AngularJS along with web services, REST, GemFire, RabbitMQ, JavaScript with jQuery, Python, Ext JS, AJAX, CSS, CMS, Yahoo UI, ICEfaces API, Angular, Node.js, Backbone.js.

XML Technologies: XML 1.0, XSLT, XSL, HTML5, DHTML, jQuery, XSL/XSLT/XSL-FO, JNDI, LDAP, SOAP, AXIS 2.

Application/Web Servers: IBM WebSphere 5.x/6.0/7.0/8.0, IBM HTTP Server 8.x, WebLogic 7.x/8.x/9.0, WebLogic Portal 5.x, JBoss 4.0, jBPM, Apache Tomcat, OC4J, Docker.

NoSQL Databases: Cassandra, MongoDB.

Databases: Oracle 12c/11g/10g, SQL Server, MySQL, DB2.

Messaging Systems: JMS, IBM MQSeries.

IDE Tools: IBM WebSphere Studio Application Developer (WSAD), RSA, RAD, Eclipse/RCP, JDeveloper, NetBeans.

PROFESSIONAL EXPERIENCE:

Confidential, Hartford, CT

Sr. Java/Hadoop Developer

Responsibilities:

  • Implemented the Big Data ecosystem (Hive, Impala, Sqoop, Flume, Spark, Lambda) with a cloud architecture.
  • Used Talend for Big Data integration with Spark and Hadoop.
  • Used Microsoft Windows Server and authenticated client-server relationships via the Kerberos protocol.
  • Experience in BI reporting with AtScale OLAP for Big Data.
  • Implemented solutions for ingesting data from various sources and processing the data at rest utilizing Big Data technologies such as Hadoop, MapReduce frameworks, HBase, and Hive.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data using Hadoop/Big Data concepts.
  • Working experience with the middleware integration product MuleSoft.
  • Designed and developed a real-time stream processing application using Spark, Kafka, Scala, and Hive to perform streaming ETL and apply machine learning (a minimal sketch follows this list).
  • Identified query duplication, complexity, and dependencies to minimize migration effort.
  • Technology stack: Oracle, Hortonworks HDP cluster, Attunity Visibility, Cloudera Navigator Optimizer, AWS Cloud, and DynamoDB.
  • Experience in AWS, implementing solutions using services such as EC2, S3, RDS, Redshift, and VPC.
  • Worked with Talend for performing fast integration tasks.
  • Worked as a Hadoop consultant (MapReduce/Pig/Hive/Sqoop).
  • Worked with Spark and Python.
  • Worked with Apache Hadoop ecosystem components like HDFS, Hive, Sqoop, Pig, and MapReduce.
  • Led architecture and design of data processing, warehousing, and analytics initiatives.
  • Worked with AWS to implement client-side encryption, as DynamoDB did not support encryption at rest at the time.
  • Explored Spark for improving the performance and optimization of existing algorithms in Hadoop, using SparkContext, Spark SQL, DataFrames, pair RDDs, and Spark on YARN.
  • Used the DataFrame API in Scala to convert distributed collections of data into named-column form.
  • Performed data profiling and transformation on the raw data using Pig, Python, and Java.
  • Experienced with batch processing of data sources using Apache Spark.
  • Developed predictive analytics using the Apache Spark Scala API.
  • Performed big data analysis using Pig and user-defined functions (UDFs).
  • Created Hive external tables, loaded data into them, and queried the data using HQL (see the external-table sketch after this list).
  • Used Sqoop to efficiently transfer data between databases and HDFS and used Flume to stream the log data from servers.
  • Involved in developing and configuring enterprise components using MuleSoft ESB.
  • Implemented an enterprise-grade platform (MarkLogic) for ETL from mainframe to NoSQL (Cassandra).
  • Responsible for importing log files from various sources into HDFS using Flume
  • Worked with the tools Flume, Storm, and Spark.
  • Expert in writing business-analytics scripts using Hive SQL.
  • Applied best practices for designing integration modules using ESB and Data Integrator modules.
  • Implemented continuous integration and deployment (CI/CD) for Hadoop jobs through Jenkins.
  • Wrote Hadoop jobs for analyzing data using Hive and Pig, accessing text-format files, sequence files, and Parquet files.
  • Experience with different Hadoop distributions, including Cloudera (CDH3 and CDH4), Hortonworks Distribution (HDP), and MapR.
  • Experience in integrating Oozie logs into a Kibana dashboard.
  • Extracted data from MySQL and AWS Redshift into HDFS using Sqoop.
  • Developed Spark code using Scala and Spark SQL for faster testing and data processing.
  • Imported millions of rows of structured data from relational databases using Sqoop, processed the data with Spark, and stored it in HDFS in CSV format.
  • Developed a Spark Streaming application to pull data from the cloud into Hive tables.
  • Used Spark SQL to process large volumes of structured data.
  • Assigned names to each of the columns using case classes in Scala.
  • Implemented Spark GraphX application to analyze guest behavior for data science segments.
  • Enhanced the traditional data warehouse based on a STAR schema, updated data models, and performed data analytics and reporting using Tableau.
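
As a minimal illustration of the real-time streaming ETL bullet above, the pipeline could be sketched with the Spark Structured Streaming Java API (the original work was described as Scala). The broker address, topic, checkpoint path, and Hive table name are placeholder assumptions.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class StreamingEtlJob {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("streaming-etl")
                .enableHiveSupport()          // lets us write into Hive-managed tables
                .getOrCreate();

        // Read the raw event stream from Kafka (broker and topic are placeholders).
        Dataset<Row> raw = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker1:9092")
                .option("subscribe", "events")
                .load();

        // Minimal transform step: keep the message payload as a string column.
        Dataset<Row> events = raw.selectExpr("CAST(value AS STRING) AS payload",
                                             "timestamp");

        // Micro-batch sink: append each batch into a pre-existing Hive table.
        StreamingQuery query = events.writeStream()
                .option("checkpointLocation", "/tmp/etl-checkpoint")
                .foreachBatch((batch, batchId) ->
                        batch.write().mode("append").insertInto("analytics.events"))
                .start();

        query.awaitTermination();
    }
}
```

Using foreachBatch keeps the sink logic in ordinary batch code, which is a common way to append micro-batches into a Hive table.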
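
Likewise, the Hive external-table bullet could look like the following minimal sketch via Spark SQL; the table name, schema, and HDFS location are illustrative assumptions.

```java
import org.apache.spark.sql.SparkSession;

public class ExternalTableSetup {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("hive-external-table")
                .enableHiveSupport()
                .getOrCreate();

        // External table: Hive tracks the schema, but the files stay in HDFS,
        // so dropping the table does not delete the underlying data.
        spark.sql("CREATE EXTERNAL TABLE IF NOT EXISTS web_logs ("
                + "  ip STRING, ts STRING, url STRING, status INT)"
                + " ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'"
                + " LOCATION 'hdfs:///data/raw/web_logs'");

        // Query it with plain HQL.
        spark.sql("SELECT status, COUNT(*) AS hits FROM web_logs GROUP BY status")
             .show();
    }
}
```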

Environment: Big Data, Spark, YARN, Hive, Pig, Scala, Python, Hadoop, AWS, DynamoDB, Kibana, Cloudera, EMR, JDBC, Redshift, NoSQL, Sqoop, MySQL.

Confidential, Fort Worth, TX

Sr. Java/ Hadoop Developer

Responsibilities:

  • Involved in Big Data project implementation and support.
  • Involved in the coding and integration of several business-critical modules of the CARE application using Spring, Hibernate, and REST web services on WebSphere application server.
  • Implemented installation and configuration of a multi-node cluster on the cloud using AWS EC2.
  • Designed and developed Enterprise Eligibility business objects and domain objects with Object Relational Mapping framework such as Hibernate.
  • Used Hive to analyze data ingested into HBase via Hive-HBase integration and computed various metrics for reporting on the dashboard.
  • Developed the web-based Rich Internet Application (RIA) using Java/J2EE (Spring framework).
  • Used the lightweight container of the Spring Framework to provide architectural flexibility through inversion of control (IoC).
  • Utilized Oozie workflows to run Pig and Hive jobs; extracted files from MongoDB through Sqoop, placed them in HDFS, and processed them.
  • Used Flume to collect, aggregate, and store web log data from different sources like web servers and mobile and network devices, and pushed it to HDFS.
  • Involved in the end-to-end implementation of the Big Data design.
  • Developed and implemented new UIs using AngularJS and HTML.
  • Developed Spring configuration for dependency injection using Spring IoC and Spring controllers.
  • Loaded all data from our relational DBs into Hive using Sqoop; we received four flat files from different vendors, all in different formats, e.g. text, EDI, and XML.
  • The objective of this project was to build a data lake as a cloud-based solution in AWS using Apache Spark and to provide visualization of the ETL orchestration using the CDAP tool.
  • Built a proof of concept to determine feasibility and evaluate Big Data products.
  • Wrote Hive join queries to fetch information from multiple tables and wrote multiple MapReduce jobs to collect output from Hive.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting on the dashboard.
  • Worked in AWS Cloud and on-premise environments with infrastructure provisioning and configuration.
  • Wrote Perl scripts covering data feed handling and implementing MarkLogic, communicating with web services through the SOAP Lite module and WSDL.
  • Involved in developing the MapReduce framework, writing queries, and scheduling MapReduce jobs.
  • Developed code for importing and exporting data into HDFS and Hive using Sqoop.
  • Installed and configured Hadoop and responsible for maintaining cluster and managing and reviewing Hadoop log files.
  • Developed shell, Perl, and Python scripts to automate and provide control flow to Pig scripts.
  • Designed the Redshift data model and performed Redshift performance improvements and analysis.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Worked on configuring and managing disaster recovery and backup for Cassandra data.
  • Performed File system management and monitoring on Hadoop log files.
  • Implemented partitioning, dynamic partitions, and buckets in Hive (see the dynamic-partition sketch after this list).
  • Developed customized classes for serialization and deserialization in Hadoop (see the Writable sketch after this list).
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
  • Implemented a proof of concept deploying this product in Amazon Web Services (AWS).
  • Involved in migrating data from the existing RDBMSs (Oracle and SQL Server) to Hadoop using Sqoop for processing.
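
As a sketch of the Hive dynamic-partitioning bullet above: the following minimal Java snippet uses the Hive JDBC driver, assuming hypothetical sales and staging_sales tables and a placeholder HiveServer2 host.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class DynamicPartitionLoad {
    public static void main(String[] args) throws Exception {
        // Host, database, and table names are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hiveserver:10000/default", "etl_user", "");
             Statement stmt = conn.createStatement()) {

            // Allow Hive to derive partitions from the data itself.
            stmt.execute("SET hive.exec.dynamic.partition = true");
            stmt.execute("SET hive.exec.dynamic.partition.mode = nonstrict");

            // The partition column (dt) comes last in the SELECT list;
            // each distinct dt value becomes its own partition directory.
            stmt.execute("INSERT INTO TABLE sales PARTITION (dt) "
                       + "SELECT id, amount, dt FROM staging_sales");
        }
    }
}
```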
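
And a custom Hadoop serialization class, as mentioned above, typically implements the Writable contract; the field names here are illustrative, not from the original project.

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

// A custom value type Hadoop can serialize between map and reduce phases.
public class ClickEventWritable implements Writable {
    private long userId;
    private String page;

    public ClickEventWritable() { }                 // required no-arg constructor

    public ClickEventWritable(long userId, String page) {
        this.userId = userId;
        this.page = page;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeLong(userId);                      // serialization order must match
        out.writeUTF(page);                         // readFields exactly
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        userId = in.readLong();
        page = in.readUTF();
    }

    public long getUserId() { return userId; }
    public String getPage() { return page; }
}
```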

Environment: Pig, Sqoop, Kafka, Apache Cassandra, Oozie, Impala, Cloudera, AWS, AWS EMR, Redshift, Flume, Apache Hadoop, HDFS, Hive, MapReduce, ZooKeeper, MySQL, Eclipse, DynamoDB, PL/SQL, and Python.

Confidential, NYC, NY

JAVA/J2EE DEVELOPER

Responsibilities:

  • Used WebLogic to build and deploy the application.
  • Created stubs to consume Web services.
  • Automated tests were coded in JavaScript with Froglogic's Squish or SmartBear's TestComplete for client applications, and in Java with Selenium for web application testing.
  • Developed an automation testing process using Selenium and QTP, which involved studying client testing requirements, analyzing feasible testing strategies, developing automated test scripts, testing them, and finally deploying the scripts.
  • Used the Spring framework to achieve loose coupling between the layers, moving toward a Service-Oriented Architecture (SOA) exposed through RESTful services.
  • Involved in performing unit and integration testing (JUnit).
  • Involved in building EJB session/entity beans to maintain transaction management across the application.
  • Built web pages that are more user-interactive using JavaScript and AngularJS.
  • Groovy allows using primitive types as a short form for variable declarations, and the compiler translates these into objects.
  • Extensively used Spring JDBC in the data access layer to access and update information in the database.
  • Developed web services to create a reports module and send it to different agencies, and premium calculation for manual classes, using SOAP and RESTful web services and RichFaces components.
  • Involved in writing Spring MVC controllers and custom validations (a minimal controller sketch follows this list).
  • Worked on the Struts framework for developing the front-end application, with Spring as the middle tier for the entire application.
  • Used JAX-WS (SOAP) and JAX-RS (REST) to produce web services and wrote programs to consume them (a minimal JAX-RS sketch follows this list).
  • Java 8, J2EE, Spring (MVC, Data-JPA, Security), Hibernate, Jenkins or Bamboo, HTML5, JSP, JavaScript, jQuery, Ajax, AngularJS.
  • Involved in working with Struts Tiles for a common look and feel across the web application.
  • Worked on web services using the Java API for XML Web Services (JAX-WS), and supported, built, and deployed web API services.
  • Worked as part of a team through business transfer, development, testing, code review, build implementation, and support.
  • Wrote PL/SQL statements as needed using an Oracle 10g database.
  • Worked on an internal web-based client-server application built with the Struts 2 framework and an Oracle backend database, establishing relations among the different beans using Hibernate 3.1.
  • Involved in writing various components using the Spring AOP and IoC framework.
  • Involved in writing JSP and JSF components; used the JSTL tag library (Core, Logic, Nested, Beans, and HTML taglibs) to create standard dynamic web pages.
  • Developed connections to the backend using JDBC after building the entity beans as bean-managed persistence (BMP) entity beans.
  • Designed and developed the UI framework using Spring MVC and AngularJS.
  • Creation of REST Web Services for the management of data using Apache CXF and Docker.
  • Implemented EJBs as the entry point for web services; effectively prepared for and organized technical inspections to review code and design models with peers and software architects.
  • Identified the defects through Selenium and ensured that business processes deliver the expected results and remain reliable throughout the production release.
  • Spring 3.x was used as the framework to write the application code and RESTful web services for external clients.
  • Designed and developed backend application servers using Python.
  • Managed application deployment using Python.
  • Upgraded Python 2.3 to Python 2.5, which required recompiling mod_python against Python 2.5.
  • Enhanced the user experience by designing new web features using MVC frameworks like Backbone.js and Node.js.
  • Used JDBC connectivity to connect to the Oracle 8.0 database.
  • Developed major websites and services using MongoDB as the backend software.
  • Good experience in creating and consuming Restful and SOAP Web Services.
  • Developed the ability to move and consolidate critical business information and financial account data, using EJB 2.1 and Hibernate for database transactions.
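
The Spring MVC controller and custom-validation work above might look like this minimal sketch; PolicyForm, the request path, and the validation rule are hypothetical.

```java
import javax.validation.Valid;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.validation.BindingResult;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

@Controller
@RequestMapping("/policies")
public class PolicyController {

    @RequestMapping(method = RequestMethod.POST)
    public String create(@Valid @ModelAttribute("policy") PolicyForm form,
                         BindingResult result, Model model) {
        // Custom validation beyond annotations: a placeholder business rule.
        if (form.getPremium() != null && form.getPremium() < 0) {
            result.rejectValue("premium", "premium.negative",
                               "Premium must not be negative");
        }
        if (result.hasErrors()) {
            return "policies/create";   // back to the form view with errors
        }
        model.addAttribute("saved", form);
        return "redirect:/policies";
    }
}

// Hypothetical form-backing bean.
class PolicyForm {
    private Integer premium;
    public Integer getPremium() { return premium; }
    public void setPremium(Integer premium) { this.premium = premium; }
}
```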
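
And a JAX-RS (REST) endpoint of the kind referenced above could be sketched as follows; the resource path and JSON payload are placeholders.

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Hypothetical REST resource; in a real service this would call the reports module.
@Path("/reports")
public class ReportResource {

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getReport(@PathParam("id") String id) {
        String json = "{\"id\": \"" + id + "\", \"status\": \"READY\"}";
        return Response.ok(json).build();
    }
}
```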

Environment: Java, Struts Framework, jQuery, Oracle, HTML, MarkLogic, microservices, Python, Groovy, PL/SQL, JDBC, Talend, Hibernate, Ant, WSDL, EJB.

Confidential

JAVA DEVELOPER

Responsibilities:

  • Involved in various phases of the Software Development Life Cycle (SDLC) of the application, including requirement gathering, design, analysis, and code development.
  • Developed Hibernate mappings using the DB model.
  • Involved in designing and developing customized tags using the JSP tag library.
  • Implemented Model-View-Controller (MVC) architecture using the Struts and Spring frameworks.
  • Developed a browser-based JavaServer Faces front end to an AS/400 system.
  • Used Ajax to provide dynamic features where applicable.
  • Implemented RESTful web services to communicate with components of other sourcing systems within the firm and to provide data to the reporting team.
  • Used the MVC pattern for GUI development in JSF and worked closely with the JSF lifecycle; Servlets and JSPs were used for real-time reporting too complex to be handled by Business Objects.
  • Used Jira for bug tracking and project management.
  • Prepared user documentation with screenshots for UAT (User Acceptance testing).
  • Implemented the Struts Validation Framework for server-side validation.
  • Developed JSPs with custom tag libraries for control of business processes in the middle tier and was involved in their integration.
  • Developed web services (SOAP) through WSDL in Apache Axis to interact with other components.
  • Implemented EJB session beans for business logic.
  • Used parsers like SAX and DOM for parsing XML documents and performed XML transformations using XSLT (a minimal sketch follows this list).
  • Wrote stored procedures, triggers, and cursors using Oracle PL/SQL.
  • Used Rational ClearCase for version control.
  • Implemented Java/J2EE design patterns like Business Delegate, Data Transfer Object (DTO), Data Access Object, and Service Locator.
  • Interacted with clients to understand their needs and proposed designs to the team to implement the requirements.
  • Built an online system using XML, JavaScript, AJAX, Struts 2.0, and JDBC.
  • Involved in technical documentation for the module.
  • Designed and created the SQL Server database and stored procedures.
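
As a minimal illustration of the DOM-plus-XSLT bullet above: the input file, stylesheet, and output names are placeholder assumptions.

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.w3c.dom.Document;

public class XmlTransformDemo {
    public static void main(String[] args) throws Exception {
        // Parse the input document into a DOM tree (file names are placeholders).
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(new File("input.xml"));

        // Apply an XSLT stylesheet to the DOM and write the result out.
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new File("report.xsl")));
        transformer.transform(new DOMSource(doc),
                              new StreamResult(new File("report.html")));
    }
}
```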

Environment: Java, JSP, JDBC, Cassandra, API, Python, jQuery, AngularJS, web services, REST, Spring Core, Struts, Hibernate, design patterns, XML, Oracle, Apache Axis, Ant, JUnit, UML, SOAP, XSLT, Jira.
