Sr. Big Data Developer Resume
Long Beach, CA
SUMMARY:
- Experienced Java developer with over 8 years of programming experience, including 4 years of hands-on experience in Big Data environments.
- Strong experience in developing enterprise and web applications on n-tier architectures using Java/J2EE technologies such as Servlets, JSP, Spring, Hibernate, Struts, EJBs, Web Services, XML, JPA, JMS, JNDI and JDBC.
- Experienced with programming languages such as C, C++, XPath, Core Java and JavaScript.
- Extensive experience in building and deploying applications on web/application servers like WebLogic, WebSphere, and Tomcat.
- Expertise in developing presentation-layer components using HTML, CSS, JavaScript, jQuery, XML, JSON, AJAX and D3.
- In-depth experience and good knowledge of the Hadoop framework and ecosystem tools, including HDFS, MapReduce, YARN, Pig, Hive, HBase, Kafka, Storm, Spark, Sqoop, Oozie, and ZooKeeper.
- Expertise in JavaScript, JavaScript MVC patterns, object-oriented JavaScript design patterns and AJAX.
- Proficient in Swing, Core Java, XML (XSLT and Schema), HTML, and JavaScript.
- Experience in working with MapReduce programs, Pig scripts and Hive commands to deliver the best results.
- Worked on Bootstrap, AngularJS, Node.js, Knockout, Ember.js, and the Java Persistence API (JPA).
- Architected, designed, and modeled data integrity (DI) platforms using Sqoop, Flume, Kafka, Spark Streaming, Spark MLlib, and Cassandra.
- Knowledge of NoSQL databases including HBase, MongoDB and Cassandra.
- Expertise in core Java, J2EE, Multithreading, JDBC, Hibernate, Spring, Shell Scripting and proficient in using Java API's for application development.
- Significant experience with Spring Boot, Struts, Hibernate, JavaServer Pages (JSP), and build tools such as Ant and Maven.
- Expert in Amazon EMR, Spark, Kinesis, S3, Boto3, Elastic Beanstalk, ECS, CloudWatch, Lambda, ELB, VPC, ElastiCache, DynamoDB, Redshift, RDS, Athena, Zeppelin & Airflow.
- Strong experience in front-end technologies like JSP, HTML5, jQuery, JavaScript, CSS3.
- Experienced in application development using Java, J2EE, JDBC, Spring, JUnit.
- Expertise in data development on the Hortonworks HDP platform and Hadoop ecosystem tools like Hadoop, HDFS, Spark, Zeppelin, Hive, HBase, Sqoop, Flume, Atlas, Solr, Pig, Falcon, Oozie, Hue, Tez, Apache NiFi, Kafka.
- Good knowledge of Amazon Web Services (AWS) concepts like EMR and EC2, which provide fast and efficient processing for Teradata big data analytics.
- Experienced in collecting log and JSON data into HDFS using Flume and processing the data using Hive/Pig.
- Strong knowledge and experience in Object Oriented Programming using Java.
- Experienced in managing and reviewing the Hadoop log files.
- Developed applications based on Model-View-Controller (MVC).
- Managed the project based on Agile-Scrum Methods.
- Excellent communication and interpersonal skills, which contribute to completing project deliverables well ahead of schedule.
TECHNICAL SKILLS:
Hadoop/Big Data: Hive, Pig, HBase, ZooKeeper, MapReduce, HDFS, Sqoop, Oozie, Kafka, Storm, Flume, Scala.
Java/J2EE Technologies: JDBC, JavaScript, JSP, Servlets, jQuery.
Languages: Java, J2EE, PL/SQL, Pig Latin, HQL, R, Python, XPath, Spark.
Web Technologies: JavaScript, CSS, XSLT, HTML, DHTML, XML, XHTML.
Databases: Oracle 12c/11g/10g, Microsoft Access, MS SQL.
Cloud Platform: AWS, Azure.
Web/Application servers: JBoss, Apache Tomcat 6.0/7.0/8.0.
Frameworks: MVC, Spring, Hibernate, Struts.
NoSQL Databases: MongoDB, Cassandra, DynamoDB.
Operating Systems: UNIX, Ubuntu Linux, Windows, CentOS, Sun Solaris.
Network protocols: TCP/IP fundamentals, LAN and WAN.
PROFESSIONAL EXPERIENCE:
Confidential, Long beach, CA
Sr. Big Data Developer
Responsibilities:
- Working as a Hadoop consultant with Apache Hadoop ecosystem components like HDFS, Hive, Sqoop, Pig, and MapReduce.
- Loaded and transformed large sets of structured, semi-structured and unstructured data using Hadoop/Big Data concepts.
- Querying data using Spark SQL on top of the Spark engine.
- Involved in the complete SDLC of the project, including requirements gathering, design documents, development, testing and production environments. Packaged the Spark development environment into a custom Vagrant box.
- Experience in AWS, implementing solutions using services like EC2 and S3.
- Used the NoSQL database HBase, loading Hive tables into HBase tables through Hive-HBase integration for consumption by the data science team.
- Coded real-time pipelines with Spark Streaming and Apache NiFi to store the data in Hive and HBase.
- Worked on setting up and configuring AWS's EMR Clusters and Used Amazon IAM to grant fine-grained access to AWS resources to users.
- Implemented an enterprise-grade platform (MarkLogic) for ETL from mainframe to NoSQL (Cassandra).
- Working with Struts MVC objects like ActionServlet, controllers, validators, web application context, handler mappings, message resource bundles, and form controllers.
- Set up a Shiny Server on CentOS Linux and deployed reports to it.
- Working on processing big volumes of data using different big data analytic tools including Spark, Hive, Sqoop, Pig, Flume, Apache Kafka, Oozie, HBase and Scala.
- Extensive development experience in different IDEs like Eclipse, NetBeans and IntelliJ.
- Developed a REST API test server with Express Router middleware and MongoDB integration. Strong experience in developing user interfaces with HTML, DHTML, JSTL, XSD, XML and CSS. Worked with scripting languages like JavaScript, AJAX and jQuery.
- Exposed to Agile environment and familiar with tools like JIRA, Confluence.
- Sound knowledge of Agile methodology (Scrum) and Rational tools.
- Led architecture and design of data processing, warehousing and analytics initiatives.
- Used Apache NiFi for ingestion of data from IBM MQ (message queues).
- Identified query duplication, complexity and dependencies to minimize migration effort. Technology stack: Oracle, Hortonworks HDP cluster, Attunity Visibility, Cloudera Navigator Optimizer, AWS Cloud and DynamoDB.
- Developed custom code to read messages off IBM MQ and dump them onto the NiFi queues.
- Created a user interface to search and/or view content within the cluster using SolrCloud.
- Worked on sequence files, RC files, map-side joins, bucketing and partitioning for Hive performance enhancement and storage improvement.
- Developed dynamic JSP pages with Struts. Used built-in/custom Interceptors and Validators of Struts.
- Enabled and configured Hadoop services such as HDFS, YARN, Hive, Ranger, HBase, Kafka, Sqoop, Zeppelin Notebook and Spark/Spark2.
- Developed different kinds of interactive graphs in ER/Studio.
- Extensive experience in Spark Streaming (version 1.5.2) through the core Spark API, running Scala and Java to transform raw data from several data sources into baseline data.
- Creating dashboards in Tableau and in Elasticsearch with Kibana.
- Hands-on expertise in running Spark and Spark SQL.
- Working on the MapR Hadoop platform to implement Big Data solutions using Hive, MapReduce, shell scripting, and Java technologies.
- Using Struts (MVC) for implementation of business model logic.
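The Hive bucketing mentioned above routes each row to a bucket by hashing the clustering key modulo the declared bucket count, which is what enables bucketed map-side joins. A minimal Java sketch of that routing idea (illustrative only; Hive's actual hash functions differ by column type, and the class and method names here are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public class BucketRouter {
    // Number of buckets declared in the table DDL (CLUSTERED BY ... INTO n BUCKETS).
    private final int numBuckets;

    public BucketRouter(int numBuckets) {
        this.numBuckets = numBuckets;
    }

    // Non-negative hash of the key, modulo the bucket count.
    public int bucketFor(String key) {
        return (key.hashCode() & Integer.MAX_VALUE) % numBuckets;
    }

    public static void main(String[] args) {
        BucketRouter router = new BucketRouter(4);
        Map<String, Integer> assignments = new HashMap<>();
        for (String id : new String[]{"cust-1", "cust-2", "cust-3"}) {
            assignments.put(id, router.bucketFor(id));
        }
        System.out.println(assignments);
    }
}
```

Because the assignment is deterministic, two bucketed tables clustered on the same key with the same bucket count can be joined bucket-by-bucket without a full shuffle.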
Environment: JDBC, NoSQL, MongoDB, Alteryx, MapReduce, Spark, YARN, Hive, Pig, Scala, NiFi, IntelliJ, AWS EMR, Python, DynamoDB, Hadoop, MySQL.
Confidential, Malvern, PA
Big Data / Java Developer
Responsibilities:
- Evaluated business requirements and prepared detailed specifications that follow project guidelines required to develop written programs.
- Created schemas, views and tables using Hive and Impala. Also worked on data modeling and schema design for NoSQL (HBase) and Impala tables.
- Worked on Spark, Storm, Apache Apex and Python.
- Single View of Product: developed scripts using Sqoop, SCP and Hive to consolidate PCM and PSSA attributes of all products sold at Lowe's. An Oozie coordinator is used for scheduling.
- Collected and aggregated large amounts of data from different sources through Apache Kafka.
- Developed the application by using the Spring MVC framework.
- Consolidation of Allied BU sales, inventory, customer, GL and other data: developed a data ingestion pipeline using Sqoop and Falcon, developed scripts using Bash, Spark, Hive and Pig, and built data visualization using MSTR VI.
- Implementation of projects using Agile SCRUM product development strategy.
- Documented application for its functionality and its enhanced features.
- Designed and developed end-to-end customer self-service using annotation-based Spring MVC and Hibernate.
- Developed a shell script to create staging, landing and semantic tables with the same schema as the source.
- Developed HiveQL scripts for performing transformation logic and loading the data from the staging zone to the landing and semantic zones.
- Developed the presentation and controller layers using JSP, HTML and JavaScript, the business layer using Spring (IoC, AOP), DTO and JTA, and the persistence layer using DAO and Hibernate for all modules.
- Performed component unit testing using the Azure emulator and analyzed escalated incidents within the Azure SQL database.
- Responsible for debugging and optimization of Hive scripts.
- Set up and administered large clusters consisting of numerous shards/replica sets. Managed multiple MongoDB (NoSQL) database clusters.
- Collected XML and JSON data from different Sources and developed Spark APIs that helps to do inserts and updates in Hive tables and made data available in Hive as per business requirement.
- Importing/exporting data from MySQL/Oracle to the Azure cluster using Sqoop.
- Sales Forecast & USHI metrics: implemented Hive scripts for sales metrics reporting and forecasting. Oozie is used for scheduling.
- Developed shell and Hive scripts for consolidating Brand View pricing data. Oozie is used for scheduling.
- Used Git for version control.
- Used the Spark API for machine learning; translated a predictive model from SAS code to Spark.
- Developed the DAO design pattern for hiding access to data source objects.
- Involved in the implementation of REST and SOAP based web services.
- Used Spring IoC to inject the values for dynamic parameters.
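The DAO pattern cited above hides data-source access behind an interface, so the backing store can change without touching callers. A minimal self-contained Java sketch of the idea (the in-memory store and all names here are hypothetical stand-ins for a real JDBC- or Hive-backed implementation):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// DAO contract: callers depend on this interface, not on the data source.
interface CustomerDao {
    void save(String id, String name);
    Optional<String> findName(String id);
}

// In-memory implementation; a JDBC-backed class could replace it
// without any change to calling code.
class InMemoryCustomerDao implements CustomerDao {
    private final Map<String, String> store = new HashMap<>();
    public void save(String id, String name) { store.put(id, name); }
    public Optional<String> findName(String id) { return Optional.ofNullable(store.get(id)); }
}

public class DaoDemo {
    public static void main(String[] args) {
        CustomerDao dao = new InMemoryCustomerDao();
        dao.save("42", "Acme");
        System.out.println(dao.findName("42").orElse("missing")); // prints Acme
    }
}
```

Swapping implementations behind the interface is also what makes the services unit-testable without a live database.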
Environment: Apache Hadoop, HDFS, NoSQL, MongoDB, Hive, MapReduce, Pig, Sqoop, Kafka, Spark, Apache Cassandra, Oozie, Azure, ZooKeeper, MySQL, Eclipse, PL/SQL and Python.
Confidential, Boston, MA
Sr. Java/J2EE Developer
Responsibilities:
- Built and deployed Java/J2EE applications to a web application server in an Agile continuous integration environment, and automated the whole process.
- Developed the user interface using JSP, JavaScript, AJAX, jQuery, HTML, CSS and JSTL.
- Participated in the SDLC process, involved in analysis, design and implementation of the system.
- Designed and developed the application using Java, JEE, and Spring Core.
- Proficient in TDD (Test-Driven Development), writing test cases using the Jasmine and JUnit frameworks. Working as a full-stack developer using the JavaScript frameworks AngularJS and Node.js.
- Designed and developed business applications using JSP, Servlet, JAVA, J2EE, Threads, Socket programming, EJB, XML, JNDI, Hibernate and JDBC technologies on Windows and UNIX platform.
- Experience working with big data and real time/near real time analytics and big data platforms like Hadoop, Spark using programming languages like Scala and Java.
- Worked with Oracle FMW products (SOA Suite, OSB, ODI, WebCenter, and WebLogic).
- Wrote programs in Scala using Spark and worked on migrating MapReduce programs into Spark using Scala.
- Used Spring Core Annotations for Dependency Injection and used Apache Camel to integrate Spring framework.
- Extensively involved in the development of backend logic or data access logic using Hibernate and thus creating the object relational mapping with the Java beans in the application.
- Worked on performance tuning of Web Agents, Policy Servers, Policy Stores and User Stores to meet and maintain operational requirements (process, thread, connection, cache).
- Deployed the application on WebLogic servers, worked with Hibernate for developing persistent objects, and worked with Aqua Data Studio (IDE).
- Integrated Automated functional tests (Groovy) with Continuous-Integration in Jenkins.
- Responsible for creating efficient designs and developing user interaction screens using HTML5, CSS3, JavaScript, jQuery, AJAX, AngularJS, and JSON. Used jQuery and AJAX for service calls on pages to interact with the servers.
- Worked on connecting to Impala from Python.
- The interfaces are built using Apache Camel framework and JMS.
- Implemented XML-based communications protocol using JAXB.
- Used Maven as the build tool, GIT for version control, Jenkins for Continuous Integration and JIRA as a defect tracking tool.
- Incorporated persistence tier using Hibernate framework.
- Planning and setting up of Continuous Integration for various properties on Jenkins with Commit, Component, Assembly, Deploy and Smoke jobs.
- Used Maven for build and deployment scripts in UNIX, Linux and Windows environments with Hudson.
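The multithreaded backend work described above commonly hands data from producer threads to consumer threads through a bounded queue. A minimal Java sketch of that producer/consumer handoff using java.util.concurrent (the message names and the EOF sentinel are illustrative, not from any specific project):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PipelineDemo {
    // Runs one producer and one consumer joined by a bounded queue;
    // returns the messages the consumer processed, in order.
    static List<String> runPipeline() throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(16);
        List<String> processed = new ArrayList<>();

        // Producer: emits three messages, then a sentinel marking end of stream.
        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 3; i++) queue.put("msg-" + i);
                queue.put("EOF");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Consumer: drains the queue until the sentinel arrives.
        Thread consumer = new Thread(() -> {
            try {
                String msg;
                while (!(msg = queue.take()).equals("EOF")) {
                    processed.add(msg.toUpperCase());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        return processed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runPipeline()); // [MSG-1, MSG-2, MSG-3]
    }
}
```

The bounded queue provides back-pressure: a fast producer blocks on put() when the consumer falls behind, the same shape that frameworks like JMS or Kafka consumers scale up across processes.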
Environment: Java, J2EE, Java SE 6, UML, JSP 2.1, HBase, JSTL 1.2, Servlets 2.5, Spring MVC, Hibernate, JSON, RESTful web services, jQuery, UNIX, AJAX, AngularJS, JAXB, IRAD, WebSphere 7.0, Eclipse Kepler, Maven, Serena Dimensions, JUnit, DB2, Oracle.
Confidential, Wilmington, DE
Java/J2EE Developer
Responsibilities:
- Responsible for developing and implementing J2EE applications.
- Designed and developed the front end using Servlets, JSP, JSF, DHTML, JavaScript and AJAX.
- Used advanced Photoshop features to create appealing visual web interfaces.
- Implemented various search engine optimization techniques, such as metadata, inbound link building, outbound link text, and meaningful titles, while designing web pages.
- Used jQuery to make the HTML, DHTML and CSS code interact with JavaScript functions to add dynamism to web pages on the client side.
- Created HTML, CSS, JavaScript, DHTML pages for Presentation Layer.
- Involved in developing of design documents with UML class diagrams.
- Developed interactive web pages in a professional manner using web technologies like HTML, XHTML, and CSS, per company standards.
- Created the graphical user interface (GUI) and applied it to the web site.
- Designed and developed a middleware application using the Spring Core framework, and implemented Java EE components using Spring MVC, Spring IoC, Spring Batch for cross-cutting concerns, and Spring Security modules.
- Increased developer productivity by using efficient programming methodologies and local development.
- Managed application state using server and client-based State Management options.
- Handled all aspects of the web application including maintaining, testing, debugging, deploying and printing.
- Involved in JavaScript coding for validations, and passing attributes from one screen to another.
- Applied client side validations using JavaScript, jQuery and Apache Struts.
- Implemented batch jobs that handle large numbers of chunks using the Spring Batch framework, executing similar jobs simultaneously.
- Built HTML and CSS system for controlling text display issues cross-platform and cross browser.
- Implemented AJAX and Maven to enhance the capability of the website by using Eclipse.
- Used Firebug and IE Developer Toolbar for debugging and browser compatibility.
- Implemented controller servlets/JSPs for the security of the system.
- Developed dynamic page designs using JSP tags to invoke servlets; JSP content is configured in XML files.
- Consumed SOAP web services and generated classes from XSD using JAX-WS with Maven.
- Used MAVEN for project management and build automation with JBoss.
- Used CSS Blueprint to create grids and adopt cross browser interactive features.
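The chunk-oriented batch jobs above read items in fixed-size chunks and write each chunk as a unit. A framework-free Java sketch of the chunking idea itself (not the Spring Batch API; in Spring Batch this cycle is driven by ItemReader/ItemProcessor/ItemWriter, and the names below are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkDemo {
    // Split items into chunks of at most chunkSize, mirroring a
    // chunk-oriented step's read-process-write cycle.
    static <T> List<List<T>> chunk(List<T> items, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < items.size(); i += chunkSize) {
            chunks.add(items.subList(i, Math.min(i + chunkSize, items.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> items = List.of(1, 2, 3, 4, 5);
        System.out.println(chunk(items, 2)); // [[1, 2], [3, 4], [5]]
    }
}
```

Committing per chunk, rather than per item or per job, is the design trade-off: it bounds transaction size while keeping restartability at chunk granularity.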
Environment: JSON, JSP, Maven 3, HTML, CSS, JavaScript, Spring Batch, jQuery, AJAX, XML, XHTML, DHTML, MySQL, Internet Explorer, Firefox, Chrome, Windows, Photoshop, Eclipse.
Confidential
Java/J2EE Developer
Responsibilities:
- Implemented object-relational mapping in the persistence layer using the Hibernate framework in conjunction with Spring Aspect-Oriented Programming (AOP) functionality.
- Developed an application framework using Struts with J2EE design principles, applying the Business Delegate, Service Locator, Session Facade, Domain Object and DAO patterns, and developed stateless session beans to achieve the Session Facade design pattern.
- Developed stored procedures and triggers using PL/SQL to calculate and update tables implementing business logic.
- Developed SQL queries and Stored Procedures using PL/SQL to retrieve and insert into multiple database schemas.
- Helped DevOps teams configure servers by building cookbooks to install and configure Tomcat.
- Developed the XML schema and web services for data maintenance and structures; wrote JUnit test cases for unit testing of classes.
- Used DOM and DOM functions, debugging with Firefox and the IE Developer Toolbar for IE.
- Used JSP, HTML, JavaScript, AngularJS and CSS3 for content layout and presentation.
- Did core Java coding using JDK 1.3, the Eclipse IDE, ClearCase, and Ant.
- Developed user interface screens using Spring MVC to enable customers to obtain auto finance. Extensive experience in developing web-based applications using Hibernate 3.0 and Spring frameworks.
- Developed Spring REST Exception Mappers.
- Implemented functionality using Servlets, JSP, HTML and the Struts framework, along with Hibernate, Spring, JavaScript, Hazelcast and WebLogic.
- Developed Authentication layer using Spring Interceptors.
- Used Log4j to print logging, debugging, warning and info messages on the server console.
- Built test cases using JUnit and carried out unit testing.
- Responsible for deployment of the application in the integration/functional environment, providing necessary assistance to UAT testers.
- Implemented Hibernate ORM to map relational data directly to Java objects.
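The Service Locator pattern applied above centralizes lookups so that business delegates do not perform JNDI lookups themselves. A minimal Java sketch of the idea (an in-memory registry stands in for JNDI here, and all names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// Service Locator: a central registry hiding JNDI-style lookups
// from business-delegate callers.
public class ServiceLocator {
    private static final Map<String, Object> registry = new HashMap<>();

    public static void register(String name, Object service) {
        registry.put(name, service);
    }

    @SuppressWarnings("unchecked")
    public static <T> T lookup(String name) {
        Object service = registry.get(name);
        if (service == null) {
            throw new IllegalStateException("No service bound for: " + name);
        }
        return (T) service;
    }

    public static void main(String[] args) {
        register("orderService", (Runnable) () -> System.out.println("order placed"));
        Runnable svc = lookup("orderService");
        svc.run(); // prints "order placed"
    }
}
```

In a real EJB deployment the registry would cache home interfaces obtained from an InitialContext, so repeated lookups avoid the JNDI round trip.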
Environment: Java, XML, HTML, JavaScript, JDBC, UNIX, CSS, SQL, PL/SQL, Web MVC, Eclipse, Ajax, jQuery, Spring with Hibernate, ActiveMQ, Jasper Reports, Ant as the build tool, MySQL and Apache Tomcat.