Big Data/Java Developer Resume
Hoffman Estates, IL
SUMMARY
- Over 9 years of experience in software design and development using the Hadoop Big Data ecosystem and Core Java/J2EE in the Telecommunications, E-commerce, Finance, and Healthcare domains.
- Extensive experience in analyzing, designing, developing, implementing, and testing software applications using HDFS, YARN, MapReduce, Pig, Hive, Impala, Sqoop, Spark, HBase, MongoDB, Kafka, Azkaban, ZooKeeper, and Oozie.
- Extensive experience in software design and development using Core Java, Spring Core, Spring AOP, Spring with Hibernate, Struts, JDBC, and XML on the UNIX operating system.
- Excellent understanding of Hadoop architecture and the components of a Hadoop cluster (JobTracker, TaskTracker, NameNode, and DataNode).
- Involved in data ingestion into HDFS from various data sources (MySQL, DB2, Mainframe, Teradata).
- Involved in loading data from the Linux file system into HDFS.
- Programmed processing and generation of big data sets with a parallel, distributed algorithm on a cluster using MapReduce.
- Executed Hive commands for reading, writing, and managing large datasets residing in distributed storage using MySQL.
- Imported and exported data from relational databases and NoSQL databases using Sqoop.
- Analyzed large data sets by running Hive queries and Pig scripts.
- Good experience in writing Pig scripts.
- Involved in optimization of Hive queries.
- Analyzed large data sets by representing them as data flows, applied Pig filters to process the data, and stored the results in HDFS for further processing using Pig Latin scripts.
- Automated Sqoop, Hive, and Pig jobs using Oozie scheduling.
- Experience in designing and developing POCs using Spark to compare the data processing time of Spark against Hive and Pig.
- Managed large sets of hosts and coordinated and managed services in a distributed environment using ZooKeeper.
- Used the column-oriented HBase NoSQL database for flexibility, high performance, and horizontal scaling of big data.
- Wrote an API to store documents in the MongoDB NoSQL database.
- Ability to analyze different file formats such as Avro and Parquet.
- Good exposure to cluster maintenance.
- Involved in configuration and deployment of modules.
- Good knowledge of Oozie workflows.
- Good knowledge of writing and using user-defined functions (UDFs) in Hive and Pig.
- Configured, deployed, and maintained multi-node Dev clusters.
- Developed multiple Kafka Producers and Consumers from scratch as per the business requirements.
- Working experience with TIBCO Business Studio, M2E, and Jenkins.
- Strong experience in Software Development Life Cycle (SDLC) methodologies such as Agile (TDD, Scrum, and XP), as well as the Waterfall methodology.
- Strong experience in implementing Service-Oriented Architectures (SOA) using XML-based web services (SOAP/WSDL/REST) and XML technologies such as XSD and XSLT.
- Experienced in writing SQL, PL/SQL procedures and triggers in Oracle, and stored procedures in MySQL.
- Experience working on multiple projects and managing teams productively in all situations.
- Experience in using the Ant build tool and performing unit testing.
- Used PuTTY to invoke commands on remote systems over SSH and Telnet.
- Experience in handling the complete software development process: impact analysis, program specification, preparation, code review, unit testing, and integration testing.
- Experienced in using source code change management and version control tools such as Subversion and CVS.
- Design and code from specifications; analyze, evaluate, test, debug, document, and implement complex applications.
- Good experience working with, managing, and coordinating offshore teams.
- Well-regarded communication skills, resourcefulness, and personal presentation.
- Excellent team player with the ability to create an environment of trust and cooperation through an open exchange of ideas towards achieving team goals.
- Good verbal and written communication skills. Ability to deal with people diplomatically.
TECHNICAL SKILLS
Web/Application Servers: Apache Tomcat 5.x/6.x, IBM WebSphere 7.0/7.5/8.5, WebLogic
Java/J2EE Technologies: Core Java, EJB 3.0, Servlets, JSP, JDO, JPA, JSTL, JDBC, MVC, Struts, Hibernate 2.0/2.3/3.0, Spring, LOG4J, JNDI, JUNIT, JAXB, SAX/DOM, ANT, Jenkins, Maven
Messaging Systems: JMS, IBM MQ
Languages: Java, C, C++, SQL, PL/SQL, Shell, Python
Web Technologies/Web Services: JavaScript, CSS, HTML, DHTML, XML (SAX, DOM parsers), JSON, Altova XMLSpy, XSL, XSLT, SOAP, REST, WSDL
Databases: Oracle 8i/9i/10g/11g, SQL Server 2000/2005/2008, MySQL, HBase, MongoDB
Operating Systems: Windows, UNIX, Linux
Design Tools: UML, Rational Rose, IBM Requisite Pro
Big Data/Hadoop Ecosystem: Hadoop, HDFS, MapReduce, Pig, Hive, HBase, Sqoop, ZooKeeper, Spark, Oozie, Azkaban
IDE Tools: RAD, Eclipse
Version Control Tools: CVS, SVN, GIT
Agile PM Tools: CA Rally, IBM TDP
Testing Tools: JUnit
SQL Tools: TOAD
Deployment Containers: Docker, Kubernetes
Change/Defect Tracking Tools: Itrack, ALM QC
PROFESSIONAL EXPERIENCE
Confidential, Hoffman Estates IL
Big Data/Java Developer
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop.
- Gathered business requirements from the business partners and subject matter experts.
- Wrote multiple MapReduce programs for data extraction, transformation, and aggregation from multiple file formats including XML, JSON, CSV, and other compressed file formats.
- Optimized MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Developed Pig UDFs to extend Pig's capabilities for manipulating data according to business requirements and worked on developing custom Pig Loaders.
- Implemented various requirements using Pig scripts.
- Loaded and transformed large sets of structured, semi structured, and unstructured data.
- Expert in implementing advanced procedures such as text analytics and processing using in-memory computing frameworks such as Apache Spark.
- Implemented complex logic and controlled data flow using the in-memory processing capabilities of Apache Spark.
- Generated Scala and Java classes from the respective APIs so that they could be incorporated into the overall application.
- Wrote Scala classes to interact with the database.
- Worked with different file formats such as TextFile, Avro, ORC, and Parquet for Hive querying and processing.
- Wrote Scala test cases to test Scala code.
- Implemented functionality using machine learning tools such as Mahout to recommend the products best suited to user profiles by performing sentiment analysis and trend analysis of the products.
- Worked on data loading into Hive for data ingestion history and data content summary.
- Developed Hive UDFs for rating aggregation.
- Developed HBase Java client API for CRUD Operations.
- Created Hive tables and was involved in data loading and writing Hive UDFs.
- Managed and reviewed Hadoop log files.
- Achieved high sequential throughput on spinning disks through Kafka.
- Generated Java APIs for retrieval and analysis on NoSQL databases such as HBase and Cassandra.
- Worked extensively with Sqoop to move data from DB2 to HDFS.
- Collected log data from web servers and integrated it into HDFS using Flume.
- Provided ad-hoc queries and data metrics to the business users using Hive and Pig.
- Facilitated performance optimizations such as using the distributed cache for small datasets, partitioning and bucketing in Hive, and map-side joins.
- Performed real-time analysis along with continuous operation management using Storm.
- Worked on importing and exporting data from Oracle and DB2 into HDFS and HIVE using Sqoop for analysis, visualization and to generate reports.
- Worked on NoSQL databases including HBase and Cassandra.
- Scheduled the Oozie workflow engine to run multiple Hive and Pig jobs, which run independently based on time and data availability.
- Worked on custom Pig Loaders and Storage classes to work with a variety of data formats such as JSON and Compressed CSV.
- Involved in running Hadoop streaming jobs to process Terabytes of data.
- Used TDP, Rally, JIRA for bug tracking and CVS for version control.
Environment: Hadoop, Map Reduce, Hive, HDFS, PIG, Sqoop, Oozie, Cloudera, Flume, HBase, CDH5, Cassandra, Oracle, J2EE, Oracle/SQL, DB2, Unix/Linux, JavaScript, Ajax, Eclipse IDE, RALLY, TDP, JIRA
Confidential, Hoffman Estates IL
Hadoop/Java Engineer
Responsibilities:
- Involved in requirements gathering, understanding, and grooming with the team.
- Developed the Mobile Manager user interface (UI), which logs all transaction flow between different interfaces.
- Used JSP, JSP Tag Libraries (JSTL), Core Java, custom tag libraries, JavaScript, XML, and HTML for developing Mobile Manager.
- Implemented the Struts framework for the front end and used Action Controller, ActionForm, the Validation Framework, Struts Tiles, Struts Tag Libraries, and LookupDispatchAction for the presentation logic.
- Developed the application in the Eclipse environment.
- Responsible for designing XML, XSD, and XSLT using the Altova XML tool.
- Developed Web services using SOAP protocol for transferring data between different applications.
- Used Log4j for logging purposes.
- Set up Cron/Quartz schedulers to fire web service URLs at regular intervals.
- Used Hibernate 3.0 to develop the persistence layer; developed custom DAOs to retrieve records from the Oracle database.
- Imported and exported data into HDFS and Hive using Sqoop.
- Involved in developing Pig Scripts for change data capture and delta record processing between newly arrived data and already existing data in HDFS.
- Involved in implementing POCs using Spark shell commands to process large data sets and compare processing times.
- Involved in creating Hive tables, loading data, and writing queries that run internally as MapReduce jobs.
- Used Pig to perform transformations, event joins, filtering, and pre-aggregations.
- Involved in processing ingested raw data using MapReduce, Apache Pig and HBase.
- Maintained communication between different applications using JMS MQ and web service calls.
- Used the SVN version control system to check in and check out developed artifacts.
Environment: Java/J2EE, JSP, XML, HTML, Eclipse, JMS, Struts, Log4j, XSD, XSLT, SOAP Web Services, Hibernate, Spark, MapReduce, Pig scripts, Azkaban, HBase, Java 7, REST Services.
Confidential, Hoffman Estates, IL
Senior Java Developer
Responsibilities:
- Worked with end customer to analyze requirements.
- Designed, developed, tested, documented, and maintained software applications.
- Involved in complete development lifecycle using Agile Development Methodology/SCRUM and tested the application in each iteration.
- Developed the server-side RESTful web service APIs to process requests from iOS and Android mobile clients.
- Developed the Java service layer using Java 8, Jersey, Jackson, JSON, Spring framework.
- Developed the persistence layer using JDBC, Hibernate ORM and Oracle RDBMS.
- Designed and developed unit, functional, integration, and regression tests using JUnit and the Spring test framework.
- Built and deployed the packaged application to the WebLogic application server using Maven, Jenkins, and Eclipse.
- Wrote SQL queries and PL/SQL stored procedures.
- Implemented messaging system using JMS, MQ and Spring.
- Followed software engineering best practices and OOA/D.
- Developed stored procedures and prepared statements for updating and accessing data from database.
- Wrote shell and Python scripts to analyze logs and create test data.
- Maintained documentation using the Swagger API and TWiki.
- Participated in code reviews using SourceTree and Git.
- Trained and mentored junior developers.
Environment: Core Java, Java 8, Spring Framework, Hibernate, Maven, Eclipse, Jersey, Jackson, JMS, MQ, JAXB, JSON, JIRA, Git, SourceTree, SQL, Oracle, JUnit, Jenkins, WebLogic, Ubuntu.
Confidential, Omaha, NE
Software Engineer
Responsibilities:
- Involved in the analysis, design, development, and testing phases of the Software Development Life Cycle (SDLC).
- Designed use case diagrams, class diagrams, sequence diagrams, and object diagrams using Microsoft Visio to model the detailed design of the application.
- Used Jakarta Struts Framework for rapid development and ease of maintenance.
- Developed the application front end using Jakarta Struts Framework. Developed action classes, form beans and Java Server Pages using WSAD.
- Developed a web-based reporting for credit monitoring system with HTML, JSTL and custom tags using Struts framework.
- Developed Session beans which encapsulate the workflow logic.
- Wrote SQL queries to load, retrieve, and update data in the database.
- Used VB/COM and ASP for developing supporting web page applications.
- Designed and implemented Business Delegate, Session Facade and DTO Design Patterns.
- Involved in implementing the DAO pattern for database access and used the JDBC API extensively.
- Used XML Web services for transferring data between different applications and retrieving credit information from the credit bureau.
- Used the JAXB API to bind XML schemas to Java classes.
- Used JMS-MQ Bridge to send messages securely, reliably and asynchronously to WebSphere MQ, which connects to the legacy systems.
- Used JCAPS for system Integration.
- Tested the application functionality with JUnit Struts Test Cases.
- Developed a logging module using Log4j to create log files for debugging as well as tracing the application.
- Used CVS for version control.
- Responsible for supporting the application in different phases from QA to production and fixing QC bugs.
- Extensively used ANT as a build tool.
- Used shell scripts for batch processing in the UNIX environment.
- Deployed the applications on IBM WebSphere Application Server.
Environment: WAS 7.0, WSAD 5.1.2, WebSphere MQ, Java SDK 1.4, Hibernate 3.0, Struts 1.2.4, Servlet 2.2, JSP 2.0, JNDI, JDBC, SQL, PL/SQL, XML Web Services, Core Java, VB/COM, ASP, Spring 1.0.2, SOAP, WSDL, JavaScript, Windows, Oracle 9i, JUnit, JCAPS, CVS, ANT 1.5, and Log4J.
Confidential
J2EE Designer/Developer
Responsibilities:
- Involved in analysis, design, and development of the CDF Processing system and developed specifications that include use cases, class diagrams, sequence diagrams, and activity diagrams.
- Developed the application using the Struts framework, which leverages the classical Model-View-Controller (MVC) architecture.
- Deployed the applications on BEA WebLogic Application Server.
- Used an AJAX framework for dynamic searching of covered products for funding.
- Involved in designing the Graphical User Interfaces using HTML, JSP, AJAX4JSF and JSF.
- Used Hibernate in data access layer to access and update information in the database.
- Used JNDI for naming and directory services.
- Developed the web application by integrating JSF ICEfaces, employing AJAX in client-side components to enable rich web applications without any separate plugins.
- Used web services (WSDL and SOAP) for getting credit card information of the insured patients from a third party.
- Used soapUI for load testing the web services.
- Used XML parser APIs such as JAXP and JAXB to marshal and unmarshal web service request/response data.
- Wrote SQL queries and PL/SQL stored procedures.
- Implemented JMS Listeners and Message Driven Beans (MDB).
- Wrote shell scripts for schedulers in the UNIX environment.
- Developed JUnit test cases for all the developed modules.
- Used Log4J to capture the log that includes runtime exceptions, monitored error logs.
- Used CVS for version control across common source code used by developers.
- Used Maven scripts to build the application and deployed on BEA WebLogic Application Server.
- Designed and normalized databases in Oracle 10g and used stored procedures and triggers in the application.
Environment: Core Java, JDK 1.6, JSF ICEfaces, Hibernate 3.6, JSP, Servlets, JMS, XML, SOAP, WSDL, JDBC, JavaScript, HTML, JNDI, CVS, Log4J, Eclipse Indigo, BEA WebLogic Application Server, Rational Rose for UML modeling, JUnit, SQL, Oracle 10g.