Java Developer Resume
Atlanta, GA
SUMMARY
- Around 8 years of professional IT experience, including 4+ years of experience in Big Data ecosystem technologies.
- Excellent understanding/knowledge of Hadoop architecture and various components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, Pig, Hive, Sqoop, Oozie, HBase, YARN, and the MapReduce programming paradigm.
- Well versed with installing, configuring, managing, and supporting Hadoop clusters using various distributions such as Apache Hadoop, Cloudera CDH, Hortonworks HDP, and AWS.
- Secured Hadoop clusters with Kerberos, Active Directory/LDAP, and TLS/SSL, and performed dynamic tuning to keep the cluster available and efficient.
- Familiarity with monitoring tools like Ganglia and Nagios for Hadoop cluster monitoring.
- Programmed ETL functions between Oracle, MySQL, Hadoop, Hive, Amazon Redshift and Amazon S3.
- Experience working with NoSQL databases including Cassandra, MongoDB, and HBase.
- Experience in NoSQL database development using CRUD operations, sharding, indexing, and replication.
- Experience in developing Pig scripts and Hive Query Language.
- Managing and scheduling batch Jobs on a Hadoop Cluster using Oozie.
- Used Zookeeper to provide coordination services to the cluster.
- Experienced using Sqoop to import data into HDFS from RDBMS and vice-versa.
- Worked with Talend for inbound/outbound data transfer activities.
- Strong working knowledge of Spark Streaming, RDDs, Spark SQL, and Scala.
- Created and deployed Spark jobs in a YARN environment.
- Handled data transfers through Spark and MapReduce across structured and semi-structured data sets.
- Worked on Kafka as a messaging system and for data streaming analysis.
- Responsible for data ingestion with tools such as Flume and Kafka.
- Experience with Kafka and Kafka mirroring to ensure that data is replicated without any loss; set up Camus for reading data from Kafka and storing it in HDFS.
- Designed and implemented a stream filtering system on top of Apache Kafka to reduce stream size (a minimal sketch follows this summary).
- Experience in OLAP and ETL/Data warehousing, creating different data models and maintaining Data Marts.
- Sound knowledge of Business Intelligence and reporting; prepared dashboards using Tableau.
- Expertise in providing business intelligence solutions using Informatica and Teradata in data warehousing systems.
- Very strong conceptual and hands-on programming skills in threads, multi-threading, garbage collection, exception handling, memory management, and OOP concepts in Core Java.
- Extensive experience in J2EE technologies such as JDBC, JSP, Servlets, JSF, EJB, RMI, JNDI, Struts, Spring, Hibernate, the Java Standard Tag Library, and custom tag libraries.
- Experience in web development using HTML, CSS, JavaScript technologies.
- Designed, developed, and implemented Java web services using XML, SOAP, REST, WSDL, and UDDI.
- Strong working knowledge of SQL, PL/SQL, Stored Procedures, joins and triggers with databases like Oracle, DB2, and MS SQL Server.
- Experienced in database development and interaction, using IDEs such as Eclipse and NetBeans.
- Good knowledge of and experience utilizing Agile (Scrum) and Waterfall methodologies.
- Experienced in all phases of the software development lifecycle: concept, design, development, QA, rollout, and enhancements.
- Strong team building, conflict management, time management and meeting management skills.
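The Kafka stream filtering noted above could look something like the following minimal sketch; this is an illustrative assumption, not the actual production code. The broker address, topic names, consumer group, and the filter predicate (dropping DEBUG records) are all hypothetical, and the poll(long) call matches the Kafka 0.9.x consumer API listed in the skills below.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class StreamFilter {
        public static void main(String[] args) {
            Properties cons = new Properties();
            cons.put("bootstrap.servers", "localhost:9092");   // hypothetical broker
            cons.put("group.id", "filter-group");
            cons.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            cons.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            Properties prod = new Properties();
            prod.put("bootstrap.servers", "localhost:9092");
            prod.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            prod.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cons);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(prod)) {
                consumer.subscribe(Collections.singletonList("raw-events"));   // hypothetical topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(500);
                    for (ConsumerRecord<String, String> r : records) {
                        // Forward only records that pass the (hypothetical) filter
                        // predicate, reducing the downstream stream size.
                        if (!r.value().contains("DEBUG")) {
                            producer.send(new ProducerRecord<>("filtered-events", r.key(), r.value()));
                        }
                    }
                }
            }
        }
    }

Filtering close to the broker keeps downstream consumers from paying for records they would discard anyway.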
TECHNICAL SKILLS
Big data: Hadoop, MapReduce, HDFS, Hive, HBase, Pig, Sqoop, Flume, Oozie, Zookeeper, Netezza, Mahout, YARN, Storm, Spark, Kafka (0.8.2.x, 0.9.0), MongoDB, Cassandra.
Hadoop Distributions: Cloudera, Hortonworks, MapR
Core Skills: Core Java (OOP and collections), J2EE Framework, JSP, Servlets, Oracle ADF, JSF, Linux Shell Script, JDBC, Scala
Databases: Oracle, SQL Server
Design Patterns: Singleton, Factory, MVC
Build Tools: ANT, Maven
Browser Scripting: JavaScript, HTML DOM, DHTML, AJAX, AngularJS
IDE: Eclipse/MyEclipse, JDeveloper
Operating Systems: Red Hat Linux, Windows, UNIX
PROFESSIONAL EXPERIENCE
Hadoop Developer/Admin
Confidential, Atlanta, GA
Responsibilities:
- Experience with the Cloudera distribution of Hadoop.
- Installed, configured, and maintained Apache Hadoop clusters for application development, along with Hadoop tools like Hive, Pig, HBase, Zookeeper, and Sqoop.
- Deployed the Hadoop cluster across its nodes, and managed and scheduled jobs on the cluster.
- Involved in analyzing system failures, identifying root causes, and recommending courses of action.
- Worked on Hive to expose data for further analysis and to transform files from different analytical formats to text files.
- Handled importing of data from various data sources, performed transformations using Hive, Pig and Spark and loaded data into HDFS.
- Wrote queries to create, alter, insert, and delete elements from lists, sets, and maps in DataStax Cassandra.
- Involved in NoSQL (DataStax Cassandra) database design, integration, and implementation.
- Developed multiple MapReduce jobs in Java for data cleaning and processing (a minimal sketch follows this list).
- Importing and exporting data into HDFS and Hive using Sqoop.
- Created indexes for conditional search in DataStax Cassandra.
- Worked on the core and Spark SQL modules of Spark extensively.
- Implemented NameNode backup using NFS for high availability.
- Worked on importing and exporting data from Oracle and DB2 into HDFS using Sqoop.
- Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
- Worked on custom Pig Loaders and Storage classes to work with a variety of data formats such as JSON, Compressed CSV, etc.
- Monitored workload, job performance and capacity planning using Cloudera Manager.
- Created Hive external tables, loaded the data into the tables, and queried the data using HiveQL.
- Wrote shell scripts to automate document indexing to Solr Cloud in production.
- Created Talend mappings to populate the data into dimension and fact tables.
- Developed jobs to move inbound files to vendor server location based on monthly, weekly and daily frequency in Talend.
- Created HBase tables to store various data formats of PII data coming from different portfolios.
- Provided cluster coordination services through Zookeeper.
- Used Flume to collect, aggregate, and store the web log data from different sources like web servers, mobile and network devices and pushed to HDFS.
- Worked on an Amazon Elastic Compute Cloud (EC2) project using Agile methodology.
- Analyzed the web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most purchased products on the website.
- Converted Oracle table components to Teradata table components in Ab Initio graphs.
- Used Ambari to provision, manage, and monitor the Hadoop cluster.
- Implemented the Fair Scheduler on the JobTracker to share cluster resources among the users' MapReduce jobs.
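A minimal sketch of the kind of map-only data-cleaning MapReduce job mentioned above; the five-field CSV record shape, input/output paths, and job name are hypothetical placeholders, not the actual project code.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanRecordsJob {
        // Map-only job: emit only well-formed records; drop the rest.
        public static class CleanMapper extends Mapper<Object, Text, NullWritable, Text> {
            @Override
            protected void map(Object key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                String line = value.toString().trim();
                if (!line.isEmpty() && line.split(",").length == 5) {   // hypothetical record shape
                    ctx.write(NullWritable.get(), new Text(line));
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "clean-records");
            job.setJarByClass(CleanRecordsJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0);                 // map-only: cleaning is record-local
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Running it with zero reducers avoids an unnecessary shuffle, since each record can be validated independently.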
Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cloudera Manager, Spark, AWS, Cassandra, Pig, Sqoop, Oozie, Zookeeper, Ambari, Storm, Teradata, Oracle, NoSQL, Elasticsearch, HBase, Talend.
Hadoop Developer
Confidential, Cincinnati, OH
Responsibilities:
- Wrote MapReduce programs and Hive UDFs in Java where the required functionality was too complex for plain HiveQL (a minimal UDF sketch follows this list).
- Involved in loading data from the Linux file system into HDFS.
- Developed Hive queries for analysis and to categorize different items.
- Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and buckets.
- Delivered a proof of concept (POC) of Flume to handle real-time log processing for attribution reporting.
- Exported the resulting sentiment analysis data to Tableau for creating dashboards.
- Used MRUnit for unit testing MapReduce programs.
- Monitored system health and logs and responded to any warning or failure conditions.
- Installed and configured Hadoop MapReduce and HDFS.
- Acquired a good understanding of and experience with NoSQL databases such as HBase and Cassandra.
- Installed and configured Hive, and implemented various business requirements by writing Hive UDFs.
- Responsible for managing the test data coming from different sources.
- Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs.
- Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
- Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool.
- Worked with various onshore and offshore teams to understand the data imported from their sources.
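A minimal sketch of a Java Hive UDF of the sort referenced above; the normalization logic and class name are hypothetical, and it uses the classic org.apache.hadoop.hive.ql.exec.UDF base class of that Hive generation.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Normalizes a string column: trims whitespace and lower-cases it.
    public final class NormalizeUDF extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;            // preserve SQL NULL semantics
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }

In HiveQL, such a UDF would typically be registered with ADD JAR and CREATE TEMPORARY FUNCTION before being used in a query.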
Environment: Apache Hadoop, HDFS, Hive, MapReduce, Core Java, Flume, Cloudera, Oozie, MySQL, UNIX.
Sr. Java Developer
Confidential, Dallas, TX
Responsibilities:
- Designed and developed the application using Agile methodology.
- Implemented new module development, handled new change requirements, and fixed code defects.
- Fixed defects identified in pre-production and production environments.
- Wrote technical design document with class, sequence, and activity diagrams in each use case.
- Involved in developing XML compilers using XQuery.
- Developed the application using the Spring MVC framework, implementing Controller and Service classes (a minimal controller sketch follows this list).
- Involved in writing the Spring configuration XML file containing bean declarations and other dependent object declarations.
- Used Hibernate as the persistence framework, creating DAOs and ORM mappings.
- Wrote Java classes to test the UI and web services through JUnit.
- Performed functional and integration testing, extensively involved in release/deployment related critical activities.
- Responsible for designing rich user interface applications using JSP, JSP tag libraries, Spring tag libraries, JavaScript, CSS, and HTML.
- Used SVN for version control. Log4J was used to log both User Interface and Domain Level Messages.
- Used SoapUI for testing the web services.
- Used Maven for dependency management and project structure.
- Created deployment documents for various environments such as Test, QC, and UAT.
- Explored Spring MVC, Spring IoC, Spring AOP, and Hibernate in creating the POC.
- Performed front-end data manipulation using JavaScript and JSON.
- Involved in detailed design and coding activities offshore.
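A minimal sketch of the controller/service split described above; AccountService, the URL, and the view name are hypothetical placeholders, written against the annotation-driven Spring MVC style of that period.

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;

    @Controller
    public class AccountController {

        // Hypothetical service interface; the real service classes were project-specific.
        public interface AccountService {
            String findOwner(long id);
        }

        private final AccountService accountService;

        public AccountController(AccountService accountService) {
            this.accountService = accountService;
        }

        @RequestMapping(value = "/accounts/{id}", method = RequestMethod.GET)
        public String viewAccount(@PathVariable("id") long id, Model model) {
            // Delegate business logic to the service layer; expose the result to the view.
            model.addAttribute("owner", accountService.findOwner(id));
            return "accountView";   // logical view name resolved to a JSP
        }
    }

Keeping the controller thin and pushing logic into service classes is the split the Spring MVC bullets above describe.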
Environment: Java, J2EE, JSP, Spring, Hibernate, CSS, JavaScript, Oracle, JBoss, Maven, Eclipse, JUnit, Log4J, AJAX, Web services, JNDI, JMS, HTML, XML, XSD, XML Schema
Java/J2EE Developer
Confidential
Responsibilities:
- Involved in multi-tiered J2EE design utilizing Spring framework and JDBC.
- The system was built using the Model-View-Controller (MVC) architecture.
- Designed the front end using HTML, CSS, JavaScript, JSP, and jQuery.
- Designed and implemented the application using Spring MVC, JDBC, and MySQL.
- Used SVN version control tool.
- Automated the build process by writing Maven build scripts.
- Wrote SQL queries, stored procedures, and modifications to the existing database structure as required for new features, using a MySQL database (a minimal JDBC sketch follows this list).
- Involved in installing and configuring Eclipse for development.
- Configured and customized logs using Log4J and performed unit testing using JUnit.
- Developed JUnit test cases and performed application testing for the QC team.
- Used JavaScript for client side validations.
- Participated in weekly project meetings and updates, and provided estimates for assigned tasks.
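A minimal sketch of calling a MySQL stored procedure over JDBC, in the spirit of the stored-procedure work noted above; the URL, credentials, procedure name, and result columns are all hypothetical.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class OrderReport {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:mysql://localhost:3306/appdb";   // hypothetical database
            try (Connection con = DriverManager.getConnection(url, "appuser", "secret");
                 CallableStatement cs = con.prepareCall("{call get_orders_by_status(?)}")) {
                cs.setString(1, "SHIPPED");                     // hypothetical IN parameter
                try (ResultSet rs = cs.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong("order_id") + " " + rs.getString("customer"));
                    }
                }
            }
        }
    }

Try-with-resources closes the statement and connection even if the query fails, which keeps connection handling simple in plain JDBC code.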
Environment: Java, J2EE, JavaScript, JDBC, Spring, ASP.NET, VB.NET, Agile (Scrum), JSP, Servlet, XML, Design Patterns, Log4J, JUnit, SVN, MySQL, Eclipse.
Java Developer
Confidential
Responsibilities:
- Extensively involved in product development with Core Java.
- Analyzed and documented the feature, incorporating an iterative method.
- Used the IBM Rational Rose tool in designing the SFR.
- Effectively designed and implemented the feature, resulting in minimal bugs.
- Worked extensively on concepts like design patterns, UML, OOAD, and OOP.
- Implemented design patterns such as Singleton and Factory (minimal sketches follow this list).
- Performed unit testing using JUnit as well as overall testing.
- Documented unit test cases and other details in the wiki.
- Authored technical details and FAQs for the features (wiki).
- Optimized code to create a better-performing application.
- Performed extensive peer code-review.
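Minimal sketches of the Singleton and Factory patterns mentioned above; the Config and Shape types are hypothetical placeholders, not the product's actual classes.

    // Singleton: one lazily created, shared instance.
    final class Config {
        private static Config instance;
        private Config() {}
        public static synchronized Config getInstance() {
            if (instance == null) {
                instance = new Config();
            }
            return instance;
        }
    }

    // Factory: callers ask for a product by name instead of calling 'new' directly.
    interface Shape { double area(); }
    class Circle implements Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }
    class Square implements Shape {
        private final double s;
        Square(double s) { this.s = s; }
        public double area() { return s * s; }
    }
    class ShapeFactory {
        static Shape create(String kind, double size) {
            switch (kind) {
                case "circle": return new Circle(size);
                case "square": return new Square(size);
                default: throw new IllegalArgumentException("unknown shape: " + kind);
            }
        }
    }

The factory centralizes construction so callers depend only on the Shape interface, which is the decoupling these patterns were used for.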
Environment: Java, J2EE, Design Patterns, OOP/OOAD (UML), XML, Eclipse IDE, IntelliJ IDEA IDE, Perforce source control, IBM Rational Rose, iterative development.