Lead - Java Developer Resume
Greenville, SC
SUMMARY:
- 8+ years of experience in consulting, analysis, and implementation of Java and Big Data solutions across a range of project assignments.
- Lead contributor to a Big Data Center of Excellence covering emerging technologies such as Big Data, text analytics, NoSQL databases, and related areas.
- Focused on designing and delivering optimal, business-critical solutions with Big Data technologies.
- Keen on building knowledge of emerging technologies in analytics, information management, Big Data, and data science, and on providing sound business solutions.
- Experienced in the individual phases of the project lifecycle, with emphasis on planning, design, and coding.
- Experienced with Hadoop ecosystem technologies and platforms such as Pig, Hive, and Sqoop.
- Consulted on Java and Big Data technologies; evaluated new technologies in the Big Data, analytics, and NoSQL space.
- Extensive experience with Hadoop and its components, including HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase, Cassandra, and Oozie.
- Extensive experience in setting up Hadoop clusters.
- Good working knowledge of MapReduce and Apache Pig.
- Wrote Pig scripts to reduce job execution time.
- Experienced in processing real-time streaming data using Storm and Spark Streaming (see the sketch after this list).
- Experience configuring Hadoop ecosystem components: MapReduce, Hive, HBase, Pig, Sqoop, Oozie, Flume, Storm, Spark, YARN, and Tez.
- Executed projects using Java/J2EE technologies such as Core Java, Servlets, JSP, JDBC, Ext JS, and Struts.
- Experienced with application development frameworks such as Spring and Hibernate, as well as validation plug-ins such as the Validator framework.
- Strong experience with version control tools such as Subversion, ClearCase, and CVS.
- Experienced in developing J2EE applications with IDEs such as Eclipse and NetBeans.
- Expertise in build scripts and build automation with Ant and Maven.
- Strong experience working with databases such as Oracle, SQL Server, and EDB; proficient in writing complex SQL and PL/SQL to create tables, views, indexes, stored procedures, and functions.
- Experience with all stages of the SDLC and the Agile development model, from requirement gathering through deployment and production support.
- Experience in understanding existing systems, maintenance, and production support on technologies such as Java, J2EE, and various databases (Oracle, SQL Server).
- Quick learner with strong time-management skills.
- Excellent communication and interpersonal skills; hardworking; able to communicate effectively at all levels of the organization and to work both independently and as part of a team.
- Knowledge of Flume, NoSQL, the Spark ecosystem (Spark Core, Spark SQL, Spark Streaming, MLlib, GraphX), data warehousing, and BI technologies.
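A minimal sketch of the kind of Spark Streaming job implied by the streaming bullets above, written in Java against the Spark 2.x API; the socket source, host, and port are hypothetical stand-ins for the real feed.

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import scala.Tuple2;

    public class StreamingWordCount {
        public static void main(String[] args) throws Exception {
            SparkConf conf = new SparkConf().setAppName("StreamingWordCount").setMaster("local[2]");
            // Micro-batch interval of 5 seconds
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));
            // Hypothetical text source; a real job might read from Kafka or Flume instead
            JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);
            lines.flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                 .mapToPair(word -> new Tuple2<>(word, 1))
                 .reduceByKey(Integer::sum)
                 .print();                // dump each batch's counts to stdout
            jssc.start();
            jssc.awaitTermination();
        }
    }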
TECHNICAL SKILLS:
Languages: Java, J2EE, SQL
Hadoop Distributions: Apache Hadoop, MapR, Hortonworks, Cloudera
Big Data Technologies: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, HBase, Cassandra, Flume, Oozie, Spark, Kafka, Storm
J2EE Technologies: JSP, Servlets, JDBC
Web and Application Servers: JBoss 8.2, BEA WebLogic, Tomcat
Frameworks: Struts, Hadoop, Ext JS, Spring, Hibernate
Java IDEs: Eclipse, MyEclipse, NetBeans
Tracking Tools and Version Control: Jira and SVN
Databases: Oracle 11g, SQL Server 2008 R2, EDB (PostgreSQL), MySQL
NoSQL: HBase, Cassandra
Operating Systems: Windows XP/7, Linux (Red Hat, CentOS, Ubuntu)
PROFESSIONAL EXPERIENCE:
Confidential, Greenville, SC
Lead - Java Developer
Responsibilities:
- Participated in use-case meetings to understand and analyze requirements; coded as per the prototype.
- Developed various UI (user interface) components using Struts (MVC), JSP, and HTML.
- Developed controllers, created JSPs, and configured them in the struts-config.xml and web.xml files.
- Applied MVC architecture and the Business Delegate, Service Locator, Session Facade, Data Access Object, and Singleton patterns.
- Wrote client-side validations using JavaScript and JSON.
- Involved in the complete development, testing, and maintenance process of the application.
- Used Hibernate 2.0 as the ORM tool to communicate with the database (see the first sketch after this list).
- Designed and created a web-based test client using Struts upon the client's request, used to test the different parts of the application.
- Wrote test cases for the application using JUnit.
- Made extensive use of JSP, HTML, and CSS to build a user-friendly presentation layer.
- Involved in different testing phases: unit, integration, and regression testing.
- Involved in the development process, with working knowledge of tracking tools such as JIRA.
- Developed back-end stored procedures and triggers using Oracle PL/SQL; involved in creating database objects, tuning stored-procedure performance, and analyzing query plans.
- Responsible for developing and maintaining all the session beans.
- Supported the application through debugging, bug fixing, production support, and maintenance releases.
- Worked closely with the client and the offsite team, coordinating activities between them for effective project implementation.
- Built RESTful web services with JSON using the Jackson API (see the second sketch after this list).
- Tested web services (SOAP, RESTful) using the Confidential EAM web service toolkit.
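A minimal sketch of the Hibernate session pattern referenced above, shown with the org.hibernate API of Hibernate 3+ for readability (the Hibernate 2.0 used on this project lived under the older net.sf.hibernate packages); the Employee entity and its assumed Employee.hbm.xml mapping are hypothetical.

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;
    import org.hibernate.cfg.Configuration;

    public class HibernateSketch {
        // Hypothetical entity; its Hibernate mapping is assumed to exist.
        public static class Employee {
            private Long id;
            private String name;
            public Long getId() { return id; }
            public void setId(Long id) { this.id = id; }
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        public static void main(String[] args) {
            // Reads hibernate.cfg.xml from the classpath
            SessionFactory factory = new Configuration().configure().buildSessionFactory();
            Session session = factory.openSession();
            Transaction tx = session.beginTransaction();
            try {
                Employee e = new Employee();
                e.setName("Ada");
                session.save(e);      // the ORM issues the INSERT
                tx.commit();
            } catch (RuntimeException ex) {
                tx.rollback();        // keep the database consistent on failure
                throw ex;
            } finally {
                session.close();
            }
        }
    }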
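A minimal sketch of the Jackson-based JSON binding behind RESTful services like those above, assuming Jackson 2.x (com.fasterxml.jackson; the Jackson 1.x of that era lived under org.codehaus.jackson); the Employee payload type and its fields are hypothetical.

    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonBindingSketch {
        // Hypothetical REST payload; Jackson picks up public fields by default.
        public static class Employee {
            public int id;
            public String name;
            public Employee() { }     // no-arg constructor for deserialization
            public Employee(int id, String name) { this.id = id; this.name = name; }
        }

        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            // Object -> JSON, as when building a REST response body
            String json = mapper.writeValueAsString(new Employee(42, "Ada"));
            System.out.println(json); // {"id":42,"name":"Ada"}
            // JSON -> object, as when parsing a REST request body
            Employee e = mapper.readValue(json, Employee.class);
            System.out.println(e.name);
        }
    }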
Environment: J2SE, JSP, Servlets, Struts, EJB 2.0, Ext JS, XML, Oracle 11g, PostgreSQL, web services, SQL Server 2008 R2, Eclipse, TOAD, JIRA, SVN, TortoiseSVN, Log4j.
Confidential - Web Intelligence - MN
Hadoop Developer
Responsibilities:
- Moved data from Oracle to HDFS and vice versa using Sqoop.
- Collected and aggregated large amounts of log data using Apache Flume and staged the data in HDFS for further analysis.
- Implemented Hive tables and HQL queries for the reports.
- Worked with different file formats and compression techniques to set standards for importing and exporting data into HDFS and Hive using Sqoop.
- Developed Hive queries to analyze and transform the data in HDFS.
- Designed and implemented multi-level partitioning and bucketing in Hive.
- Designed and implemented Pig UDFs for evaluating, filtering, loading, and storing data (see the sketch after this list).
- Analyzed and transformed data with Hive and Pig.
- Created views on Hive tables.
- Coordinated effectively with the offshore team and delivered project milestones on time.
- Worked on QA support activities, test data creation, and unit testing.
- Responsible for creating Hive tables, loading the structured data produced by MapReduce jobs into those tables, and writing Hive queries to further analyze the logs for issues and behavioral patterns.
- Used Hive to analyze data ingested into HBase via the Hive-HBase integration and computed various metrics for reporting on the dashboard.
- Developed job flows to automate the workflow for Pig and Hive jobs.
- Extensively involved in performance tuning of Oracle queries.
- Loaded the aggregated data from the Hadoop environment into Oracle using Sqoop for reporting on the dashboard.
- Wrote UNIX scripts to automate batch functions.
- Used Tableau for visualization of the processed data.
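A minimal sketch of an evaluation-style Pig UDF of the kind described in the bullets above; the class name and the normalization rule are illustrative, not the project's actual code.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical eval UDF: returns the first field trimmed and lower-cased.
    public class NormalizeField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;          // Pig propagates null for missing data
            }
            return ((String) input.get(0)).trim().toLowerCase();
        }
    }

In a Pig script this would be registered and called like a built-in, e.g. REGISTER myudfs.jar; then FOREACH logs GENERATE NormalizeField(url); (jar and field names hypothetical).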
Environment: Hadoop ecosystem (Hortonworks 2.x), Apache Pig, Hive, Sqoop, Oozie, Platfora, HCatalog, Java, UNIX scripts, Oracle, CentOS, MapReduce, HBase, Cassandra, Tableau.
Confidential - AZ
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing (see the sketch after this list).
- Collaborated with subject-matter experts, various stakeholders, and fellow developers to design, develop, implement, and support data analytics.
- Imported and exported data into HDFS and Hive using Sqoop, Spark Core, and Spark SQL.
- Created Hive tables and loaded and analyzed data using Hive queries.
- Exposure to Teradata for processing huge data volumes.
- Worked on debugging and performance tuning of Hive and Pig jobs.
- Designed and implemented Pig UDFs for evaluating, filtering, loading, and storing data.
- Worked on performance tuning of Hadoop jobs by applying techniques such as map-side joins, partitioning, and bucketing, and by using file formats such as SequenceFile, RCFile, and ORCFile.
- Defined job workflows according to their dependencies in Oozie.
- Used Java/J2EE application development skills with object-oriented analysis and was extensively involved throughout the software development lifecycle (SDLC).
- Proactively monitored systems and services; worked on architecture design and implementation of the Hadoop deployment, configuration management, backup, and disaster-recovery systems and procedures.
- Used Flume to collect, aggregate, and store web log data from sources such as web servers, mobile, and network devices, and pushed it to HDFS.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from the UNIX file system to HDFS, configuring Hive, and writing Hive UDFs.
- Processed streamed data using the real-time messaging systems Kafka and Storm.
- Used Java and Oracle day to day to debug and fix issues with client processes.
- Managed and reviewed log files.
- Good working knowledge of Cassandra and HBase.
- Implemented partitioning, dynamic partitions, and buckets in Hive.
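A minimal sketch of a data-cleaning mapper of the kind described above, using the org.apache.hadoop.mapreduce API; the cleaning rule (dropping malformed pipe-delimited records in a map-only job) is a hypothetical example, not the project's actual logic.

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Hypothetical cleaning step: keep only well-formed pipe-delimited records,
    // emitting them with trimmed fields and no key (map-only job).
    public class CleanRecordsMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private static final int EXPECTED_FIELDS = 5;   // assumption: 5-column input
        private final Text out = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);
            if (fields.length != EXPECTED_FIELDS) {
                context.getCounter("clean", "dropped").increment(1);  // track bad rows
                return;
            }
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < fields.length; i++) {
                if (i > 0) sb.append('|');
                sb.append(fields[i].trim());
            }
            out.set(sb.toString());
            context.write(NullWritable.get(), out);
        }
    }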
Environment: Hadoop, JDK 1.6, MapReduce, HDFS, Hive, Storm, Kafka, Cassandra, Pig, Spark Core, Spark SQL, Sqoop, Flume, HTML, XML, SQL, J2EE, Eclipse, RCFile, ORCFile, Thrift, Oozie, HBase.
Confidential
Java Developer
Responsibilities:
- Prepared the functional and UI design.
- Implementation at BIO level
- Creation of Record sets and BIOs for the database schema
- Created Relationships for data Integrity
- Created Lookups and attribute domains
- Implementation at UI level
- Menus for Navigation
- Forms for various Perspectives
- Implemented shells such as List Shell, Detail Shell, Tab Group Shell, and Toggle Shell to provide a better look and feel.
Environment: CRB Studio, WebLogic Server 8.1, LDAP, Core Java, SQL Server
Confidential
Java Developer
Responsibilities:
- Participated in use-case meetings to understand and analyze requirements; coded as per the prototype.
- Involved in product development and customization using Confidential CRB Studio: design and development of BIOs, record sets, data sources, forms, shells, and navigation.
- Involved in coding and bug fixing.
- Used the Validator plug-in of the Struts framework to handle server-side validations.
- Used SVN for the version control process.
- Logged application errors using the Log4j API (see the sketch after this list).
- Created tables by running scripts in the SQL Developer IDE.
- Involved in build and deployment activities and Oracle/SQL Server database schema backup and restore.
- Involved in unit, integration, and regression test plans.
- Involved in the development process, with working knowledge of tracking tools such as JIRA.
- Good knowledge of the Epiphany platform (open architecture).
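A minimal sketch of the Log4j error-logging pattern described above, using the Log4j 1.x API (org.apache.log4j); the class and the failing operation are illustrative.

    import org.apache.log4j.Logger;

    public class OrderService {       // hypothetical application class
        private static final Logger LOG = Logger.getLogger(OrderService.class);

        public void placeOrder(String orderId) {
            try {
                // ... business logic would go here ...
                throw new IllegalStateException("demo failure");
            } catch (Exception e) {
                // Error-level entry with stack trace, routed to the appenders
                // configured in log4j.properties (console, rolling file, etc.)
                LOG.error("Failed to place order " + orderId, e);
            }
        }
    }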
Environment: J2SE, JSP, Servlets, Struts, Spring, Hibernate, JavaBeans, XML