
Core Java Developer Resume

Phoenix, AZ

SUMMARY:

  • 8+ years of comprehensive IT experience in Big Data and Big Data analytics across the Banking, Insurance, and Energy domains.
  • Substantial experience writing MapReduce jobs in Java and Pig.
  • Experience working with Java, C++, and C.
  • Hands-on experience installing, configuring, and using ecosystem components such as Hadoop MapReduce, HDFS, HBase, ZooKeeper, Oozie, Hive, Cassandra, Sqoop, Pig, and Flume.
  • Extensive experience in SQL and NoSQL development.
  • In-depth understanding of Data Structure and Algorithms.
  • Experience deploying applications on heterogeneous application servers: Tomcat, WebLogic, and Oracle Application Server.
  • Worked in multi-clustered environments and set up the Cloudera Hadoop ecosystem.
  • Background with traditional databases such as Oracle, Teradata, SQL Server, ETL tools / processes and data warehousing architectures.
  • Extensive experience in designing analytical/OLAP and transactional/OLTP databases.
  • Proficient using ERwin to design backend data models and entity relationship diagrams (ERDs) for star schemas, snowflake dimensions and fact tables.
  • Ability to perform at a high level, meet deadlines, and adapt to ever-changing priorities.
TECHNICAL SKILLS:
Big Data Eco System
Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Oozie, Flume.

Programming Languages
Java SE/J2EE (JDK 1.5/1.6), C, C++, Core Java

Testing/Logging Tools
JUnit, EasyMock, JMock, Log4j

Database
Oracle 9i/10g/11g, DB2, MySQL, SQL Server, Teradata

Application Server
Apache Tomcat, JBoss, WebSphere, WebLogic

Tools
ANT, Maven, TOAD

Operating System
Windows XP/Vista/7, UNIX, Linux

PROFESSIONAL EXPERIENCE:
Confidential, Phoenix, AZ
Core Java Developer
The Confidential Company, also known as Confidential, is an American multinational financial services corporation headquartered in Confidential, United States. The company is best known for its credit card, charge card, and traveler’s cheque businesses. Amex cards account for approximately 24% of the total dollar volume of credit card transactions in the US.

Responsibilities:
  • Involved in running Hadoop jobs for processing data coming from different sources.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Preprocessed logs and semi-structured content stored on HDFS using Pig, and imported the processed data into the Hive warehouse, enabling business analysts to write Hive queries.
  • Involved in transforming data from legacy tables into HDFS and HBase tables using Sqoop.
  • Involved in moving log files generated from various sources to HDFS for further processing using Flume.
  • Used the Oozie scheduler to automate pipeline workflows and orchestrate the MapReduce jobs that extract the data on a schedule.
  • Good understanding of ETL tools and how they can be applied in a Big Data environment.
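The Hive and Pig jobs described above ultimately compile down to the MapReduce pattern: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. As an illustrative sketch only (not code from this engagement, and with hypothetical class and method names), the pattern can be simulated in plain Java without a Hadoop cluster:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal in-memory simulation of the MapReduce word-count pattern.
// Illustrative sketch only: a real job would implement
// org.apache.hadoop.mapreduce.Mapper and Reducer and run on a cluster.
public class WordCountSketch {

    // "Map" phase: emit a (word, 1) pair for each token in a line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.add(Map.entry(token, 1));
            }
        }
        return pairs;
    }

    // "Shuffle + reduce" phase: group pairs by key and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new HashMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : new String[] {"error warn error", "warn info"}) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(pairs)); // e.g. {error=2, warn=2, info=1}
    }
}
```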
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, HBase, Oozie, Cloudera, Oracle 10g, DB2, SQL*Plus, Toad, PuTTY, Windows NT, UNIX Shell Scripting

Confidential, Tulsa, OK
Hadoop Developer
Confidential Company is an American multinational energy corporation with its headquarters located in the Energy Corridor district of Houston, Texas, in the United States. It is the world's largest independent pure-play exploration and production company.
Responsibilities:
  • Responsible for complete SDLC management using different methodologies such as Agile, Incremental, and Waterfall.
  • Involved in ETL, Data Integration and Migration.
  • Performed ETL using Pig, Hive and MapReduce to transform transactional data to de-normalized form.
  • Extracted the data from Teradata into HDFS using Sqoop.
  • Exported the patterns analyzed back into Teradata using Sqoop.
  • Developed MapReduce jobs in Java for data cleaning and preprocessing.
  • Imported and exported data between HDFS/Hive and external stores using Sqoop.
  • Managed and reviewed Hadoop log files.
  • Ran Hadoop streaming jobs to process terabytes of XML-format data.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
  • Extensively used Pig for data cleansing.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Implemented business logic by writing Pig and Hive UDFs for aggregation operations.
  • Scheduled the Oozie workflow engine to run multiple Hive and Pig jobs, which run independently based on time and data availability.
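A Hive UDF of the kind mentioned above is, at its core, a Java class exposing an `evaluate` method; a real UDF extends `org.apache.hadoop.hive.ql.exec.UDF` and is registered in HiveQL with `CREATE TEMPORARY FUNCTION`. The hypothetical sketch below shows only the `evaluate` logic (here, null-safe normalization of a string column) and omits the Hive dependency so it compiles standalone:

```java
// Sketch of the evaluate() logic of a simple Hive UDF.
// A real UDF would extend org.apache.hadoop.hive.ql.exec.UDF and work with
// org.apache.hadoop.io.Text; the cleaning rule itself is a hypothetical
// stand-in for the project's actual business logic.
public class CleanCodeUdf {

    // Null-safe normalization: trim whitespace, upper-case, map empty to null.
    public String evaluate(String raw) {
        if (raw == null) {
            return null;
        }
        String cleaned = raw.trim().toUpperCase();
        return cleaned.isEmpty() ? null : cleaned;
    }

    public static void main(String[] args) {
        CleanCodeUdf udf = new CleanCodeUdf();
        System.out.println(udf.evaluate("  ok-4711 ")); // OK-4711
    }
}
```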
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Oozie, Java (JDK 1.6), Cloudera Hadoop distribution, Teradata, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting.
Confidential, Richmond, VA
Hadoop Developer
The Confidential is a U.S.-based bank holding company specializing in credit cards, home loans, auto loans, banking, and savings products. The company helped mass-market credit cards; it is now the fourth-largest customer of the United States Postal Service and has the sixth-largest deposit portfolio in the United States.

Responsibilities:
  • Worked as a Programmer Analyst on Hadoop MapReduce, covering business requirement gathering, analysis, scoping, documentation, design, development, and test-case creation.
  • Designed and implemented MapReduce jobs to support distributed data processing.
  • Preprocessed logs and semi-structured content stored on HDFS using Pig, and imported the processed data into the Hive warehouse, enabling business analysts to write Hive queries.
  • Developed Control-M workflows to automate loading data into HDFS and preprocessing it with Pig.
  • Uploaded and processed data from various structured and unstructured sources into HDFS using Sqoop.
  • Designed and managed the Sqoop jobs that loaded data from Oracle into HDFS and Hive.
  • Exported data from HDFS to RDBMS via Sqoop for Business Intelligence, visualization and user report generation.
  • Worked with business teams and created Hive queries for ad hoc access.
  • Implemented Partitioning, Dynamic Partitions and Buckets in HIVE for efficient data access.
  • Worked on debugging, performance tuning of Hive & Pig Jobs.
  • Analyzed business requirements and cross-verified them against the functionality and features of NoSQL databases such as HBase to determine the optimal database.
  • Used Flume to collect and aggregate weblog data from different sources and pushed to HDFS.
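The bucketing mentioned above works by assigning each row to a bucket via `hash(key) mod numBuckets`, so equal keys always land in the same file, which is what makes bucket-map joins and sampling efficient. The sketch below illustrates the idea in plain Java; Hive's actual hash function depends on the column type, so `String.hashCode()` here is purely an illustrative stand-in:

```java
// Sketch of how bucketed Hive tables assign rows: bucket = hash(key) mod n.
// Hive's real hash differs by column type; String.hashCode() is used here
// only to illustrate the mechanism.
public class BucketSketch {

    static int bucketFor(String key, int numBuckets) {
        // Mask the sign bit so the modulo result is always non-negative.
        return (key.hashCode() & Integer.MAX_VALUE) % numBuckets;
    }

    public static void main(String[] args) {
        // The same key always maps to the same bucket.
        System.out.println(bucketFor("cust-1001", 8) == bucketFor("cust-1001", 8)); // true
    }
}
```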
Environment: Hortonworks Hadoop distribution, Eclipse IDE, Linux, MapReduce, Pig Latin, Sqoop, Java, Hive, HBase, UNIX Shell Scripting.
Confidential, Cleveland, OH
Core Java Developer
The Confidential is an American company in the general building materials industry. The company primarily engages in the manufacture, distribution, and sale of paints and coatings, and is best known for its Sherwin-Williams Paints line.
Responsibilities:
  • Involved in various phases of software development, such as modeling, system analysis and design, code generation, and testing, using the Agile methodology.
  • Participated in daily stand-up meetings with the Scrum Master.
  • Designed, developed, and deployed the application using Eclipse and the Tomcat application server.
  • Designed classes using object-oriented design (OOD) concepts such as encapsulation and inheritance.
  • Involved in unit integration, bug fixing, acceptance testing with test cases, code reviews.
  • Developed SQL queries against the Oracle database and used JDBC to interact with it. Used ClearQuest as the bug-tracking system and Log4j for error logging.
  • Wrote test cases for unit-level testing using JUnit.
  • Made extensive use of the ANT build process for delivery of the end product.
  • Used JIRA for bug tracking, issue tracking and project management.
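The unit-testing pattern behind the JUnit work above pairs each piece of business logic with assertions on its expected behavior, including its failure modes. In the project these checks were JUnit `@Test` methods (`assertEquals`, expected exceptions); the sketch below writes them as plain Java so it compiles without the JUnit jar, and the discount rule is a hypothetical stand-in for the real logic:

```java
// Plain-Java sketch of a JUnit-style unit test. In the actual project these
// checks were @Test methods run by JUnit; the rule under test is hypothetical.
public class DiscountCalculatorTest {

    // Hypothetical unit under test.
    static double applyDiscount(double amount, int percent) {
        if (percent < 0 || percent > 100) {
            throw new IllegalArgumentException("percent out of range: " + percent);
        }
        return amount * (100 - percent) / 100.0;
    }

    public static void main(String[] args) {
        // Equivalent of assertEquals(90.0, applyDiscount(100.0, 10), 1e-9).
        if (Math.abs(applyDiscount(100.0, 10) - 90.0) > 1e-9) {
            throw new AssertionError("10% off 100 should be 90");
        }
        // Equivalent of JUnit's expected-exception check.
        try {
            applyDiscount(100.0, 150);
            throw new AssertionError("expected IllegalArgumentException");
        } catch (IllegalArgumentException expected) {
            // test passes
        }
        System.out.println("all checks passed");
    }
}
```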
Environment: Java 1.6, Agile methodology, Scrum, SOAP web services, Log4j, JUnit, UNIX Shell Scripting, SVN, Oracle 9i, Spring Framework, Hibernate, Eclipse 3.2, Maven, JIRA, Tomcat 5.5.

Kimberly Clark Corp, Neenah, WI Dec’07 – Jun’09
Core Java Developer
Confidential Corporation is an American personal care corporation that produces mostly paper-based consumer products. Confidential brand-name products include Kleenex, Kotex, Huggies, Scott, and KimWipes.
Responsibilities:
  • Responsible and active in the analysis, definition, design, implementation and deployment of full software development life-cycle (SDLC) of the project.
  • Involved in Analysis, Design and Implementation/translation of Business User requirements.
  • Actively participated in the daily SCRUM meetings to produce quality deliverables within time.
  • Automated the build process by writing ANT build scripts.
  • Configured and customized logs using Log4J.
  • Involved in installing and configuring Eclipse and Maven for development.
  • Used Log4j to validate functionality and JUnit for unit testing.
  • Used SVN as a version management tool.
  • Created SQL views, queries, functions and triggers to be used to fetch data for the system.
  • Involved in the Design Document, Coding and Debugging.
  • Developed Java classes that provide JDBC connectivity between the application and the Oracle database.
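JDBC connectivity of the kind described above follows a standard pattern: obtain a `Connection` from `DriverManager`, bind parameters through a `PreparedStatement`, and read results from a `ResultSet`. The sketch below is a hypothetical illustration (table, column, and DAO names are invented, and a real run needs the Oracle JDBC driver on the classpath); the SQL is built in a separate method so it stays testable without a database:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch of the JDBC access pattern described above. The table, column, and
// connection details are hypothetical placeholders.
public class ProductDao {

    // Building the parameterized SQL separately keeps it testable without a DB.
    static String findByIdSql(String table) {
        return "SELECT name FROM " + table + " WHERE id = ?";
    }

    static String findName(String jdbcUrl, String user, String password, long id)
            throws SQLException {
        // try-with-resources closes the connection, statement, and result set.
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             PreparedStatement stmt = conn.prepareStatement(findByIdSql("products"))) {
            stmt.setLong(1, id);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString("name") : null;
            }
        }
    }
}
```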
Environment: Java, XML, MySQL, JSP, JavaScript, Servlets, JDBC, PL/SQL, Log4j, JUnit, SVN, ANT, Microsoft Visio, CSS, SSO, UNIX, Tomcat Server 5.0, JBuilder

Confidential, Nagpur, India
Oracle Developer
Confidential specializes in designing and building business intelligence and data warehousing systems that provide exceptional value. Its development model provides clients with a faster, better, and cheaper alternative to traditional development methodologies.
Responsibilities:
  • Worked with Data Modeling both Physical and Logical Design.
  • Developed operational plan for ETL, Data Loading and data cleaning process and wrote scripts for automation using shell scripting.
  • Backend Programming using Oracle 10g/9i.
  • Wrote stored procedures, functions, and other PL/SQL blocks to automate processing and loading data into tables.
  • Wrote UNIX shell scripts to monitor Oracle instance performance, tablespaces, users, and objects, and to send pager alerts automatically.
  • Involved in designing database, modeling database and maintaining database.
  • Created schema objects through the ERwin tool and transferred data from non-Oracle platforms to the Oracle database using SQL*Loader.
  • Involved in Monitoring, Tuning, auditing users, assigning roles and privileges, backup and recovery.
  • Involved in designing partitions, tablespaces, and table partitioning for data warehousing purposes.
  • Developed shell scripts for the execution of different module procedures for batch processing.
Environment: Oracle 9i/10g, ERwin, SQL*Loader, TOAD, MS Visio, UNIX Shell Scripting, Sun Solaris/HP-UX/NT
