Hadoop/Java Developer Resume
Newark, NJ
SUMMARY
- 7+ years of experience in the IT industry, including Java application development and 2 years of experience in Hadoop.
- A former Java programmer with newly acquired skills, an insatiable intellectual curiosity, and the ability to mine hidden gems located within large sets of structured, semi-structured and unstructured data.
- Hands on experience in Hadoop ecosystem components like MapReduce, HDFS, Sqoop, Pig, Hive and Oozie.
- Extensive experience working on Spark and Shark.
- Worked on Spark Streaming with Flume online streaming.
- Good experience in media analytics.
- Expert in working with the Hive data warehouse tool: creating tables, distributing data by implementing partitioning and bucketing, and writing and optimizing HiveQL queries.
- Experience in using Apache Sqoop to import and export data to and from HDFS and Hive.
- Hands on experience in setting up workflow using Apache Oozie workflow engine for managing and scheduling Hadoop jobs.
- Good knowledge of NoSQL databases: Cassandra and HBase.
- Experience in using HCatalog with Hive, Pig and HBase.
- Worked on Pentaho Data Integration (PDI) Kettle for Extraction, Transformation, and Loading (ETL) and on Tableau visualization.
- Experience in working with the BI team to transform Big Data requirements into Hadoop-centric technologies.
- Experience in Hadoop MapReduce programming, Pig scripting, HiveQL and HDFS.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems.
- Experience with the Oozie workflow engine to run multiple Hive and Pig jobs independently based on time and data availability.
- Experience in application development using Java, J2EE, EJB, Hibernate, JDBC, Jakarta Struts, JSP and Servlets.
- Proficient in database development: Oracle, MySQL.
- Good understanding of key design concepts, design patterns, and UML.
- Experience in all phases of systems development.
- Strong technical and interpersonal skills combined with great commitment towards meeting deadlines.
- Experience working in both team and individual environments. Always eager to learn new technologies and implement them in challenging environments.
- Excellent written and verbal communication skills.
- Strong analytical and problem solving skills.
TECHNICAL SKILLS
Hadoop: HDFS, Spark, MapReduce, Sqoop, Pig, Hive and Oozie.
Languages: Java, J2EE (JSP, Servlets, EJB), JavaScript, C, C++, HTML, DHTML, XML, CSS and UML.
DBMS: Oracle 9i, MySQL.
Frameworks: Jakarta Struts, Hibernate, Spring.
Web/Application Servers: Tomcat, BEA WebLogic, JBoss, Webscripting.
Version Control: CVS, SVN.
Reporting Tools: JIRA
IDEs: NetBeans 6.0, Eclipse 3.3.x, Exadel Studio and Cloudera 4.x.
Operating Systems: UNIX, Windows
PROFESSIONAL EXPERIENCE
Confidential, Newark, NJ
Hadoop/Java Developer
Responsibilities:
- Involved in capacity planning for the Big Data platform.
- Developed MapReduce programs in Java for data cleaning and preprocessing, and was responsible for collecting the data required for testing various MapReduce applications from different sources.
- Created MapReduce jobs for data transformation and data parsing.
- Developed MapReduce programs to extract and transform the data sets; results were exported back to the RDBMS using Sqoop.
- Designed workflows by scheduling Hive processes for log file data streamed into HDFS using Flume.
- Created Hive scripts for extracting summarized information from Hive tables.
- Wrote Hive UDFs to extract data from staging tables.
- Developed SQL scripts to compare all records for every field and table at each phase of the data movement process from the original source system to the final target.
- Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
- Experience in setting up monitoring and alerting systems using open source tools like Ganglia and Nagios; wrote custom Nagios scripts as per the infrastructure.
- Performed volume testing to calculate the cluster's throughput.
- Managed Hadoop clusters: setup, installation, monitoring and maintenance.
- Helped the team increase the cluster from 22 nodes to 80 nodes.
- Maintained system integrity of all sub-components (primarily HDFS, MRUnit).
- Monitored system health and logs and responded accordingly to any warning or failure conditions.
- Provided ad-hoc queries and data metrics to the business users using Hive and Pig.
- Coding and peer review of assigned tasks.
- Created Mule ESB artifacts, configured the Mule config files and deployed them.
- Involved in creating HTTP inbound and outbound flows, custom Java and XSLT transformers, and securing Mule endpoints through WSSR.
- Code coverage and JUnit test case preparation.
- Unit testing and Volume Testing.
- UATbug fixing.
- Coordinated with onsite counterparts and interacted with the client.
- Debugging and troubleshooting issues in development and test environments.
- Conducted root cause analysis and resolved production problems and data issues.
- Proactively involved in ongoing maintenance, support and improvements in the Hadoop cluster.
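The data-cleaning MapReduce work above can be illustrated with a simplified, single-JVM sketch; the class name, field layout and record format here are hypothetical, not taken from the project:

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch of the kind of record-cleaning logic a MapReduce
// mapper applies before downstream processing; the assumed record format
// is "timestamp<TAB>user<TAB>event" with a numeric timestamp.
public class LogCleaner {

    // Keeps only well-formed records: three tab-separated fields,
    // the first of which must be a numeric timestamp.
    public static List<String> clean(List<String> rawLines) {
        return rawLines.stream()
                .map(String::trim)
                .filter(line -> !line.isEmpty())
                .filter(line -> {
                    String[] f = line.split("\t");
                    return f.length == 3 && f[0].matches("\\d+");
                })
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList(
                "1633036800\tuser42\tLOGIN",
                "bad-record",
                "1633036900\tuser7\tLOGOUT");
        System.out.println(clean(raw).size()); // 2 well-formed records survive
    }
}
```

In an actual Hadoop job this filter would live inside a `Mapper.map()` call, with the surviving records emitted as key/value pairs.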
Environment: Java, JUnit, Eclipse Indigo, Hive, Sonar, Crucible, EclEmma, Mac OS X Lion, Cloudera Hadoop Distribution (CDH4), Pig Latin, Cloudera Manager, Puppet, Ganglia and Nagios.
Confidential, CA
Hadoop Developer
Responsibilities:
- Ingested terabytes of data per day for analysis and data mining.
- Experience in the collection and analysis of real-time call data records.
- Performed big data analysis using Pig, Hive and UDFs.
- Generated numerous reports about the connections made by users.
- Performed joins, group-bys and other operations in MapReduce using Java or Pig Latin.
- Created MapReduce jobs to parse the logs stored in HDFS using Pig Latin and Hive queries.
- Processed the output from Pig and Hive and formatted it before sending it to the Hadoop output file.
- Experience in setting up Virtual Machines and managing storage devices.
- Involved in managing and reviewing Hadoop log files.
- Performed analytics on the weblogs generated by the web server and visualized the analysis performed.
- This report helps to identify patterns of usage and user interest.
- Implemented a daily workflow for extraction, processing and analysis of data with Oozie.
- Responsible for troubleshooting MapReduce jobs by reviewing the log files.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Used Sqoop tool to load data from RDBMS into HDFS.
- Created Reports and Dashboards using structured and unstructured data.
- Created indexes and tuned SQL queries in Hive.
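The group-by operations described above come down to the classic map/reduce aggregation pattern: emit a key per record, then sum the counts per key. A plain-Java, single-JVM sketch (names and data are illustrative, not project code):

```java
import java.util.*;

// Illustrative sketch of the group-by/count aggregation the MapReduce
// and Pig jobs performed on parsed call/log records.
public class GroupByCount {

    // Emulates map (emit key with count 1) + reduce (sum per key) in one JVM.
    public static Map<String, Integer> countByKey(List<String> keys) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String k : keys) {
            counts.merge(k, 1, Integer::sum); // reduce step: sum the 1s per key
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> users = Arrays.asList("alice", "bob", "alice", "alice");
        System.out.println(countByKey(users)); // {alice=3, bob=1}
    }
}
```

The equivalent Pig Latin is a `GROUP ... BY` followed by `COUNT`; in Hadoop the per-key summation runs in the reducer after the shuffle groups identical keys together.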
Environment: CDH, Core Java, MapReduce, Hive, Sqoop, Unix, shell scripting, Oozie, Pig, Hue, Vertica, HBase, Tableau.
Confidential, CA
Java Application Developer
Responsibilities:
- Working as a Java developer / front-end developer to create a framework that can be used in all the Confidential Portal applications.
- Involved in creation of ECI Portal.
- Provided RESTful services for use in the ECI Wholesale portal, which provides facilities for Confidential wholesale customers to log in, manage and order network services, and report and track network issues.
- Employed Jasper Reports for reporting and Spring 3.1 Ehcache for caching; used Spring Integration and JAX-WS, JAX-RPC to integrate with external systems.
- Involved in the Confidential One Portal Self-Serve application upgrade.
- Involved in the redesign of the Confidential Self-Serve application; this application caters to all consumer users of Confidential and allows them to manage their connections.
- Involved in the architecture change of the existing application, which would allow the publishers within Confidential to have more control of the application by creating individual modules for all of its functionalities.
- Involved in the design and development/enhancement of the application to allow better management with reduced development cost and increased stability and quality, using a Test Driven Development approach.
- Worked on Wholesale Order Management Systems using Agile methodologies.
- Supported the custom workflow and enhancements to the existing workflows.
- Developed in Java Spring MVC, with a JSP and JavaScript frontend and SOAP XML or JAX-WS service calls to backend systems.
- Responsible for development, unit testing, deployment to the system and integration test servers for the testing team, and eventually production support for each piece of work.
- One delivered milestone allows Confidential customers to view their bills online and eliminates the need for paper bills.
Environment: Java/J2EE, Spring, JAX-WS, Axis, XML, JSP, JavaScript, JSON, Eclipse, JUnit, TestNG, Ant, Maven, Jira, Quality Centre, Agile, Waterfall.
Confidential, Scottsdale, AZ
Java Developer
Responsibilities:
- Involved in creation of storefront using Confidential platform.
- The storefront has a catalog of products, order placement and order tracking functionalities similar to e-commerce sites.
- Developed code to integrate the Sun Microsystems platform with the Confidential storefront platform through web services.
- Involved in the enhancement of the Order Management System, a set of Java/JMS-based processes that handled payment, shipment and product returns in a multi-client shared system.
- Key player in developing feature changes to the OMS.
- Developed an order management web interface for use by Confidential clients and staff using Struts and JSP.
- Created numerous custom enhancements to the OMS to support the needs of Confidential clients.
- Added unit testing to the product development process and created an automated build system for all new products.
- Involved in fixing bugs in the Vcommerce Ontrack tool.
- Involved in SVN code checkout process.
- Wrote and validated technical test cases for Sun Microsystems and for other storefronts of Confidential (Samgoody, Suncoast, Target, etc.).
- Wrote the Business User Guide for the Samgoody storefront.
Environment: Java, JSP, HTML, XML, Servlets, Struts Framework, Tomcat 5.5, Apache 2.2 and PostgreSQL.
Confidential
Java Developer
Responsibilities:
- Involved in all development phases of the SDLC, including gathering requirements and documenting the requirements as use case documents.
- Designed, deployed and tested a multi-tier application using Java technologies.
- Involved in front-end development using JSP, HTML & CSS.
- Implemented the application using the Spring MVC framework.
- Deployed the application on Oracle WebLogic Server.
- Implemented multithreading concepts in Java classes to avoid deadlocks.
- Used a MySQL database to store data and executed SQL queries on the backend.
- Prepared and maintained the test environment. Tested the application before going live to production.
- Documented and communicated test results to the team lead on a daily basis.
- Involved in weekly meetings with team leads and the manager to discuss issues and the status of the projects.
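Avoiding deadlocks in multithreaded Java code, as mentioned above, is commonly done with a consistent lock-ordering discipline. A minimal hypothetical sketch (the `Account`/`transfer` names are illustrative, not from the project):

```java
// Hypothetical sketch of deadlock avoidance via global lock ordering:
// every transfer acquires the two account locks in the same order (by id),
// so two opposing transfers can never each hold one lock and wait forever.
public class Account {
    private final int id;   // used to impose a global lock order
    private long balance;

    public Account(int id, long balance) { this.id = id; this.balance = balance; }

    public long getBalance() { return balance; }

    public static void transfer(Account from, Account to, long amount) {
        // Always lock the lower-id account first, regardless of direction.
        Account first  = from.id < to.id ? from : to;
        Account second = from.id < to.id ? to : from;
        synchronized (first) {
            synchronized (second) {
                from.balance -= amount;
                to.balance += amount;
            }
        }
    }

    public static void main(String[] args) {
        Account a = new Account(1, 100), b = new Account(2, 100);
        transfer(a, b, 30);
        System.out.println(a.getBalance() + " " + b.getBalance()); // 70 130
    }
}
```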
Environment: J2EE (Java, JSP, JDBC, multithreading), HTML, Oracle WebLogic Server, Eclipse, MySQL, JUnit.