Hadoop Developer Resume
Camarillo, CA
SUMMARY:
- 7+ years of programming experience with skills in analysis, design, development, and deployment for large-scale distributed data processing using Hadoop, Pig, Java, and various other software applications, with an emphasis on object-oriented programming.
- Excellent experience with Hadoop architecture and its components, such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and the MapReduce programming paradigm.
- Experience in Big Data and Big Data analytics using the Hadoop ecosystem.
- Hands-on experience in installing, configuring, and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, HBase, Hive, Sqoop, Pig, Zookeeper, and Flume.
- Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
- Expertise in Hadoop Big Data technologies: Hadoop Distributed File System (HDFS), MapReduce, Pig, Hive, HBase, Zookeeper, Sqoop.
- Experience in managing and reviewing Hadoop log files.
- In-depth understanding of Data Structures and Algorithms.
- Experienced in Cloudera Hadoop (CDH3, CDH4), MapR Hadoop (M5)
- Excellent understanding and knowledge of NOSQL databases like MongoDB, HBase, and Cassandra.
- Involved in setting up standards and processes for Hadoop-based application design and implementation.
- Experience in importing and exporting data using Sqoop between HDFS and relational database systems.
- Experience in Object Oriented Analysis, Design (OOAD) and development of software using UML Methodology, good knowledge of J2EE design patterns and Core Java design patterns.
- Experience in managing Hadoop clusters using Cloudera Manager Tool.
- Good exposure to GoodData, Google Analytics, MicroStrategy, Tableau, PhoneGap, Titanium, the Linux API, MPEG-2, MPEG-4, H.264, SIP, RTP.
- Experienced in writing ANT and Maven scripts to build and deploy Java applications.
- Developed various subsystems based on HBase, Python, Hive, Pig, Elastic Search, MongoDB and Java.
- Designed and implemented Big Data, Business Intelligence, and Analytics solutions in complex environments.
- Experience with development tools like Eclipse, RAD, STS, and Groovy/Grails STS.
- Extensive development experience in IDEs such as Eclipse, NetBeans, and EditPlus.
- Excellent knowledge of Java, J2EE, Web services, Rational Rose, Struts, Servlets, JSP, BEA WAS, XML, XSL, and XSD.
- Developed front-end using AWT, Flex, Swing, JSF, and JSP with Custom Tag libraries, JSTL, Struts Tag libraries, GWT, Adobe Flex, MXML, HTML, and CSS.
- Experience in MVC (Model View Controller) architecture, using Struts, AJAX and Spring Framework with various Java/J2EE design patterns.
- Conversant with web/application servers - Tomcat, WebSphere, WebLogic, and JBoss.
- Experience in Java, JSP, Servlets, EJB, WebLogic, WebSphere, Hibernate, Spring, JBoss, JDBC, RMI, JavaScript, Ajax, jQuery, XML, and HTML.
- In depth knowledge of databases like MS SQL Server, Oracle 9i/10g, Sybase, MySQL and extensive experience in writing SQL queries, Stored Procedures, Triggers, Cursors, Functions and Packages
TECHNICAL SKILLS:
Big Data: Hadoop and MapReduce, Hive, Pig, HBase, Sqoop, Flume, Zookeeper, Oozie, HCatalog, cluster builds, Impala, HUE, Datameer, Cassandra, R Analytics, Cloudera CDH3 & CDH4, AWS, Ganglia, Nagios, HDFS
Languages: Java (JDBC, Swing, Multithreading), J2EE(JSP, Servlets, JSF), Apex (Force.com)
Web: Spring 2.5, Struts 2, JSP, Servlets, HTML, HTML5, CSS, JavaScript, AJAX, XML, Bootstrap, Web Services (SOAP, REST), Java, Java Beans, J2EE (JSP, Servlets, EJB), JDBC, MVC, JSTL, DOM, ASP.NET, XSLT, jQuery
Databases: MySQL, MS SQL Server, MS Access, Oracle 10g/11g, Teradata, IBM DB2, Hadoop ecosystem (T-SQL, PL/SQL)
Tools: Macromedia Dreamweaver, Eclipse IDE, NetBeans IDE, RAD IDE, JUnit, SVN, Visual Paradigm, MS Office, IntelliJ, FrontPage, Toad, FTP clients, Visual Studio, Web Services (SOAP, REST)
Framework: DWR, GWT, Dojo, Struts (1.3 & 2), Spring, Hibernate, AngularJS
Server Tools: Apache Tomcat, GlassFish, JBoss, WebSphere, WebLogic, Apache HTTP Server, FTP
Platforms: Windows, Linux
ORM Frameworks: Hibernate 2.0/2.1/3.0, iBatis, JPA
Design Patterns: MVC, Singleton, Factory, Façade
SDLC Methodologies: Agile, RUP, iterative Waterfall
PROFESSIONAL EXPERIENCE:
Confidential, Camarillo, CA
Hadoop Developer
Responsibilities:
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Installed/Configured/Maintained Apache Hadoop clusters for application development and Hadoop tools like Hive, Pig, HBase, Zookeeper and Sqoop.
- Responsible for Cluster maintenance, commissioning and decommissioning Data nodes, Cluster Monitoring, Troubleshooting, Manage and review data backups, Manage & review Hadoop log files.
- Involved in running Hadoop jobs for processing millions of records of text data
- Installation of various Hadoop Ecosystems and Hadoop Daemons.
- Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
- Involved in loading data from LINUX file system to HDFS
- Extracted files from CouchDB through Sqoop, placed them in HDFS, and processed them.
- Deployed Hadoop Cluster in Fully Distributed and Pseudo-distributed modes.
- Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Developed simple to complex MapReduce jobs using Hive and Pig.
- Installed and configured Flume, Hive, Pig, Sqoop, and Oozie on the Hadoop cluster.
- Configured various property files like core-site.xml, hdfs-site.xml, mapred-site.xml based upon the job requirement
- Involved in configuring the connection between Hive tables and reporting tools like Tableau, Excel and Business Objects
- Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase NoSQL database, and Sqoop.
- Responsible for developing a data pipeline using HDInsight, Flume, Sqoop, and Pig to extract data from weblogs and store it in HDFS.
- Exported data from DB2 to HDFS using Sqoop and NFS mount approach
- Worked on importing and exporting data from Oracle and DB2 into HDFS and HIVE using Sqoop.
- Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs.
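The streaming and data-cleaning work above can be sketched as a minimal Hadoop Streaming mapper in Python. This is a hypothetical illustration, not the actual job: the tab-delimited layout and field names (user id, event, timestamp) are assumptions.

```python
# Minimal Hadoop Streaming mapper sketch: reads raw records, drops
# malformed rows, and emits cleaned tab-separated key/value pairs.
# In a real job this would run over sys.stdin via:
#   hadoop jar hadoop-streaming.jar -mapper clean_mapper.py ...
import io
import sys

def clean_record(line):
    """Return (key, value) for a well-formed record, or None to drop it."""
    fields = line.rstrip("\n").split("\t")
    if len(fields) < 3 or not fields[0]:   # malformed or keyless row: skip
        return None
    user_id, event, ts = fields[0], fields[1].strip().lower(), fields[2]
    return user_id, "%s\t%s" % (event, ts)

def run(stdin=sys.stdin, stdout=sys.stdout):
    for line in stdin:
        cleaned = clean_record(line)
        if cleaned is not None:
            stdout.write("%s\t%s\n" % cleaned)

# Demo on an in-memory stream (a real mapper would read sys.stdin).
demo_out = io.StringIO()
run(io.StringIO("u1\tCLICK \t2014-01-01\nmalformed row\n"), demo_out)
```

The malformed second line is silently dropped, which is the usual cleaning behavior in streaming pipelines; a production mapper might instead count drops via a Hadoop counter.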
Environment: Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Eclipse, Sqoop, Pig, Oozie, Solaris/Red Hat, Exadata X2/X3 machines, Cloudera CDH Apache Hadoop, Toad, SQL*Plus, Oracle, Linux
Confidential, Long beach, CA
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop Map Reduce, HDFS and developed multiple Map Reduce jobs in Java for data cleansing and preprocessing
- Responsible for building scalable distributed data solutions using Hadoop.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
- Handled importing of data from various data sources, performed transformations using Hive, MapReduce, and loaded data into HDFS.
- Extracted the data from Teradata into HDFS using Sqoop.
- Extracted data files from MySQL through Sqoop, placed them in HDFS, and processed them.
- Involved in creating Hive tables, loading them with data, and writing Hive queries, which run internally as MapReduce jobs.
- Developed a custom file system plugin for Hadoop so it can access files on the Data Platform; this plugin allows Hadoop MapReduce programs, HBase, Pig, and Hive to work with those files.
- Involved in loading data from the UNIX file system to HDFS.
- Setup and benchmarked Hadoop/HBase clusters for internal use
- Designed and implemented Map reduce-based large-scale parallel relation-learning system
- Developed Ant, Shell, and Python scripts for WebSphere Application Server and WebSphere Portal Server installation and configuration builds.
- Involved in configuring the connection between Hive tables and reporting tools like Tableau, Excel and Business Objects.
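The Sqoop extractions above (Teradata and MySQL into HDFS) come down to command-line invocations like the one this small Python sketch assembles. The connect string, username, table, and target directory are illustrative placeholders, not values from the actual project.

```python
# Builds the argument list for a hypothetical Sqoop import from MySQL
# into HDFS. All connection details below are placeholders.
def sqoop_import_args(table, target_dir, num_mappers=4):
    return [
        "sqoop", "import",
        "--connect", "jdbc:mysql://dbhost:3306/sales",  # placeholder host/db
        "--username", "etl_user",                       # placeholder user
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
        "--fields-terminated-by", "\t",
    ]

# In a real workflow this list would be handed to subprocess.check_call(...).
args = sqoop_import_args("orders", "/data/raw/orders")
```

Splitting the import across mappers (`--num-mappers`) is what makes Sqoop transfers parallel; each mapper pulls a slice of the table based on the split column.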
Environment: Subversion, Hadoop, Hive, HBase, MapReduce, HDFS, Java (JDK 1.6), Cloudera Hadoop distribution, DataStax, IBM DataStage 8.1, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, Tableau, UNIX Shell Scripting
Confidential, PA
Java/J2EE/Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Experience in installing, configuring and using Hadoop Ecosystem components.
- Participated in development/implementation of the Cloudera Hadoop environment.
- Experienced in managing and reviewing Hadoop log files.
- Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase NoSQL database, and Sqoop.
- Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
- Developed test case automation using Shell, Perl, and Python scripts.
- Developed a custom File System plug in for Hadoop so it can access files on Data Platform.
- Automated workflows using shell scripts to pull data from various databases into Hadoop.
- Developed user interfaces using JSP and the JSF framework with AJAX, JavaScript, HTML, DHTML, and CSS, and developed REST/SOAP web services.
- Experienced with web/application servers like Apache Tomcat, WebSphere, JBoss 4.2.2, and WebLogic.
- Used Eclipse IDE as development environment to design, develop, Spring Components on Tomcat.
- Developed UNIX shell scripts to start and stop scheduled Java jobs, and developed file handling processes and data load programs using Core Java.
- Developed and implemented a Swing, Spring, and J2EE based MVC (Model-View-Controller) framework for the application.
- Developed Ant, Shell, Python scripts for Web Sphere Application server, Web Sphere portal server Installation and configurations builds.
- Developed the front end using AWT, Flex, Swing, JSF, and JSP with custom tag libraries, JSTL, Struts tag libraries, GWT, Adobe Flex, MXML, HTML, and CSS; made pages user-interactive using jQuery plug-ins for drag-and-drop and auto-complete, along with AJAX, JSON, AngularJS, Backbone.js, JavaScript, and Bootstrap.
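The automated database-to-Hadoop workflows above typically follow a dump-then-load pattern: export a table to a delimited file, then push it to HDFS. The Python sketch below shows the export half; it uses an in-memory SQLite database as a stand-in for the real RDBMS, and the table and column names are invented for illustration.

```python
# Dumps a database table to a tab-separated file suitable for a
# follow-up `hdfs dfs -put`. SQLite stands in for the real database.
import sqlite3

def export_table_tsv(conn, table, out_path):
    """Write every row of `table` to out_path as TSV; return the row count."""
    count = 0
    with open(out_path, "w") as out:
        for row in conn.execute("SELECT * FROM %s" % table):
            out.write("\t".join(str(col) for col in row) + "\n")
            count += 1
    return count

# Illustrative data only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, action TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(1, "login"), (2, "purchase")])
n = export_table_tsv(conn, "events", "events.tsv")
```

A wrapping shell script would then run `hdfs dfs -put events.tsv /data/raw/` and register the file with a Hive external table.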
Environment: Core Java, JSP, JSF, Grails, SOAP, JavaScript, HTML, CSS, JUnit, Hadoop, HBase, HDFS, MapReduce, EJB, Python, XML, Eclipse, SQL, Hibernate, Tomcat, MyEclipse, IBM/SQL, Spring Framework, Struts, GWT
Confidential, Peoria, IL
Java/J2EE/Hadoop Developer
Responsibilities:
- Installed and configured Hadoop Map Reduce, HDFS and developed multiple Map Reduce jobs in Java for data cleansing and preprocessing
- Importing and exporting data into HDFS and Hive using Sqoop
- Used JAVA, J2EE application development skills with Object Oriented Analysis and extensively involved throughout Software Development Life Cycle (SDLC)
- Proactively monitored systems and services, architecture design and implementation of Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures
- Extracted files from CouchDB through Sqoop and placed in HDFS and processed
- Used Flume to collect, aggregate, and store the web log data from different sources like web servers, mobile and network devices and pushed to HDFS
- Load and transform large sets of structured, semi structured and unstructured data
- Supported MapReduce programs running on the cluster.
- Wrote shell scripts to monitor the health check of Hadoop daemon services and respond accordingly to any warning or failure conditions
- Involved in loading data from UNIX file system to HDFS, configuring Hive and writing Hive UDFs
- Utilized Java and MySQL from day to day to debug and fix issues with client processes
- Implemented partitioning, dynamic partitions, and buckets in Hive.
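The daemon health checks above can be approximated by parsing `jps` output, as in this Python sketch. The expected daemon set reflects the CDH3-era services named in this resume and is an assumption; a real script would capture `jps` via subprocess and alert on failures.

```python
# Determines which Hadoop daemons are missing, given the text output
# of `jps` (lines of the form "<pid> <ClassName>").
EXPECTED_DAEMONS = {"NameNode", "DataNode", "JobTracker", "TaskTracker"}

def missing_daemons(jps_output):
    """Return the set of expected daemons absent from the jps listing."""
    running = {parts[1] for line in jps_output.splitlines()
               if len(parts := line.split()) == 2}
    return EXPECTED_DAEMONS - running

# Illustrative jps output: only HDFS daemons are up.
sample = "2001 NameNode\n2002 DataNode\n2003 Jps\n"
down = missing_daemons(sample)
```

A cron-driven wrapper would restart or page on any non-empty `down` set, which matches the "respond accordingly to any warning or failure" behavior described above.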
Environment: Hadoop, MapReduce, HDFS, Hive, CouchDB, Flume, Oracle 11g, Java, Struts, Servlets, HTML, XML, SQL, J2EE, JUnit, Tomcat 6, JDBC, JNDI, Maven, Eclipse
Confidential
Java/J2EE Developer
Responsibilities:
- Involved in the creation of requirements and high-level design specifications for caTissue Suite improvements, working with the client's business team.
- Developed a split billing system using Core Java, Collections, Spring, Hibernate, and MySQL.
- Developed user interfaces using JSP and the JSF framework with AJAX, JavaScript, HTML, DHTML, and CSS, and developed REST/SOAP web services.
- Developed and implemented a Swing, Spring, and J2EE based MVC (Model-View-Controller) framework for the application, with auto-complete features using AJAX and JSON.
- Developed UNIX shell scripts to start and stop scheduled Java jobs, and developed file handling processes and data load programs using Core Java.
- Developed web services using WSDL and SOAP within an SOA approach.
- Designed and developed components under the J2EE architecture using Spring, JSP, Servlets, Hibernate, and JMS.
- Developed front-end using AWT, Flex, Swing, JSF, and JSP with Custom Tag libraries, JSTL, Struts Tag libraries, Adobe Flex, MXML, HTML, and CSS.
- Designed and developed business processes using Java Beans.
- Used Spring Framework for Dependency injection and integrated with the Hibernate.
- Configured Hibernate, Spring, and MyFaces (JSF) to map the business objects to the MySQL database using an XML configuration file.
- Designed and coded the front end using JSP/Servlets, JavaScript, and the ATG framework with the catalog.
- Identified security loopholes and vulnerabilities such as cross-site scripting, which enables malicious attackers to inject client-side script into web pages viewed by other users.
- Automated build and deployment, and developed the continuous build/deployment and test environment.
Environment: Core Java, JSP, Servlets, jQuery, Spring Framework, Struts, Flex, Hibernate, Tomcat 5, Eclipse 3.x, MyEclipse, HTML, JavaScript, JUnit, XML, IBM/SQL, Oracle, Log4j, Web Services (WSDL)
Confidential - Charlotte, NC
Java Consultant
Responsibilities:
- Designed and developed user interface using Struts tags, JSP, HTML and JavaScript.
- Developed the user-specific Highlights (dashboard menu) section, home page, admin home page, and user module (modify/search users, create-user screens with assignment of various roles) using the Spring MVC framework, Hibernate ORM module, Spring Core module, XML, JSP, and XSLT.
- Involved in multi-tiered J2EE design utilizing MVC architecture (Struts Framework) and Hibernate.
- Implemented functionality using Servlets, JSP, HTML, the Struts Framework, Hibernate, Spring, JavaScript, and WebLogic.
- Involved in implementing and maintaining large content driven and E-commerce based application.
- Developed Scalable applications using Stateless session EJBs.
- Used Axis web services with SOAP to transfer funds from a remote, global application to different financial institutions.
- Involved in designing the user interfaces using HTML, CSS, and JSPs.
- Configured Hibernate, Spring, and MyFaces (JSF) to map the business objects to the MySQL database using an XML configuration file.
- Made the required changes to the record and saved the updated information back to the database.
- Involved in writing shell script to export oracle table's data into flat files and performed unit testing using JUNIT and used Log4j for logging and automatic batch jobs.
- Developed stored procedures and triggers using PL/SQL to calculate and update the tables to implement business logic.
Environment: Core Java, JSP, Servlets, Struts Framework, Hibernate Framework, Oracle, UNIX Shell Scripts, XSL, XSLT, Eclipse 3.x, MyEclipse, HTML, UML, JavaScript, JUnit, JAXP, XML, SQL, Log4j