Hadoop Engineer Resume
NJ
SUMMARY
- Overall 11+ years of professional IT experience, with 5 years in analysis, architectural design, prototyping, development, integration and testing of applications using Java/J2EE technologies and 5 years in Big Data Analytics as a Hadoop Developer.
- 5 years of experience as a Hadoop Developer with good knowledge of Hadoop ecosystem technologies.
- Experience in developing MapReduce programs using Apache Hadoop to analyze big data as per requirements (a minimal sketch follows this summary).
- Experienced with major Hadoop ecosystem projects such as Pig, Hive and HBase, and with monitoring them using Cloudera Manager.
- Extensive experience in developing Pig Latin scripts and using Hive Query Language and Impala for data analytics.
- Hands-on experience working with HBase.
- Good working experience using Sqoop to import data into HDFS from RDBMS and vice versa.
- Good working experience with real-time data ingestion using Flume.
- Good knowledge of job scheduling and monitoring tools such as Oozie and ZooKeeper.
- Experience in Hadoop administration activities such as installation and configuration of clusters using Apache, Cloudera and AWS.
- Developed UML diagrams for object-oriented design (use cases, sequence diagrams and class diagrams) using Rational Rose, Visual Paradigm and Visio.
- Experienced in creative and effective front-end development using JSP, JavaScript, HTML5, DHTML, XHTML, Ajax and CSS.
- Working knowledge of databases such as Oracle 8i/9i/10g, Microsoft SQL Server and DB2.
- Experience in writing numerous test cases using the JUnit framework with Selenium.
- Strong experience in database design and in writing complex SQL queries and stored procedures.
- Experienced in using version control tools such as Subversion and Git.
- Extensive experience in building and deploying applications on web/application servers such as WebLogic, WebSphere and Tomcat.
- Experience in building, deploying and integrating with Ant and Maven.
- Experience in developing logging standards and mechanisms based on Log4j.
- Strong work ethic with a desire to succeed and make significant contributions to the organization.
- Strong problem-solving, communication and interpersonal skills; a good team player.
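Illustrative of the MapReduce development referenced above: a minimal Java sketch, assuming a Hadoop 2.x-style MapReduce API; the job name, comma-separated input format and the "event type" field position are hypothetical.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class EventCountJob {

        // Mapper: emits (eventType, 1) for each record; the field position is hypothetical.
        public static class EventMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text eventType = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",");
                if (fields.length > 1) {
                    eventType.set(fields[1]);   // assume the event type is the second column
                    context.write(eventType, ONE);
                }
            }
        }

        // Reducer: sums the counts per event type.
        public static class EventReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "event-count");
            job.setJarByClass(EventCountJob.class);
            job.setMapperClass(EventMapper.class);
            job.setCombinerClass(EventReducer.class);   // reducer doubles as combiner for a simple sum
            job.setReducerClass(EventReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Reusing the reducer as a combiner is the usual choice when the aggregation is a plain sum, since it cuts shuffle traffic without changing results.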
TECHNICAL SKILLS
Big Data Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, HBase, Cassandra, Oozie, ZooKeeper, YARN
Programming Languages: Java (JDK 1.4/1.5/1.6), C/C++, MATLAB, R, HTML, SQL, PL/SQL
Operating Systems: UNIX, Linux, Windows
Application Servers: IBM WebSphere, WebLogic, Tomcat
Web Technologies: JSP, Servlets, Socket Programming, JNDI, JDBC, Java Beans, JavaScript, Web Services (JAX-WS)
Databases: Oracle 8i/9i/10g, Microsoft SQL Server, DB2, MySQL 4.x/5.x
Java IDEs: Eclipse 3.x, IBM WebSphere Application Developer, IBM RAD 7.0
Tools: TOAD, SQL Developer, SOAP UI, ANT, Maven, Visio, Rational Rose
Version Control: CVS, SVN and Git
PROFESSIONAL EXPERIENCE
Confidential, NJ
Hadoop Engineer
Responsibilities:
- As a Hadoop developer, created, designed and implemented solutions to ingest data from different source systems into Hive.
- Worked with different database teams to pull data into the Hadoop environment.
- Set up connections between Hadoop and BI tools such as Tableau.
- Created Sqoop scripts and automated Sqoop jobs.
- Created Hive tables and schemas.
- Worked with other interface teams to solve business problems.
- Worked with business users to prepare data as per their needs.
- Ingested data into Hive and Impala from different data sources (Oracle, SQL Server and Teradata).
- Worked on REST APIs to pull and push data.
- Worked on Spark SQL and Scala.
- Worked on the Alteryx workflow tool.
- Automated Sqoop jobs using Oozie.
- Parsed data while loading it into Hive using custom UDFs (see the sketch after this list).
- Worked with the admin team to implement authentication and authorization for the application using Kerberos.
- Sketched the big data solution architecture, then monitored and governed the implementation.
- Followed Agile methodology for development.
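A minimal sketch of the UDF-based parsing mentioned in the list above, assuming the classic org.apache.hadoop.hive.ql.exec.UDF API available in Hive 0.10; the class name (CleanField) and the cleanup rule are hypothetical.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical Hive UDF that normalizes a raw source field while data is loaded into Hive,
    // e.g. SELECT clean_field(raw_col) FROM staging_table;
    public class CleanField extends UDF {

        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            // Trim whitespace and strip a hypothetical trailing delimiter before loading.
            String cleaned = input.toString().trim().replaceAll("\\|+$", "");
            return new Text(cleaned);
        }
    }

The jar would then be registered in Hive before use, e.g. ADD JAR parsing-udfs.jar; CREATE TEMPORARY FUNCTION clean_field AS 'CleanField'; and the function called in the load or insert query.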
Environment: Hadoop 1.2.1, MapReduce, Sqoop 1.4.4, Hive 0.10.0, Flume 1.4.0, Oozie 3.3.0, Pig 0.11.1, HBase 0.94.11, Scala, HCatalog, ZooKeeper 3.4.3, Talend Open Studio v1.10, Talend 5.5, Oracle 11g/10g, SQL Server 2008, Kafka, MySQL 5.6.2, Java, SQL, PL/SQL, UNIX shell script, Eclipse Kepler IDE
Confidential, Austin, TX
Hadoop developer
Responsibilities:
- Interact with customers, business partners and all stakeholders to understand the business objective and drive solutions that effectively meet the needs of a client.
- Sketched the big data solution architecture, then monitored and governed the implementation.
- Designed strategies and programs to collect, store, analyze and visualize data from various sources.
- Participated in development and execution of system and disaster recovery processes and actively collaborated in all Security Hardening processes on the cluster.
- Supported the data analysts and BI developers with JavaScript and Hive/Pig development.
- As a Hadoop Developer, managed reviews, capacity planning, technical consultation and performance optimization for production clusters.
- Job duties involved the design and development of various modules in the Hadoop Big Data Platform and processing data using MapReduce, Hive, Pig, Sqoop, Oozie, Kafka and Storm.
- Integrated Apache Storm with Kafka to perform web analytics; uploaded clickstream data from Kafka to HDFS, HBase and Hive by integrating with Storm (see the sketch after this list).
- Responsible for Hadoop training for the team and authored guidelines, best practices, patterns/anti-patterns, checklists and FAQ documents for creating Hadoop applications.
- Led technical solution blueprinting, training and adoption programs for Kerberos-secured Hadoop (version 0.20.100) and YARN (version 0.23).
- Managed the Data Analytics team responsible for consulting and supporting Hadoop applications.
- Coordinated with offshore team members in completing the assigned tasks.
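A minimal sketch of the Storm-Kafka wiring mentioned above, assuming the storm-kafka integration library of that generation (backtype.storm / storm.kafka packages); the topic, ZooKeeper host, parallelism and parsing bolt are hypothetical, and the downstream HDFS/HBase/Hive persistence bolt is omitted.

    import backtype.storm.Config;
    import backtype.storm.StormSubmitter;
    import backtype.storm.spout.SchemeAsMultiScheme;
    import backtype.storm.topology.BasicOutputCollector;
    import backtype.storm.topology.OutputFieldsDeclarer;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.topology.base.BaseBasicBolt;
    import backtype.storm.tuple.Fields;
    import backtype.storm.tuple.Tuple;
    import backtype.storm.tuple.Values;
    import storm.kafka.BrokerHosts;
    import storm.kafka.KafkaSpout;
    import storm.kafka.SpoutConfig;
    import storm.kafka.StringScheme;
    import storm.kafka.ZkHosts;

    public class ClickstreamTopology {

        // Hypothetical bolt that splits a raw clickstream line into user and URL fields;
        // a downstream bolt (not shown) would persist the parsed tuples to HDFS/HBase/Hive.
        public static class ClickParseBolt extends BaseBasicBolt {
            @Override
            public void execute(Tuple tuple, BasicOutputCollector collector) {
                String[] parts = tuple.getString(0).split("\t");
                if (parts.length >= 2) {
                    collector.emit(new Values(parts[0], parts[1]));
                }
            }

            @Override
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                declarer.declare(new Fields("user", "url"));
            }
        }

        public static void main(String[] args) throws Exception {
            // Kafka spout reads raw events from a hypothetical "clickstream" topic via ZooKeeper.
            BrokerHosts zk = new ZkHosts("zkhost1:2181");
            SpoutConfig spoutConfig = new SpoutConfig(zk, "clickstream", "/clickstream", "clickstream-spout");
            spoutConfig.scheme = new SchemeAsMultiScheme(new StringScheme());

            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("kafka-spout", new KafkaSpout(spoutConfig), 2);
            builder.setBolt("parse-bolt", new ClickParseBolt(), 4).shuffleGrouping("kafka-spout");

            Config conf = new Config();
            conf.setNumWorkers(2);
            StormSubmitter.submitTopology("clickstream-analytics", conf, builder.createTopology());
        }
    }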
Environment: Hadoop 1.2.1, MapReduce, Sqoop 1.4.4, Hive 0.10.0, Flume 1.4.0, Oozie 3.3.0, Pig 0.11.1, HBase 0.94.11, Scala, HCatalog, ZooKeeper 3.4.3, Talend Open Studio v1.10, Talend 5.5, Oracle 11g/10g, SQL Server 2008, Kafka, MySQL 5.6.2, Java, SQL, PL/SQL, UNIX shell script, Eclipse Kepler IDE
Confidential, Waltham, MA
Senior Hadoop developer
Responsibilities:
- Interact with customers, business partners and all stakeholders to understand the business objective and drive solutions that effectively meet the needs of a client.
- Sketched the big data solution architecture, then monitored and governed the implementation.
- Designed strategies and programs to collect, store, analyze and visualize data from various sources.
- Participated in development and execution of system and disaster recovery processes and actively collaborated in all Security Hardening processes on the cluster.
- Upgraded the Hadoop cluster from CDH 4.1 to CDH 4.7 (Cloudera Distribution).
- Supported the data analysts and BI developers with JavaScript and Hive/Pig development.
- As a Hadoop Developer, managed reviews, capacity planning, technical consultation and performance optimization for production clusters.
- Job duties involved the design and development of various modules in the Hadoop Big Data Platform and processing data using MapReduce, Hive, Pig, Sqoop, Oozie, Kafka and Storm.
- Integrated Apache Storm with Kafka to perform web analytics; uploaded clickstream data from Kafka to HDFS, HBase and Hive by integrating with Storm (see the HBase write sketch after this list).
- Responsible for Hadoop training for the team and authored guidelines, best practices, patterns/anti-patterns, checklists and FAQ documents for creating Hadoop applications.
- Led technical solution blueprinting, training and adoption programs for Kerberos-secured Hadoop (version 0.20.100) and YARN (version 0.23).
- Managed the Data Analytics team responsible for consulting and supporting Hadoop applications.
- Coordinated with offshore team members in completing the assigned tasks.
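A minimal sketch of writing parsed clickstream events into HBase as mentioned above, assuming the pre-1.0 HBase client API (HTable/Put) that matches HBase 0.9x; the table name, column family and row-key layout are hypothetical.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ClickstreamHBaseWriter {

        public static void main(String[] args) throws Exception {
            // Hypothetical table "clickstream" with a single column family "c".
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "clickstream");
            try {
                // Row key: user id + timestamp keeps a user's events together and time-ordered.
                String rowKey = "user123" + "_" + System.currentTimeMillis();
                Put put = new Put(Bytes.toBytes(rowKey));
                put.add(Bytes.toBytes("c"), Bytes.toBytes("url"), Bytes.toBytes("/products/42"));
                put.add(Bytes.toBytes("c"), Bytes.toBytes("referrer"), Bytes.toBytes("search"));
                table.put(put);
            } finally {
                table.close();
            }
        }
    }

Prefixing the row key with the user id keeps one user's events adjacent for scans, at the cost of possible region hot-spotting for very active users.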
Environment: Cloudera Distribution CDH 4.1/4.7/5, Hadoop 1.1.x/2.x, MapR 3.1, Sqoop, Oozie 3.2.0, Pig 0.9, HBase 0.93, Apache Hive 0.9, Apache ZooKeeper, Talend Open Studio 5.5.0, Oracle 11g/10g, SQL Server 2008, Kafka, MySQL 5.6.2, Java, SQL, PL/SQL, UNIX shell script, Eclipse Kepler IDE
Confidential
Java Developer
Responsibilities:
- Working effectively with Stakeholders (Business and Technical including Executive Management); liaising with third party vendors, global team and system integrators.
- Involved in understanding client requirements and translating them to technical requirements and prepared a detailed Function Specification document (FSD).
- Involved in performance testing & defects fixing.
- Some Java application requirements make integration with a scripting language necessary, for example when users need to write scripts that drive the application, extend it, or contain loops and other flow-control constructs.
- In such cases, it is sensible to support a scripting language interpreter that can read user scripts and run them against the Java application's classes.
- To accomplish this, ran a Java-based scripting language interpreter in the same JVM as the application (see the sketch after this list).
- Implemented Cairngorm framework to develop the dynamic user interfaces to provide messaging and Dependency Injection.
- Used services like the Remote object, HTTP Services and Web-Services for data communication using Blaze DS/LCDS.
- Developed Custom Events and Custom Components in various instances of the application.
- Used Hibernate to provide database connectivity to database tables in Oracle.
- Used various Core Java concepts such as Multithreading, Exception Handling, Collection APIs to implement various features and enhancements.
- Applied Java/J2EE Design Patterns like Intercepting Filter, Front Controller, Composite View, Dispatch View, Business Delegate, Service Locator, Value Objects, DAO, and Singleton etc.
- Deployment, unit and regression testing using FlexUnit.
- Worked on critical defects like memory leakage, performance related issues and analyzing the log files to track the issues.
- Designed and debugged the system with the help of IDEs like Flash Builder and Eclipse.
- Used JIRA & Quality Center tools for bug tracking.
- Used SVN for version control.
- Worked on preparing test cases and executing unit testing and Integration testing.
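A minimal sketch of the embedded scripting interpreter described in the list above, assuming the standard javax.script API with the JVM's bundled JavaScript engine (Rhino on Java 6/7); the exposed OrderService object and the user script are hypothetical.

    import javax.script.ScriptEngine;
    import javax.script.ScriptEngineManager;
    import javax.script.ScriptException;

    public class UserScriptRunner {

        // Hypothetical application class exposed to user scripts.
        public static class OrderService {
            public double discount(double total) {
                return total > 100 ? total * 0.9 : total;
            }
        }

        public static void main(String[] args) throws ScriptException {
            ScriptEngineManager manager = new ScriptEngineManager();
            ScriptEngine engine = manager.getEngineByName("JavaScript");

            // Expose an application object so user scripts can drive it, including loops
            // and other flow-control constructs, without recompiling the application.
            engine.put("orders", new OrderService());

            String userScript =
                "var total = 0;" +
                "for (var i = 1; i <= 3; i++) { total += orders.discount(i * 60); }" +
                "total;";
            Object result = engine.eval(userScript);
            System.out.println("Script result: " + result);
        }
    }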
Environment: Java 1.5, JSP, Servlets, WebSphere Application Server 7.0, Hudson, Nexus, Eclipse, jQuery, JSON, JavaScript, CSS, Velocity Engine, JSTL, HTML, DB2, SQL/PL-SQL, XML, JUnit, Log4j, SVN, Maven 2.2, Windows.
Confidential
Web Developer/Java Developer
Responsibilities:
- Designed and implemented HTML5-based mobile web applications and user interfaces for mobile platforms (iOS, Android, Windows Phone and Kindle).
- Designed, developed and supported HTML5-based applications, predominantly for use on mobile devices.
- Provided expertise for a touch-screen UI utilizing HTML5 and JavaScript, including 2D Canvas, the File System API, asset loading, manifest caching, progress bar indicators, Local Storage and Web Workers, for a next-generation HTML5 application to facilitate the sales process.
- Architected the solution featuring Model-View-Controller modular object-oriented JavaScript, AJAX and server-side JSP.
- Documented entire system for stake holder review, wireframe review and iterative enhancement and led the corresponding meetings.
- User Centered Design featuring Personas, Task Models, User Journeys, Content Requirements, Sitemaps and Usability Test Reports.
- Built multiple REST-based web services with XML payloads and large datasets normalized into JSON for application integration (see the servlet sketch after this list).
- Supported a scripting language interpreter that can read user scripts and run them against the Java application's classes.
- To accomplish this, ran a Java-based scripting language interpreter in the same JVM as the application.
- Produced cross-browser compliant web applications based on client provided comps using HTML5, CSS, jQuery, JSON, and MySQL.
- Developed responsive layouts for different screen sizes and resolutions.
- Worked with graphic artists, manipulating images, and precisely matching UI mockups.
- Demonstrated expert-level understanding and proficiency of HTML 5, CSS, AJAX and JavaScript and a strong sense of aesthetics and UI/UX.
- Wrote semantic, relevant HTML.
- Implemented AJAX interactions with back-end services.
- Used front-end JS libraries such as jQuery and AngularJS.
- Extensively used JavaScript object-oriented programming.
- Produced top-notch code that maintains the integrity of the system design and ensures compliance of code to craftsmanship standards.
- Provide feedback on functionality requests regarding feasibility and complexity.
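A minimal sketch of a back-end endpoint behind the AJAX/REST integration described above, assuming a plain Servlet API endpoint for illustration; the servlet, URL and JSON payload are hypothetical, and the JSON is hand-built only to keep the example short.

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical servlet that an AJAX call (e.g. jQuery $.getJSON("/products")) could hit;
    // a large XML-backed dataset would be normalized into this flat JSON shape for the UI.
    public class ProductServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            response.setContentType("application/json");
            response.setCharacterEncoding("UTF-8");

            PrintWriter out = response.getWriter();
            // In the real service this JSON would be produced from the normalized dataset.
            out.print("[{\"id\":1,\"name\":\"Phone\"},{\"id\":2,\"name\":\"Tablet\"}]");
            out.flush();
        }
    }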
Environment: Java, J2EE, JavaScript, MyEclipse, WebLogic, HTML5, CSS3, AJAX, XHTML, CSS, jQuery, jQuery Mobile, Adobe Flex, ActionScript and XML.
Confidential
Java Developer
Responsibilities:
- Designed JSP pages using Struts tag libraries, HTML, DHTML, AJAX and JavaScript.
- Used Hibernate for establishing connections and interacting with the database (see the sketch after this list).
- Created connections to database using Hibernate Session Factory, using Hibernate APIs to retrieve and store images to the database with Hibernate transaction control.
- The front-end JSP pages were developed using the Struts framework and were hosted in a J2EE environment on an Apache Tomcat server.
- Integrated the application with Struts Validation framework to do business validations
- Involved in the development of CRUD (Create, Update and Delete) functionality for various administrative system related tables and product components.
- Conducted Unit Testing, interface testing, system testing and user acceptance testing.
- Developed the presentation layer written using JSP, HTML, CSS and client-side validations were done using JavaScript, jQuery, and JSON.
- Developed a flexible, scalable application utilizing open-source technologies such as Hibernate ORM and the Spring Framework.
- Designed additional UI components using JavaScript and implemented an asynchronous, AJAX-based rich client to improve customer experience.
- Installed and configured MapReduce, Hive and HDFS; implemented a CDH3 Hadoop cluster on CentOS.
- Assisted with performance tuning and monitoring.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Supported code/design analysis, strategy development and project planning.
- Created reports for the BI team using Sqoop to export data into HDFS and Hive.
- Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Assisted with data capacity planning and node forecasting.
- Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
- Acted as administrator for Pig, Hive and HBase, installing updates, patches and upgrades.
- Designed static and dynamic Web Pages using JSP, HTML, CSS and SASS.
- Developed the application using the Spring Framework.
- Built scripts using Maven and deployed the application on the JBoss application server.
- Updated and maintained the sequence diagrams for the given design.
- Used WebLogic Application Server for deploying various components of the application.
- Developed the user interface screens for presentation logic using JSP, CSS and HTML, with client-side validation scripts using JavaScript.
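A minimal sketch of the Hibernate usage described in the list above, assuming Hibernate 3-style Configuration/SessionFactory/Transaction APIs; the ImageRecord entity and its mapping are hypothetical and not shown in full.

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;
    import org.hibernate.cfg.Configuration;

    // Hypothetical mapped entity; the Hibernate mapping (hbm.xml or annotations) is not shown.
    class ImageRecord {
        private Long id;
        private String name;
        private byte[] data;

        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public byte[] getData() { return data; }
        public void setData(byte[] data) { this.data = data; }
    }

    public class ImageDao {

        // Built once from hibernate.cfg.xml, which holds the Oracle connection settings.
        private static final SessionFactory SESSION_FACTORY =
                new Configuration().configure().buildSessionFactory();

        // Stores an image under Hibernate transaction control.
        public void save(ImageRecord image) {
            Session session = SESSION_FACTORY.openSession();
            Transaction tx = session.beginTransaction();
            try {
                session.save(image);
                tx.commit();
            } catch (RuntimeException e) {
                tx.rollback();   // roll back on failure so the table stays consistent
                throw e;
            } finally {
                session.close();
            }
        }

        // Retrieves an image by primary key.
        public ImageRecord findById(Long id) {
            Session session = SESSION_FACTORY.openSession();
            try {
                return (ImageRecord) session.get(ImageRecord.class, id);
            } finally {
                session.close();
            }
        }
    }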
Environment: Java, J2EE, Struts, JSP, HTML, DHTML, AJAX, JavaScript, CSS, Oracle, SQL, XML, jQuery, JSON, Maven, Hibernate ORM, Spring Framework and Tomcat 6.