Hadoop & Spark with Scala (Sr. Developer) Resume
CA
SUMMARY:
- Software professional with over 9 years of industry experience as a Big Data/Hadoop and Java techno-functional consultant, including 2.8 years with Big Data/Hadoop and 5+ years of experience in Java.
- Expertise in writing Hadoop jobs for analyzing data using MapReduce, Hive, Pig and Spark SQL.
- Proven knowledge of workflow schedulers such as Oozie.
- Hands-on experience with data loading tools like Flume and Sqoop.
- Expertise in writing Pig Latin Scripts and HiveQL.
- Worked on real-time, in-memory processing engines such as Spark.
- Experienced in administrative tasks such as installing, configuring, commissioning and decommissioning nodes, troubleshooting, backup and recovery of Hadoop and its ecosystem components such as Hive, Pig, Sqoop, HBase and Spark.
- Good knowledge of Cassandra.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
- Experienced in extending Hive and Pig core functionality by writing custom UDFs in Java.
- Good understanding of NoSQL databases like HBase and MongoDB.
- Comprehensive knowledge in debugging, optimizing and performance tuning of DB2, Oracle and MySQL databases.
- Good knowledge of Kafka.
- Experience in writing shell scripts to dump shared data from MySQL servers to HDFS.
- Experience in preparing functional specifications according to client requirements.
- Extensive experience in application development, covering preparation of program specifications, analysis, coding, testing and support.
- Good understanding of Elastic MapReduce (EMR) on Amazon Web Services (AWS).
- Experienced with Maven, Jenkins and GIT.
- Good experience with Hive partitioning and bucketing, performing different types of joins on Hive tables and implementing Hive SerDes such as JSON and Avro (see the sketch at the end of this list).
- Experience in using different file formats: Avro, Sequence Files, ORC, JSON and Parquet.
- Experience in Performance Tuning, Optimization and Customization.
- Experience with Unit Testing MapReduce programs using MRUnit & JUnit.
- Experience in active development as well as onsite coordination for web-based, client/server and distributed applications using Java and J2EE, including Web services, Spring, Struts, Hibernate and JSP/Servlets, incorporating the MVC architecture.
- Worked on projects for leading clients, covering client requirements, estimation, analysis of functional specifications, technical design specification, design review, development, testing and implementation.
- During this period, I have also acquired strong knowledge of Software Quality Processes and SDLC (Software Development Life Cycle).
- Good working knowledge of servers such as Apache Tomcat and WebLogic 8.0.
- Extensively worked on Java development tools, which includes Eclipse Galileo 3.5, Eclipse Helios 3.6, Eclipse Mars 4.5, and WSAD 5.1.2.
- Able to work in a team as well as individually; a quick learner, able to meet deadlines.
- Strong knowledge of web-based architecture, strong hands-on technical debugging and troubleshooting experience with distributed enterprise applications, and knowledge of the full software development life cycle (SDLC).
- Experience in ASAP and Agile methodologies.
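As a concrete illustration of the Hive partitioning and bucketing work noted above, here is a minimal sketch that issues the DDL over a HiveServer2 JDBC connection; the host, table, columns and bucket count are hypothetical stand-ins, not taken from any of the projects below.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Sketch: create a partitioned, bucketed Hive table via HiveServer2 JDBC.
public class HiveLayoutSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hiveserver:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {
            // Partition by ingest date so date filters prune whole directories;
            // bucket by customer_id to speed up joins and sampling on that key.
            stmt.execute("CREATE TABLE IF NOT EXISTS sales ("
                    + " customer_id BIGINT, amount DOUBLE)"
                    + " PARTITIONED BY (ingest_date STRING)"
                    + " CLUSTERED BY (customer_id) INTO 32 BUCKETS"
                    + " STORED AS ORC");
        }
    }
}
```

A JSON or Avro SerDe of the kind mentioned above would be declared in the same DDL with a ROW FORMAT SERDE clause in place of STORED AS ORC.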
TECHNICAL SKILLS:
Big Data Ecosystems: Hadoop, YARN, HDFS, MapReduce, Spark, Scala, HBase, Hive, Pig, Sqoop, Flume, Oozie and ZooKeeper.
Programming Languages: C, Java 6.0, Scala 2.11, SQL, PL/SQL, Pig Latin, HiveQL and UNIX shell scripting.
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC and EJBs.
Frameworks: Spring 3.0.5, Hibernate 3.5.1, Struts 1.3.10, EJB, JUnit and MRUnit.
Web Services: RESTful and SOAP; markup: HTML and XML.
Databases (RDBMS and NoSQL): MongoDB, HBase, Teradata, Oracle 8i/9i/10g, MS SQL Server 2000 and DB2.
Web and Application Servers: Apache Tomcat 7.0, Apache Tomcat 6.0 and WebLogic 8.0.
Messaging Tools: Storm 0.9.5 and Apache Kafka 0.9.0.0.
Methodologies: ASAP and Agile.
PROFESSIONAL EXPERIENCE:
Confidential, CA
Hadoop & Spark with Scala (Sr. Developer)
Roles & Responsibilities:
- Modified existing components as per requirements.
- Responsible for gathering and analyzing requirements for new components.
- Created many UDFs with complex types as inputs and outputs for analysis and aggregation.
- Worked with the Spark ecosystem using Scala and Hive queries on different data formats such as text files and Parquet.
- Imported and exported data between HDFS and RDBMS using Sqoop.
- Worked with partitioning and bucketing concepts in Hive and designed both managed and external Hive tables to optimize performance.
- Developed POCs with Spark SQL (a sketch follows this list).
- Created job flows using Oozie and crontab.
- Developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
- Collected and aggregated large amounts of log data using Apache Flume and staged it in HDFS for further analysis; worked with ETL tools to extract data from multiple sources.
- Good knowledge of Kafka and Storm for real-time streaming; worked with ZooKeeper, which serves as the coordination interface between Kafka brokers and consumers.
- Reviewed code against the customer's coding standards.
- Tested and provided valid test data to users as per requirements.
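A minimal sketch of the kind of Spark SQL proof of concept described above; the Parquet path and column names are hypothetical, and it is written against Spark's Java API for consistency with the other snippets here, although the project itself used Scala.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Sketch: register a Parquet dataset as a view and query it with Spark SQL.
public class SparkSqlPoc {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("sparksql-poc")
                .getOrCreate();

        // Hypothetical Parquet path and schema (event_date, ...).
        Dataset<Row> events = spark.read().parquet("hdfs:///data/events/parquet");
        events.createOrReplaceTempView("events");

        // Plain SQL over the view; text-file inputs work the same once parsed.
        spark.sql("SELECT event_date, COUNT(*) AS cnt"
                + " FROM events GROUP BY event_date ORDER BY event_date")
             .show();

        spark.stop();
    }
}
```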
Environment: CDH 5.0.0, Spark, Scala, Hive, Sqoop, Oozie and MapReduce.
Confidential
Hadoop Developer
Roles & Responsibilities:
- Extracted the data from Teradata into HDFS using Sqoop.
- Created and ran Sqoop jobs with incremental loads to populate Hive external tables.
- Extensive experience in writing Pig (version 0.12.0) scripts to transform raw data from several data sources into baseline data.
- Developed Pig scripts for end-user/analyst requirements to perform ad hoc analysis.
- Solved performance issues in Hive and Pig scripts by understanding joins, grouping and aggregation and how they translate to MapReduce jobs.
- Developed UDFs in Java as needed for use in Pig and Hive queries (see the sketch after this list).
- Experience in using Sequence Files, RCFile, Avro and HAR file formats.
- Developed Oozie workflow for scheduling and orchestrating the ETL process.
- Performed impact analysis of client requirements.
- Coordinated and communicated with the onsite team and prepared technical design documents.
- Wrote MapReduce code for different masking techniques.
- Worked with both MapReduce 1 (Job Tracker) and MapReduce 2 (YARN) setups.
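Below is a sketch of the kind of Java UDF referred to above, written here as a simple Hive masking function; the class name, masking rule and registration statements are hypothetical.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Sketch of a masking UDF for Hive: star out all but the last four
// characters. Register it (names are hypothetical) with:
//   ADD JAR mask-udf.jar;
//   CREATE TEMPORARY FUNCTION mask_tail AS 'MaskTail';
public class MaskTail extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // propagate NULLs, as Hive expects
        }
        String value = input.toString();
        int keep = Math.min(4, value.length());
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < value.length() - keep; i++) {
            masked.append('*');
        }
        masked.append(value.substring(value.length() - keep));
        return new Text(masked.toString());
    }
}
```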
Environment: CDH 5.0.0, Sqoop, Oozie, MapReduce, Teradata, UNIX, Hive, HBase and Pig.
Confidential
Sr. Consultant (Team Lead, Bangalore)
Roles & Responsibilities:
- Involved in understanding the requirements and designed the Database.
- Implemented Spring controller classes and command beans, and mapped the controllers using annotation-based mapping (a sketch follows this list).
- Implemented a Spring Security mechanism for role-based access to the portal.
- Developed code for the entire customer module and the registration of new merchants.
- Developed code for recharging mobile wallets via cash, cheque and the online portal.
- Developed the user interface using JSP, CSS and jQuery.
- Implemented configuration files for Hibernate and Spring.
- Developed entity classes to map database tables.
- Developed code for generating the transaction reports in CSV, PDF and Excel formats.
- Implemented logging using Log4j.
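A minimal sketch of the annotation-mapped controller and command bean work described above; the merchant registration URL, view names and form bean are hypothetical stand-ins.

```java
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

// Annotation-mapped Spring 3.x controller sketch.
@Controller
@RequestMapping("/merchants")
public class MerchantController {

    // Hypothetical command bean bound from the registration form.
    public static class MerchantForm {
        private String name;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    @RequestMapping(value = "/register", method = RequestMethod.GET)
    public String showForm(Model model) {
        model.addAttribute("merchant", new MerchantForm());
        return "merchantRegister"; // logical JSP view name
    }

    @RequestMapping(value = "/register", method = RequestMethod.POST)
    public String register(@ModelAttribute("merchant") MerchantForm form) {
        // Validation and persistence via a Hibernate-backed DAO would go here.
        return "redirect:/merchants/register?done";
    }
}
```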
Environment: Spring, Hibernate, Log4j, JSP, Apache CXF and MySQL.
Confidential
Software Developer
Roles and responsibilities:
- Involved in the complete software development life cycle (SDLC) of the application, from requirement gathering and analysis to testing and maintenance.
- Implemented the user login logic using the Spring MVC framework, which encourages application architectures based on the Model-View-Controller design paradigm.
- Generated Hibernate Mapping files and created the data model using mapping files.
- Developed the UI using JavaScript, JSP, HTML and CSS for interactive, cross-browser functionality and a complex user interface.
- Used the Struts Tiles and Validator frameworks in developing the applications.
- Developed action classes and form beans and configured struts-config.xml (see the sketch after this list).
- Provided client side validations using Struts Validator framework and JavaScript.
- Created business logic using servlets and session beans and deployed them on Apache Tomcat server.
- Created complex SQL Queries, PL/SQL Stored procedures and functions for back end.
- Prepared the functional, design and test case specifications.
- Performed unit testing, system testing and integration testing.
- Developed unit test cases. Used JUnit for unit testing of the application.
- Provided technical support for production environments: resolved issues, analyzed defects and implemented fixes; resolved high-priority defects on schedule.
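A sketch of the Struts 1.x action class and form bean work described above; the "success" and "failure" forwards, the bean names and the login rule are hypothetical entries that would live in struts-config.xml.

```java
// LoginAction.java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Struts 1.x action sketch: route to a forward based on the form bean.
public class LoginAction extends Action {
    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) {
        LoginForm login = (LoginForm) form;
        boolean ok = login.getUsername() != null
                && login.getUsername().trim().length() > 0;
        return mapping.findForward(ok ? "success" : "failure");
    }
}
```

```java
// LoginForm.java -- form bean declared under <form-beans> in struts-config.xml
import org.apache.struts.action.ActionForm;

public class LoginForm extends ActionForm {
    private String username;
    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
}
```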
Environment: Java, Spring MVC, Struts, Hibernate, JSP, Servlets, Web Services, Apache Tomcat, Oracle, JUnit and SQL.
Confidential
Software Trainee
Roles and responsibilities:
- Involved in the complete development, testing and maintenance process of the application.
- Responsible for gathering requirements, performing analysis and formulating requirement specifications from consistent inputs.
- Developed JSP as an application controller.
- Designed and developed HTML front end screens and validated forms using JavaScript.
- Used Frames and Cascading Style Sheets (CSS) to give a better view to the Web Pages.
- Deployed the web application on the WebLogic server.
- Used JDBC for database connectivity (see the sketch after this list).
- Developed necessary SQL queries for database transactions.
- Involved in testing, implementation and documentation.
- Wrote JavaScript code for input validation.
- Built the front end using JSPs, JavaScript and HTML.
- Built custom tags for JSPs.
- Built the report module based on Crystal Reports.
- Integrated data from multiple data sources.
- Generated schema difference reports for the database using Toad.
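A plain JDBC sketch of the database connectivity and query work described above; the JDBC URL, credentials, table and query are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Sketch: parameterized query over a JDBC connection.
public class OrderLookup {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL"; // hypothetical
        try (Connection conn = DriverManager.getConnection(url, "app", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT order_id, status FROM orders WHERE customer_id = ?")) {
            ps.setLong(1, 42L);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("order_id") + " "
                            + rs.getString("status"));
                }
            }
        }
    }
}
```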
Environment: Java, JSP, WebLogic 5.1, HTML, JavaScript, JDBC, SQL, PL/SQL and UNIX.