Big Data Developer Resume
SUMMARY:
- 10+ years of application design and development in Big Data, Java, and J2EE
- 3+ years of experience in the Big Data ecosystem
- Experience in Spark, Scala, Pig, Hive, MapReduce, Sqoop, and Oozie
- Experience in NoSQL databases (HBase, Cassandra)
- Experience with Cloudera and AWS (EMR, S3) distributions
- Experience in end-to-end delivery of products, migrations, and application porting across environments.
- Extensive experience in analysis, design, development, testing, maintenance, performance tuning, and deployment of internet, e-commerce, and client-server applications.
- Delivered multiple end-to-end products from inception to production; collaborated with Product Owners and stakeholders to identify use cases and user stories, and guided Scrum teams in Agile development practices.
- Built POCs; led teams and mentored junior developers, QA teams, and production support, both onshore and offshore.
- Expertise in producing analysis artifacts (UML use cases, BRDs, etc.) and in JIRA and other project-management tools.
- Expertise in MVC frameworks (Struts 1.2, Struts 2, Spring MVC, Spring Web Services, Struts-Spring integration), JSP, and core Spring modules, including JdbcTemplate and Spring DAO support.
- Data modeling and ORM technologies (Hibernate, iBATIS), caching, and SQL in DB2; stored procedures and PL/SQL in Oracle 9i/10g/11g; WebLogic 8/9, Tomcat 5.x, WebSphere 6.0/7.0.
- Experience in UNIX and shell scripting.
- Expertise in web services (SOAP, RESTful).
- Experience with Splunk and ITRS log-monitoring tools.
- Experience in the banking domain (credit cards, mutual funds, core banking)
TECHNICAL SKILLS:
Scala, Spark, MapReduce, Pig, Hive, Sqoop, Oozie, HBase, Cassandra, J2EE, Java 5/8, JSP, Servlets, AJAX, JDBC, Spring, Hibernate, iBATIS, JMS 1.0.2, JNDI, Oracle 10g/11g, SQL, PL/SQL, Linux, IBM WebSphere 5.0/6.0, BEA WebLogic 9, Apache Tomcat 5.1/7.0, Eclipse 3.x, MyEclipse, IntelliJ IDEA, Ant, JUnit, Log4j, ServiceNow
PROFESSIONAL EXPERIENCE:
Confidential
Big Data Developer
Responsibilities:
- Coordinated with the BI team to gather requirements for various data-mining projects.
- Configured Spark Streaming to receive continuous data from Kafka and stored the streams in HDFS and Cassandra.
- Used Spark Streaming APIs to perform the necessary transformations and actions on data received from Kafka and persisted it to Cassandra.
- Designed and developed data-integration programs in a Hadoop environment with the Cassandra NoSQL data store for data access and analysis.
- Used various Spark transformations and actions to cleanse the input data.
- Used the DataStax Spark-Cassandra connector to load data into Cassandra, and used CQL to analyze data from Cassandra tables for quick searching, sorting, and grouping.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Used Spark Streaming APIs to build a common learner data model that receives data from Kafka in near real time and persists it to Cassandra.
- Processed real-time streaming data using Kafka and Flume integrated with the Spark Streaming API.
- Consumed JSON messages from Kafka and processed them with Spark Streaming to capture UI updates.
- Wrote real-time processing and core jobs using Spark Streaming, with Kafka as the data-pipeline system.
- Configured Spark Streaming to consume data from Kafka topics and store it in HDFS.
- Worked extensively with Sqoop, importing data from MySQL and exporting analyzed data to relational databases.
- Created internal and external Hive tables as required, with appropriate static and dynamic partitions and bucketing for efficiency, and wrote HQL scripts for data analysis.
- Applied Hive optimization techniques such as joins, subqueries, and built-in functions to improve the performance of long-running jobs.
- Optimized HiveQL by using Spark as the execution engine.
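The record-level cleansing applied inside the Spark transformations above can be sketched as a pure function; the `Review` fields and the CSV layout here are assumptions for illustration, not the actual production schema.

```scala
// Sketch of a record-cleansing function of the kind run inside a Spark
// map/flatMap stage; field names and the delimited layout are hypothetical.
case class Review(userId: String, productId: String, rating: Int)

def parseRecord(line: String): Option[Review] =
  line.split(",").map(_.trim) match {
    // keep only well-formed rows whose rating parses and falls in 1..5
    case Array(u, p, r) if u.nonEmpty && p.nonEmpty && r.matches("\\d+") =>
      Some(Review(u, p, r.toInt)).filter(rev => rev.rating >= 1 && rev.rating <= 5)
    case _ => None // malformed rows are dropped
  }
```

In a streaming job this would typically be applied as `stream.flatMap(parseRecord)` before persisting the cleansed records to Cassandra.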
Environment: Apache Spark, Apache Kafka, MLlib, Scala, Cassandra, Hive, Sqoop, Flume, Hadoop, HDFS, Oozie, MySQL, Oracle 10g.
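A Hive table of the kind described above (external, partitioned, and bucketed for efficiency) might be declared as follows; the table, column, and path names are hypothetical:

```sql
-- Hedged sketch: external Hive table with a date partition and bucketing.
CREATE EXTERNAL TABLE IF NOT EXISTS learner_events (
  user_id    STRING,
  event_type STRING,
  payload    STRING
)
PARTITIONED BY (event_date STRING)
CLUSTERED BY (user_id) INTO 16 BUCKETS
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/learner_events';
```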
Confidential
Analyst Engineer
Responsibilities:
- Migrated HiveQL queries to Spark SQL on the Spark engine to minimize query response time.
- Used Spark Streaming to divide streaming data into batches as input to the Spark engine for batch processing in development and production.
- Handled the import of data from various sources, performed transformations using Hive and Spark, and loaded the data into HDFS.
- Developed multiple Spark jobs in Scala for data cleaning and pre-processing.
- Used Flume, Sqoop, Hadoop, Spark, and Oozie to build data pipelines.
- Provided cluster coordination services through ZooKeeper.
- Automated jobs that pull data from an FTP server and load it into Hive tables, using Oozie workflows.
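A representative Sqoop import of the kind used in these pipelines might look like this; the host, database, and table names are assumptions:

```shell
# Hedged sketch: import a MySQL table into HDFS and register it in Hive.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/staging/orders \
  --num-mappers 4 \
  --hive-import --hive-table staging.orders
```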
Environment: Apache Spark, Scala, Cassandra, Hive, Sqoop, Flume, Hadoop, HDFS, Oozie, Oracle 10g.
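The FTP-to-Hive automation described above could be wired up with an Oozie workflow along these lines; the app name, script, and properties are all hypothetical:

```xml
<!-- Hedged sketch of an Oozie workflow running a Hive load script. -->
<workflow-app name="ftp-to-hive" xmlns="uri:oozie:workflow:0.5">
  <start to="load-hive"/>
  <action name="load-hive">
    <hive xmlns="uri:oozie:hive-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>load_staging.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail"><message>Hive load failed</message></kill>
  <end name="end"/>
</workflow-app>
```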
Confidential
Big Data Developer
Responsibilities:
- Analyzed product reviews and user comments from the client's online shopping site.
- Made API calls to retrieve the client's data.
- Ingested XML into Hive using XPath.
- Flattened JSON files using Pig and stored the data in Hive tables.
- Imported and exported data between existing Teradata and DB2 systems and the Hadoop file system using Sqoop.
- Wrote Hive queries to load and process data in the Hadoop file system per business requirements.
- Scheduled and managed Apache Hadoop jobs using Oozie workflows.
- Modified the data model per current sprint requirements.
- Built POCs to find solutions for upcoming sprints.
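Pulling fields out of raw XML with Hive's built-in xpath UDFs, as described above, might look like this; the table and element names are hypothetical:

```sql
-- Hedged sketch: extract review fields from an XML column in Hive.
SELECT
  xpath_string(raw_xml, 'review/user/text()')    AS user_id,
  xpath_string(raw_xml, 'review/comment/text()') AS comment,
  xpath_int(raw_xml, 'review/rating/text()')     AS rating
FROM raw_reviews_xml;
```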
Confidential
Sr. Java Developer
- RTB (Run the Business) activities, i.e., designing, developing, testing, and deploying new enhancements for future releases of the Master Pass and PM3 projects.
- PM3 is a platform that accelerates product time-to-market, providing an enterprise-scale solution for customer, product, and order lifecycle management that can be easily extended for customer-specific requirements. Defined new processes and onboarded applications through the PM3 tool.
- Coordinated with business clients, supported the application, and implemented processes per the specifications provided.
- Involved in the build process and continuous integration using Maven.
- Involved in single and multilevel Maven builds.
- Developed web services (JAX-WS).
- Developed modules using Spring (IoC, DAO, ORM, MVC, JDBC, etc.).
Environment: Spring 4, Hibernate 3, Tomcat 7, JIRA and Scrum, Git, Bamboo, Postgres 1.2, Oracle 11, SoapUI, REST, Angular
Confidential
Sr. Consultant
- Helped architect the multi-phase, multi-project Service-Oriented Architecture (SOA) Identity and Access Management project.
- Exposed Identity Management components such as password reset, PIN reset, token API, and bulk uploads as SOAP web services to the client; created XSDs and SOAP endpoints.
- Worked on Spring IoC, MVC, JSP, jQuery, and JavaScript on the front end. Designed safe Java APIs to avoid cyber threats and vulnerabilities.
- Designed DAO-based Spring classes and new Oracle database tables, executed queries using JdbcTemplate, and performed SQL fine-tuning and indexing per needs and requirements. Resolved issues in an appropriate and timely manner and worked with offshore teams.
- Performed JUnit and SoapUI testing and UNIX deployments to Tomcat.
Environment: RAD 7.5, JDK 1.6, JSP, Struts, Spring, WebSphere 6.0, Windows XP, UNIX, Maven, Log4j, JAX-RPC web services, Apache Axis, RTC, SSH Tectia, SSH client.
Confidential
Senior Software Engineer
- Involved in various phases of the project, mainly development and maintenance of the application.
- Involved in developing the prototype.
- Designed and developed screens using JavaServer Pages.
- Used JavaScript for client-side validation.
- Developed JS files.
- Developed DAOs, DTOs, and Action classes.
- Performed server-side validation of client data.
- Developed against the Struts framework for processing user requests and interfacing with business logic.
- Involved in developing unit test cases.
- Involved in preparing low-level design documents.
Environment: Eclipse, JDK 1.5, JSP, Struts, Spring, Tomcat 5.x, Windows XP, Ant, Log4j
Confidential
Associate Consultant
- Involved in various phases of the project, mainly development and maintenance of the application.
- Designed and developed screens using JavaServer Pages.
- Used JavaScript for client-side validation.
- Developed JS files.
- Developed DAOs, DTOs, and Action classes.
- Performed server-side validation of client data.
- Developed against the Struts framework for processing user requests and interfacing with business logic.
- Involved in developing the Input Case module and Split Ticket module.
Environment: RSA, JDK 1.4, JSP, Struts, Oracle 9i, JavaScript, RTC, WebLogic 8.x.
Confidential
JAVA Developer
- Involved in developing Action classes, service-layer classes, and DAO-layer classes.
- Enhanced existing functionality of the MFTS Administration module.
- Enhanced existing functionality of the Account Management module.
- Developed an interaction mechanism with another application.
- Worked on cross-browser scripting.
- Involved in client-side and server-side validations.
Environment: JDK 1.4, JSP, Struts, Tomcat, Oracle 9i, JavaScript, CVS, Eclipse, TOAD, VPN.
Confidential
Developer
- Designed and developed screens using JavaServer Pages.
- Developed BOM objects.
- Developed the daily data-loading process for transactions.
- Performed server-side validation of client data.
- Implemented client-side validation utilities using JavaScript.
- Developed Action classes and ActionForms.
Environment: WebLogic, JDK 1.5, JSP, Struts, Oracle 10g, TopLink, JavaScript, and EJB
Tools: Eclipse 3.2, Log4J, CVS, SQL Developer, Axis.
