- 8+ years of experience in the analysis, design, and development of software applications using various technologies.
- 4+ years of strong experience with Big Data and the Hadoop ecosystem.
- In-depth understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, and DataNode.
- Good experience creating real-time data streaming solutions using Apache Spark Core, Spark SQL and DataFrames, and Spark Streaming.
- Hands-on experience with Big Data core components and ecosystem tools (Spark, Spark SQL, Spark Streaming, Hadoop, HDFS, MapReduce, YARN, ZooKeeper, Hive, HBase, Pig, Sqoop, Flume, Kafka, Storm, Oozie, and Python) on CDH4 and CDH5 distributions, with EC2 cloud computing on AWS.
- Migrated existing MapReduce programs to the Spark model using Python.
- Experience importing and exporting data between HDFS and relational database systems (RDBMS) using Sqoop.
- Integrated Hive with HBase, Pig with HBase, and Hive with Tez.
- Good knowledge of NoSQL databases such as HBase, Cassandra, and MongoDB.
- Experience in Hadoop administration on the Cloudera distribution, with knowledge of the Hortonworks distribution.
- Good working experience using Sqoop to import data into HDFS from RDBMS and vice versa.
- Good knowledge of job scheduling and workflow design tools such as Oozie.
- Experience developing MapReduce (YARN) jobs for cleaning, accessing, and validating data.
- Extensive experience programming in Android, Java, and J2EE.
- Experience in Core Java with a strong understanding and working knowledge of object-oriented concepts such as Collections, multithreading, exception handling, and polymorphism.
- Extensive experience with advanced J2EE frameworks such as Spring, Struts, JSF, and Hibernate.
- Worked proficiently in various IDEs, including NetBeans and Eclipse.
- Hands-on experience working with databases such as Oracle, SQL Server, DB2, and MySQL.
- Good understanding of the Android OS, interactive application development, and memory management.
- Experience implementing native libraries in applications using the Android SDK and Android NDK.
- Ability to adapt to evolving technology with a strong sense of responsibility and accomplishment.
Big Data Skills: Spark, Hadoop, HDFS, MapReduce, YARN, ZooKeeper, Hive, HBase, Pig, Sqoop, Flume, Kafka, Storm, Oozie.
Languages: Java, Python.
Web Application Server: Apache Tomcat.
NoSQL Databases: HBase, Phoenix, Cassandra, MongoDB.
Databases: MySQL, Oracle, Derby.
IDEs: Eclipse, EditPlus.
Java EE Technologies: Servlets, JSP.
- Working as a Hadoop Developer at Confidential, Houston, Texas.
- Worked as a Hadoop Developer at Confidential, Melbourne, Australia.
- Worked as a Software Developer at Confidential.
- Worked as a Software Engineer at Confidential.
Environment: Hadoop, Hive, ZooKeeper, MapReduce, Sqoop, Pig, Java, HDFS
Role & Responsibilities:
- Involved in the design and development of technical specifications using Hadoop technology.
- Analyzed client systems and gathered requirements.
- Moved crawl-data flat files generated from various microsites to HDFS for further processing.
- Wrote Apache Pig scripts to process the HDFS data.
- Implemented HQL queries for reports.
- Created Hive tables to store the processed results in tabular format, and implemented automated transmission scripts to automate the process.
- Played a key role in mentoring the team on developing MR jobs and custom UDFs.
- Created Hive tables and worked on them using HiveQL.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Developed Scripts to schedule the batch jobs.
- Helped the team in optimizing Hive queries.
- Utilized Agile Scrum Methodology to help manage and organize a team of 4 developers with regular code review sessions.
- Held weekly meetings with technical collaborators and actively participated in code review sessions with junior developers.
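A custom Hive UDF of the kind mentioned above can be approximated in Python through Hive's TRANSFORM streaming interface. This is a minimal sketch, not the original project's code: the `crawl_data` table and its `site`/`hits` columns are hypothetical, as are the cleaning rules.

```python
import sys

def clean_row(line):
    """Normalize one tab-separated record: trim each field and lowercase
    the first column (assumed here to be a site name)."""
    fields = [f.strip() for f in line.rstrip("\n").split("\t")]
    if fields:
        fields[0] = fields[0].lower()  # hypothetical: column 0 holds the site name
    return "\t".join(fields)

def main():
    # Hive streams rows in on stdin and reads transformed rows from stdout, e.g.:
    #   ADD FILE clean_udf.py;
    #   SELECT TRANSFORM (site, hits) USING 'python clean_udf.py' AS (site, hits)
    #   FROM crawl_data;
    for line in sys.stdin:
        print(clean_row(line))
```

Unlike a Java UDF, a TRANSFORM script needs no compilation or JAR deployment, which makes it convenient for quick cleaning passes over staged crawl data.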
Environment: Cassandra, Hive, Spark (Core, SQL, MLlib, Streaming)
Role & Responsibilities:
- Crawled data from 100+ sites based on ontology maintenance.
- Designed the schema and data model and wrote the algorithm to store all validated data in Cassandra using Spring Data Cassandra REST.
- Standardized the input merchant data, uploaded images, indexed the given data sets into HSearch, and persisted the data in HBase tables.
- Set up the Spark Streaming and Kafka cluster and developed a Spark Streaming Kafka application.
- Performed data analysis using Python and handled ad-hoc requests as required.
- Developed Python scripts for automating tasks.
- Used AWS services such as EC2 and S3 for small data sets.
- Generated stock alerts, price alerts, popular-product alerts, and new-arrival alerts for each user based on the given likes, favorites, and share counts.
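The popular-product alerts above can be sketched as a simple scoring pass over per-product engagement counts. This is an illustration in plain Python; the threshold, field names, and scoring rule are assumptions, not the production logic.

```python
def popular_product_alerts(products, threshold=100):
    """Return alert messages for products whose combined engagement
    (likes + favorites + shares) meets a hypothetical threshold."""
    alerts = []
    for p in products:
        score = p["likes"] + p["favorites"] + p["shares"]
        if score >= threshold:
            alerts.append(f"Popular product: {p['name']} (score {score})")
    return alerts
```

In a streaming setting the same rule would run per micro-batch, with the counts aggregated from the incoming event stream rather than passed in as a list.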
Environment: Hadoop, HDFS, Hive, HBase, Kafka, Storm, RabbitMQ Web Stomp, Google Maps, New York City truck routes from NYC DOT; truck events data generated using a custom simulator.
- Developed a simulator to send/emit events based on the NYC DOT data file.
- Built a Kafka producer to send events to Kafka, consumed by a Storm spout.
- Wrote a Storm topology to accept events from the Kafka producer and process them.
- Developed Storm bolts to emit data into HBase, HDFS, and RabbitMQ Web Stomp.
- Wrote Hive queries to map truck events data, weather data, and traffic data.
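The simulator described above can be sketched as a small loop that serializes data rows to JSON events and hands them to a send callback; in production that callback would be a Kafka producer's send method. This is a sketch only, and the event schema here is hypothetical rather than the actual NYC DOT record layout.

```python
import json

def make_event(truck_id, route, lat, lon, ts):
    """Build one truck event as a JSON string, mimicking a row from the
    (hypothetical) DOT data file schema."""
    return json.dumps({
        "truckId": truck_id,
        "route": route,
        "latitude": lat,
        "longitude": lon,
        "timestamp": ts,
    }, sort_keys=True)

def emit_events(rows, send):
    """Replay data rows through a send callback, e.g. a Kafka producer's
    send(); decoupling the callback keeps the simulator testable."""
    for row in rows:
        send(make_event(*row))
```

Downstream, a Storm spout would consume these JSON strings from the Kafka topic and feed them to the processing bolts.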
Admin and Developer
Environment: Hadoop, Hive, Flume.
- Trained the team on Hadoop and its ecosystem components.
- Installed an Apache Hadoop cluster.
- Installed a Flume agent on the source and retrieved the incremental data to HDFS.
- Loaded Twitter data into HDFS using Flume.
- Installed Pig and Hive for analysis.
- Inserted data into Hive using a JSON SerDe.
- Analyzed data to create various reports.
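Loading tweets through a JSON SerDe amounts to projecting raw JSON records onto table columns. A rough Python equivalent of that projection is sketched below, assuming the standard Twitter fields `id`, `user.screen_name`, and `text`; malformed lines are skipped, since Flume-delivered data can contain partial records.

```python
import json

def parse_tweets(lines):
    """Project raw tweet JSON lines onto the columns a Hive JSON SerDe
    table might expose (id, screen_name, text); skip malformed records."""
    rows = []
    for line in lines:
        try:
            tweet = json.loads(line)
        except ValueError:
            continue  # partial or corrupt record: drop it, as a SerDe would error on it
        rows.append((
            tweet.get("id"),
            tweet.get("user", {}).get("screen_name"),
            tweet.get("text"),
        ))
    return rows
```

In Hive itself, the same projection is declared once in the table DDL and applied automatically at query time, which is why the SerDe approach scales better than custom parsing scripts.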
Environment: Java, NWDI, NWDS, Spring IoC
- Responsible for all Java-related activities, including analysis, design, and development.
- Participated in R&D for two months and delivered a solution that made this application work with an upgraded backend SAP system; this solution is now widely used in the organization for SAP upgrade projects.
- Carried out further development on this project using J2EE and related technologies.
- Interacted with clients to give demos and gather requirements and suggestions.
Environment: Android, Phonegap, Java, Servlets, Jsp, Oracle 10g
- Used RESTful web services, Struts, Servlets, WSDL, XML, and AJAX.
- Responsible for PhoneGap and Android installation.
- Tested the app using emulators and generated the .apk file.
- Responsible for database operations using queries.
Environment: Core Java, Spring, Maven, XMF Services, JMS, Oracle 10g, PostgreSQL 9.2, FitNesse, Eclipse, SVN
- Involved in sprint planning as part of monthly deliveries.
- Involved in daily scrum calls and standup meetings as part of the Agile methodology.
- Good hands-on experience with the VersionOne tool for updating work details and hours for tasks.
- Involved in the design of views.
- Wrote Spring configuration files and business logic based on requirements.
- Involved in code-review sessions.
- Implemented JUnit tests based on the business logic for the assigned backlog items in the sprint plan.
- Implemented the fixtures to execute the FitNesse test tables.
- Good experience creating Jenkins CI jobs and Sonar jobs.