- Over 7 years of experience handling large development projects, including analysis, design, and development.
- Highly experienced and skilled Agile developer with a strong record of teamwork and exposure to all phases of the Software Development Life Cycle, including feasibility study, requirements analysis, design, development, and implementation of Java and Python projects.
- Good experience installing and configuring Hadoop and its related components.
- Hands-on experience writing MapReduce jobs in Java and Pig, and processing data with Spark in Scala (using both RDDs and DataFrames).
- Experience with Apache NiFi for automating data flow between systems.
- Proven strength in web-based applications using Node.js, Spring, and the Play Framework.
- Extensive experience with SQL and NoSQL data stores, including Oracle, MySQL, MongoDB, Couchbase, Redis, Elasticsearch, HDFS, and Hive.
- Strong initiative, teamwork, responsibility, communication, and analytical and problem-solving skills.
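The MapReduce and Spark work summarized above can be illustrated with a minimal, framework-free sketch of the map/shuffle/reduce phases (plain Python standing in for Java MapReduce or Spark RDD code; the word-count task and all names here are illustrative, not from any project above):

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) pairs, as a MapReduce mapper would.
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Group values by key, mirroring the framework's shuffle step.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the values for each key, as a reducer would.
    return {key: sum(values) for key, values in groups.items()}

records = ["Hadoop and Spark", "Spark and Pig"]
counts = reduce_phase(shuffle(map_phase(records)))
# counts == {"hadoop": 1, "and": 2, "spark": 2, "pig": 1}
```

In a real cluster the shuffle is performed by the framework between distributed mappers and reducers; the phases themselves are the same.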
Programming Languages: Java, C++, Pig Latin
Scripting Languages: Python, Shell
Frameworks: Hadoop, Spark, Node.js, Play, Spring (Spring Boot, Spring MVC, Spring Data JPA)
Workflow Scheduler: Oozie
Web Servers: Nginx, Apache Tomcat
Operating Systems: Linux, Windows
SQL Database: Oracle 10g, MySQL
NoSQL Databases: MongoDB, Couchbase, Redis, Elasticsearch, HDFS, Hive
Graph DB: Neo4J
Message Broker: RabbitMQ
Markup Languages: HTML, XML
Deployment Tools: Hudson, Jenkins, TeamCity
Version Control Systems: CVS, Mercurial, Git
Build Tools: Maven, SBT
Data Transfer Tools: Sqoop, Flume, Apache NiFi
Project Management Tools: Jira, Rally
Confidential, Cincinnati, OH
Big Data/Hadoop Developer
Technology: Hadoop, Apache NiFi, Oozie, Spark, Scala, Hive, Python, Java, Spring Framework, Shell Scripting, RabbitMQ, Rally
Responsibilities:
- Involved in the design phase of Confidential and the Data Transformation Service, and contributed automation solutions.
- Developed and automated Confidential using Apache NiFi to ingest data between systems and notify other services about the ingestion.
- Used RabbitMQ as the messaging service to notify downstream consumers about the ingested data.
- Pre-processed the ingested data using Apache Pig, eliminating bad records per business requirements with filter functions and User Defined Functions (UDFs).
- Developed and automated a Data Transformation Service using Spring Boot that triggers the Oozie workflow scheduler to transform data with Spark in Scala.
- Worked on Apache Spark components in Scala to perform transformations.
- Built Oozie workflows to automate Pig and Spark jobs, and developed an Oozie Java wrapper invoked from Pivotal Cloud Foundry (PCF).
- Configured TeamCity for Continuous Integration/Continuous Deployment (CI/CD).
- Responsible for writing validation scripts in Shell and Python.
- Used GitHub as the version control repository.
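A validation script of the kind mentioned above might look like this minimal Python sketch (the field names and rules are hypothetical, not taken from the actual project):

```python
def validate_record(record):
    """Return a list of validation errors for one ingested record."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    if "amount" in record:
        try:
            if float(record["amount"]) < 0:
                errors.append("negative amount")
        except (TypeError, ValueError):
            errors.append("non-numeric amount")
    return errors

records = [
    {"id": "r1", "amount": "10.5"},
    {"id": "", "amount": "-3"},
]
report = {r.get("id") or "<no id>": validate_record(r) for r in records}
# report == {"r1": [], "<no id>": ["missing id", "negative amount"]}
```

In practice such a script would read ingested files from HDFS or a landing directory and emit the error report for downstream review.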
Senior Lead Developer
Technology: Java, Maven, AWS SDK, Couchbase, MySQL, Amazon RDS, Liferay, Play Framework, Spring Framework (Spring Boot, Spring MVC, Spring Data JPA), Hadoop, Hive, Sqoop
Responsibilities:
- Developed MapReduce programs in Hadoop that perform data pruning and deduplication of external data and bank data.
- Developed a global bank portal and an individual portal for each bank using Liferay.
- Created REST APIs using the Play Framework; these APIs connect the portals with different processes running on the server (e.g., server setup and MapReduce processes).
- Migrated the Play Framework REST APIs to the Spring Framework for scalability, using Spring Boot for standalone deployment, Spring MVC, and Spring Data JPA for database communication.
- Responsible for writing Elasticsearch queries for geohash, fuzzy matching, and autocomplete, and Couchbase queries for index search.
- Responsible for unit testing and load testing using Apache JMeter.
- Experienced with the version control tool Git for code check-ins and merges.
- Responsible for setting up the production environment and installing all necessary software.
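The Elasticsearch queries mentioned above (fuzzy matching and autocomplete) follow standard Query DSL shapes; a sketch of the request bodies as Python dicts, with a hypothetical `bank_name` field standing in for the real mapping:

```python
# Fuzzy match: "fuzziness": "AUTO" lets Elasticsearch pick an edit
# distance based on term length, tolerating typos like "citybank".
fuzzy_query = {
    "query": {
        "match": {
            "bank_name": {"query": "citybank", "fuzziness": "AUTO"}
        }
    }
}

# Autocomplete: match_phrase_prefix treats the last term as a prefix,
# one standard way to serve search-as-you-type suggestions.
autocomplete_query = {
    "query": {
        "match_phrase_prefix": {"bank_name": {"query": "cit"}}
    }
}
```

Either body would be sent as JSON to the index's `_search` endpoint; dedicated `search_as_you_type` mappings are another common approach for autocomplete.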
Technology: Python, C++, Oracle, Linux
Responsibilities:
- Involved in requirements analysis, development, and testing.
- Involved in creating decision sets based on different bankers' requirements.
- Created new attributes in Python to implement different rules.
- Unit tested the programs.
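Rule-driven attribute creation of the kind described above can be sketched as a small rule table in Python (the rule names and thresholds are illustrative, not from the actual decision sets):

```python
# Each rule maps a new attribute name to a predicate over the input record.
RULES = {
    "high_value": lambda rec: rec.get("balance", 0) > 100_000,
    "is_overdrawn": lambda rec: rec.get("balance", 0) < 0,
}

def apply_rules(record):
    """Return the record extended with one boolean attribute per rule."""
    derived = {name: rule(record) for name, rule in RULES.items()}
    return {**record, **derived}

result = apply_rules({"account": "A1", "balance": 250_000})
# result["high_value"] is True, result["is_overdrawn"] is False
```

Keeping the rules in a table like this lets new banker requirements be added as single entries, which is also easy to unit test rule by rule.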