- Software Developer with 8 years of experience in designing, developing and testing high-performance, scalable applications using Java and Big Data technologies.
- Played a pivotal role in building a centralized Enterprise Data Hub on the Hadoop platform that caters to the data analytics needs of the enterprise.
- Good experience in creating complex data ingestion pipelines, data transformations, data management and data governance in a centralized Enterprise Data Hub.
- Created reusable encryption codecs using RSA and AES (128/256-bit) encryption algorithms to perform transparent encryption/decryption of data.
- Good experience in converting business ideas into workable solutions and suggesting the right tools for the right problems by working closely with business teams.
- Excellent understanding of Hadoop distributed system architecture and design principles.
- Good experience creating real-time data streaming solutions using Apache Spark/Spark Streaming, Kafka and Flume.
- Extensive hands-on experience writing complex MapReduce jobs, Pig scripts and Hive data models.
- Very good knowledge of various big data ingestion techniques using Sqoop, Flume, the native HDFS API, REST APIs and HttpFS.
- Experience developing proofs of concept, reference architectures and other collateral, leveraging experience of working in a global delivery model.
- Experience in the design, installation and administration of several Hadoop distributions in physical, virtual and cloud environments.
- Experienced in Java application development, client/server applications, and internet/intranet-based applications using Core Java, J2EE patterns, Web Services, REST services, Oracle, SQL Server and other relational databases.
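The reusable encryption codecs mentioned above could be sketched as follows. This is a minimal, hypothetical illustration using the standard `javax.crypto` API with AES-GCM; the class and method names are illustrative, not the original project's API.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.security.SecureRandom;
import java.util.Arrays;

/** Hypothetical sketch of a reusable AES codec for transparent
 *  encryption/decryption; names are illustrative. */
public class AesCodec {
    private static final int GCM_TAG_BITS = 128; // authentication tag length
    private static final int IV_BYTES = 12;      // recommended IV size for GCM
    private final SecretKey key;
    private final SecureRandom random = new SecureRandom();

    public AesCodec(SecretKey key) { this.key = key; }

    /** Generates a fresh AES key; pass 128 or 256 for the key size in bits. */
    public static SecretKey newKey(int bits) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(bits);
        return kg.generateKey();
    }

    /** Encrypts and prepends the random IV so the output is self-describing. */
    public byte[] encrypt(byte[] plaintext) throws Exception {
        byte[] iv = new byte[IV_BYTES];
        random.nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);
        byte[] out = new byte[IV_BYTES + ciphertext.length];
        System.arraycopy(iv, 0, out, 0, IV_BYTES);
        System.arraycopy(ciphertext, 0, out, IV_BYTES, ciphertext.length);
        return out;
    }

    /** Reads the IV back off the front of the blob and decrypts the rest. */
    public byte[] decrypt(byte[] blob) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key,
                new GCMParameterSpec(GCM_TAG_BITS, Arrays.copyOfRange(blob, 0, IV_BYTES)));
        return cipher.doFinal(Arrays.copyOfRange(blob, IV_BYTES, blob.length));
    }
}
```

Prepending the IV to the ciphertext is a common design choice that lets the codec decrypt each record without any external state beyond the key.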
Programming Languages: Java, Unix Shell Scripting, SQL, Perl
Frameworks: Spring MVC, Hibernate
Big Data Technologies: Hadoop - YARN, MapReduce, Hive, Pig, HBase, Kafka, Spark
Web Technologies: HTML, JavaScript, jQuery, Ajax
Version Control: Git, Subversion, ClearCase
Testing and CI Tools: Jenkins, Cobertura, JUnit
Databases: Oracle, MySQL, HBase
Build Tools: Maven, Ant, Gradle
Operating Systems: UNIX, Windows
Systems Engineer / Senior Java Developer
- Worked on identifying authorized and unauthorized public cloud services and their consumption (destination and traffic volume) in an organization, based on the traffic flowing through the Cloud Consumption Collector installed in its network.
- Reported security risks associated with using discovered public cloud services and identified anomalies based on usage levels.
- Identified authorized/unauthorized cloud service providers.
- Differentiated between cloud service models (IaaS/PaaS/SaaS).
- Categorized services into industry-standard (NIST) cloud service categories.
- Monitored data traffic.
- Identified trends in traffic/source IPs and generated alerts.
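Usage-level anomaly alerting of the kind described above can be sketched with a simple sliding-window baseline. This is an illustrative assumption about the approach, not the project's actual algorithm; the window size and threshold are made-up parameters.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Illustrative sketch of usage-level anomaly alerting over a sliding window.
 *  Window size and threshold are assumed parameters, not project values. */
public class TrafficAlerter {
    private final Deque<Double> window = new ArrayDeque<>();
    private final int windowSize;
    private final double threshold; // alert when a sample exceeds threshold * mean

    public TrafficAlerter(int windowSize, double threshold) {
        this.windowSize = windowSize;
        this.threshold = threshold;
    }

    /** Feeds one traffic sample; returns true when it should raise an alert. */
    public boolean observe(double bytesPerMinute) {
        boolean alert = false;
        if (window.size() == windowSize) { // only alert once a baseline exists
            double mean = window.stream().mapToDouble(Double::doubleValue).average().orElse(0);
            alert = bytesPerMinute > threshold * mean;
        }
        window.addLast(bytesPerMinute);
        if (window.size() > windowSize) {
            window.removeFirst(); // keep only the most recent samples
        }
        return alert;
    }
}
```

A real deployment would key such a detector per source IP or per cloud service and feed the alerts into a notification pipeline.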
- Data stored on HDFS was preprocessed and validated using Pig; the processed data was then loaded into the Hive warehouse, enabling business analysts to query the required data from Hive.
- Prepared complex Hive queries, performed data loading and produced various ad-hoc reports.
- Built reusable Hive UDF libraries for business requirements, enabling users to apply these UDFs in Hive queries.
- Provided quick responses to ad-hoc internal and external client requests for data; experienced in creating ad-hoc reports.
- Created working POCs using Spark Streaming and Kafka for real-time stream processing of continuous streams of large data sets.
- Created data ingestion plans for loading the data from external sources using Sqoop, Flume, Kafka and HDFS.
- Part of the Big Data Center of Excellence (CoE), responsible for creating technical guidance, road maps and strategies for delivering big data solutions throughout the organization.
- Supported code/design analysis, strategy development and project planning.
- Designed and developed multiple MapReduce jobs in Java for complex analysis.
- Wrote Pig scripts to generate MapReduce jobs and performed transformations on the data in HDFS.
- Processed HDFS data and created external tables using Hive in order to analyze spikes, faults and customer experience.
- Developed multiple MapReduce jobs for cleansing and preprocessing huge volumes of data.
- Optimized jobs to dynamically choose the amount of resources based on the volume of data or the number of days to search.
- Created standards and guidelines for the design, development, tuning, deployment and maintenance of the MongoDB database.
- Implemented test scripts to support test driven development and continuous integration.
- Involved in various performance tuning activities.
Analyst Software Developer
- Worked on flow-through systems for identifying various network elements.
- Created RESTful web services and deployed/maintained them for various systems.
- Generated automated support tickets when network topology identification failed.
- Migrated the database and service layer from Oracle to MongoDB.
- Involved in proactive production support to ensure SLAs for the software.
- Coded in Java and JavaScript for new enhancements and development support.
- Reported, tracked and fixed the defects using defect tracking tool.
- Gathered and understood the functional specification from the functional document.
- Involved in functional requirement and design walkthroughs with the Business Analyst.
- Participated in design and implementation plan reviews with MasterCard to understand new business requirements.
- Involved in basic performance testing using shell scripts.
- Developed and enhanced test scripts to validate data in the IRIS database and the front-end dashboard.
- Involved in creating various database configuration scripts using Oracle and SQL Server.
- Coded in C++, Java and UNIX shell scripts.
- Analyzed and understood requirements from the BA team; prepared technical design documents and UTPs.
- Released the code for builds.
- Worked on applications that store, index and aggregate data in real time.
- Produced real-time statistics and reports (1-minute, 10-minute, hourly, daily and weekly).
- Built dashboards for hourly/daily reports, searches, sampling, site statistics and real-time traffic analysis; monitored data and generated alerts.
- Implemented payment reports for big merchants using PayPal and VeriSign gateways, covering monthly bill payments, payment declines and settlements.
- Provided CSV and XML data for the transactions in Confidential.
- Computed minimum, maximum, 95th and 99th percentiles, median and standard deviation of execution times for URL and database transactions across millions of data points with repeated values.
- Created transaction reports for short URL, referrer, browser, SQL and URL transactions.
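The transaction-time statistics described above can be sketched in plain Java. This is an illustrative implementation using the nearest-rank percentile method and population standard deviation; the original code's exact definitions may differ.

```java
import java.util.Arrays;

/** Illustrative sketch of transaction-time statistics; not the original code. */
public class TransactionStats {
    private final double[] sorted;

    public TransactionStats(double[] samples) {
        sorted = samples.clone();
        Arrays.sort(sorted); // sort once so percentiles are O(1) lookups
    }

    public double min() { return sorted[0]; }
    public double max() { return sorted[sorted.length - 1]; }

    /** Nearest-rank percentile for p in (0, 100]; handles repeated values naturally. */
    public double percentile(double p) {
        int rank = (int) Math.ceil(p / 100.0 * sorted.length);
        return sorted[Math.max(rank, 1) - 1];
    }

    public double median() {
        int n = sorted.length;
        return n % 2 == 1 ? sorted[n / 2] : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;
    }

    /** Population standard deviation. */
    public double stdDev() {
        double mean = Arrays.stream(sorted).average().orElse(0);
        double var = Arrays.stream(sorted).map(x -> (x - mean) * (x - mean)).average().orElse(0);
        return Math.sqrt(var);
    }
}
```

For millions of data points a production version would typically stream the data and use an approximate quantile structure rather than sorting everything in memory, but the definitions are the same.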
- Used the Advanced Message Queuing Protocol (AMQP) for reliable message transfer between processes across multiple data centers.
- Developed a POC for Point of Service as part of location-based services using the PayPal payment intermediation process.
- Implemented regular-expression-based search for Confidential transactions.
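A regex-based transaction search like the one above can be sketched with `java.util.regex`. The record format and helper names here are hypothetical, chosen only to show the idea.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

/** Hypothetical regex-based transaction search; record format is illustrative. */
public class TransactionSearch {
    /** Returns the transaction records matching the given regular expression. */
    public static List<String> search(List<String> transactions, String regex) {
        Pattern pattern = Pattern.compile(regex); // compile once, reuse per record
        List<String> hits = new ArrayList<>();
        for (String tx : transactions) {
            if (pattern.matcher(tx).find()) {
                hits.add(tx);
            }
        }
        return hits;
    }
}
```

Compiling the pattern once outside the loop matters when scanning large transaction sets, since `Pattern.compile` is the expensive step.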
- Involved in design, analysis, development, testing and deployment activities.
- Used Oracle as the relational backend.
- Involved in preparing the screen design.
- Developed JSP pages using the Backbase JSF framework.
- Involved in code reviews.
- Released the code for deployment. Designed and developed test plans/test cases for the Quality Analysis team, and actively participated in writing developer unit test cases.
- Involved in quality analysis and testing of the software product across various software cycles.