- • Process-oriented Big Data Developer with 5 years of IT experience.
- • Excellent knowledge of the Hadoop ecosystem: HDFS, MapReduce, Spark, Hive, AWS (EMR, S3, Athena, Kinesis, Glue), Flume, Sqoop, HBase, Confidential.
- • Expert in manipulating and analyzing complex, high-volume, high-dimensionality data from varied sources.
- • Extensive experience creating data pipelines and data lakes, including ingestion and transformation.
Big Data: Spark, Hive, Sqoop, Flume, HDFS, Hadoop, Confidential, AWS (S3, EMR, Athena, EC2, Glue, Kinesis)
Languages: Java, Python
Databases: MySQL, Oracle SQL, MS SQL Server, MongoDB, HBase
Operating Systems: Windows (7, 10), Unix, Linux.
Design and Development Methodologies: OOD, UML, Design Patterns, Scrum and Agile.
Big Data Developer
- The goal of this project was to build a dashboard to analyze near-real-time data.
- Built a data pipeline to deliver web feeds from Confidential into Spark.
- Used Spark Streaming to ingest real-time data in one-minute batches via Spark's Confidential utils.
- Designed per-minute data aggregations using Spark Datasets and wrote the results to MySQL.
- The site dashboard read the data from MySQL through a REST API.
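The per-minute aggregation step above can be sketched in plain Python (the Spark Dataset groupBy/count/sum logic is simulated with the standard library; the event shape and field names are hypothetical):

```python
from collections import defaultdict
from datetime import datetime, timezone

def aggregate_per_minute(events):
    """Bucket (timestamp, value) events into one-minute windows and
    return {minute_start_iso: (count, total)} -- mirrors a Spark
    Dataset groupBy on a truncated timestamp with count/sum."""
    buckets = defaultdict(lambda: [0, 0.0])
    for ts, value in events:
        minute = ts.replace(second=0, microsecond=0).isoformat()
        buckets[minute][0] += 1
        buckets[minute][1] += value
    return {k: tuple(v) for k, v in buckets.items()}

events = [
    (datetime(2020, 1, 1, 12, 0, 5, tzinfo=timezone.utc), 2.0),
    (datetime(2020, 1, 1, 12, 0, 40, tzinfo=timezone.utc), 3.0),
    (datetime(2020, 1, 1, 12, 1, 10, tzinfo=timezone.utc), 5.0),
]
result = aggregate_per_minute(events)  # two one-minute buckets
```

In the real job, each window's result row would be written to MySQL for the dashboard's REST API to read.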
Big Data Developer
- Built data pipelines on AWS and loaded data into S3.
- Transformed and cleansed the data on AWS EMR using Spark and Hive.
- Performed SQL queries on AWS with Athena and Redshift.
- Used AWS Glue to discover the schema of the data as it arrives.
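The transform-and-cleanse step above can be illustrated in plain Python (field names and validation rules are hypothetical; on EMR the same filter/cast logic would run as a Spark job):

```python
def cleanse(records):
    """Drop rows missing required fields and normalize types --
    the kind of filter/cast pass a Spark cleansing job applies at scale."""
    cleaned = []
    for row in records:
        # Discard malformed rows, as a Spark filter() would.
        if not row.get("id") or row.get("amount") in (None, ""):
            continue
        cleaned.append({"id": str(row["id"]).strip(),
                        "amount": float(row["amount"])})
    return cleaned

raw = [
    {"id": " 1 ", "amount": "9.5"},
    {"id": None, "amount": "3"},   # dropped: missing id
    {"id": "2", "amount": ""},     # dropped: empty amount
]
rows = cleanse(raw)
```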
- Implemented the request/response messaging between the POS application and the AJB provider.
- Implemented online and offline scenarios of payment.
- Integrated AJB payment forms on the pin-pad device.
- Implemented the Donation Request requirements, item scrolling on pin-pad device and receipts printing based on EMV receipt guidelines.
Environment: Java, AJB-FiPay Server, Oracle SQL, log4j, TCP sockets, XML, Ant build, Opchain Framework.
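The POS-to-provider messaging above ran over TCP sockets carrying XML. A minimal sketch of length-prefixed message framing (the AJB wire protocol is proprietary, so the message format and helper names here are hypothetical):

```python
import socket
import struct

def send_msg(sock, payload: bytes):
    """Prefix the payload with a 4-byte big-endian length so the
    receiver knows where the message ends on the stream."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_msg(sock) -> bytes:
    """Read the 4-byte length header, then read exactly that many bytes."""
    (length,) = struct.unpack(">I", sock.recv(4))
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data

# Demonstrate a round trip over a local socket pair.
a, b = socket.socketpair()
send_msg(a, b"<PaymentRequest amount='10.00'/>")
reply = recv_msg(b)
a.close(); b.close()
```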
- Developed OVAL definitions in XML that use extensive WMI queries to get inventory details of a Windows server, such as DNS, disk, processor, memory, interface, operating system, and Active Directory.
- Developed routines to read records from the DB using the Data Access Layer, select the appropriate OVAL definition from a property file, run the jOVAL interpreter on the Windows server, and retrieve data from jOVAL objects.
- Parsed jOVAL objects to obtain POJOs and persisted them to the DB using the Data Access Layer.
- Used Maven to build and integrate with sub modules of the Discovery Engine.
Environment: Java, XML, XSD, WMI, DOM parser, JAXB parser, MongoDB, putty.
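The parse-objects-to-POJOs step above can be sketched with stdlib XML parsing (the actual work used Java with DOM/JAXB; the inventory document structure below is simplified and hypothetical):

```python
import xml.etree.ElementTree as ET

# A tiny OVAL-style inventory document (structure simplified for illustration).
doc = """
<inventory>
  <item class="Memory"><name>PhysicalMemory</name><value>16384</value></item>
  <item class="Disk"><name>C:</name><value>512000</value></item>
</inventory>
"""

def parse_inventory(xml_text):
    """Map each <item> element to a plain dict (the POJO analogue),
    ready to hand to a persistence layer."""
    root = ET.fromstring(xml_text)
    return [{"class": item.get("class"),
             "name": item.findtext("name"),
             "value": int(item.findtext("value"))}
            for item in root.findall("item")]

records = parse_inventory(doc)
```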