Cloudera certified Hadoop and Spark Developer with 3 years of experience in application development using Java technologies. Hands-on experience in requirement gathering, project design, development, and maintenance under Agile software development methodologies. Quick to adopt new frameworks.
- Hadoop, Hive, Impala, Sqoop, Kafka, Oozie, Spark Core, Spark SQL, Spark Streaming
- Java (versions 7 and 8), Scala, Groovy, Stored Procedures, Excel VBA
- Spring MVC, Hibernate, JPA, Maven, JUnit, RESTful web services (Jersey), Spring Batch
- MySQL, Oracle
- Toad for Oracle, Jenkins, Sonar, PuTTY, JIRA, Confluence, RTC, Jupyter Notebook, Zeppelin Notebook, Talend, Subversion, Git
- S3, EMR
- An Excel file served as the user interface for inputs: HDFS locations of the data sources, parameters for transformation, reconciliation, and export, and a control to trigger the application.
- Excel VBA read this information from the workbook and launched the application's .bat file.
- Performance testing was carried out on AWS EMR.
- Amazon S3 stored the test data.
Technologies Used: HDFS, Spark SQL, Spark Core, JIRA, Git, AWS (S3, EMR)
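The Excel-to-Spark trigger flow above can be sketched as a small launcher: the VBA macro writes the captured inputs to a properties file, and the .bat file hands that file to a Java entry point that assembles the spark-submit command. This is a minimal illustration; the file name, property keys, and class names are hypothetical, not the project's actual ones.

```java
import java.io.FileReader;
import java.io.IOException;
import java.util.Properties;

public class JobLauncher {

    // Build the spark-submit command from the inputs captured in Excel.
    // All keys below (hdfs.source.path, hdfs.export.path, ...) are
    // illustrative placeholders.
    static String buildCommand(Properties cfg) {
        return String.join(" ",
                "spark-submit",
                "--class", cfg.getProperty("main.class", "com.example.ReconJob"),
                cfg.getProperty("app.jar", "recon-job.jar"),
                cfg.getProperty("hdfs.source.path"),   // data sources in HDFS
                cfg.getProperty("hdfs.export.path"));  // reconciled output
    }

    public static void main(String[] args) throws IOException {
        Properties cfg = new Properties();
        try (FileReader in = new FileReader(args[0])) {
            cfg.load(in);  // file written by the VBA macro
        }
        // In the real flow a ProcessBuilder would launch this command;
        // printing it keeps the sketch self-contained.
        System.out.println(buildCommand(cfg));
    }
}
```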
- The primary objective was to automate the financial reporting application used to generate financial statements for various funds.
- The application helps the client consolidate and combine financial reports, then accurately generate disclosures from the client's trade-capture system for audit and compliance purposes.
- It handles complex business flows, generating reports in far less time and with greater efficiency.
Technologies Used: Java 8, RESTful web services (Jersey), AngularJS, Spring MVC, Spring Batch, Spring Integration, JPA, Stored Procedures, Subversion, Oracle, JUnit, RTC, Maven.
Java Developer Intern
- Information uploaded to the portal when an employee was onboarded to the Confidential account was stored in MySQL and later fetched for background checks and billability reporting.
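The store-then-fetch pattern described above would typically use a parameterized JDBC query so the same lookup serves both the background-check and billability paths. This is a hedged sketch only; the table and column names are assumptions, not the portal's actual schema.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OnboardingDao {

    // Parameterized query reused by both lookups; the schema
    // (onboarding_info, employee_id, full_name) is hypothetical.
    static final String FETCH_SQL =
            "SELECT employee_id, full_name, joining_date "
            + "FROM onboarding_info WHERE employee_id = ?";

    // Fetch one record by employee id; returns null when absent.
    static String fetchName(Connection conn, String employeeId) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(FETCH_SQL)) {
            ps.setString(1, employeeId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("full_name") : null;
            }
        }
    }
}
```

Using a `PreparedStatement` rather than string concatenation keeps the lookup safe from SQL injection, which matters for a portal handling employee data.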