Engineering And Development Resume
SUMMARY:
- Highly motivated, results-driven development expert with 5+ years of IT experience in Java/J2EE, AWS Cloud, and Big Data ecosystems.
- Strategic and innovative individual who thrives on challenge and the opportunity to excel and deliver.
- Experienced in design, development, project implementation, support functions, and software requirements analysis.
- Experience with the Big Data/Hadoop ecosystem, including MapReduce, Hive, Cloudera Impala, Oozie, ZooKeeper, Sqoop, and Flume.
- Hands-on experience setting up multi-node Hadoop and Cloudera clusters in data centers and on AWS infrastructure.
- Good practical knowledge of AWS services such as VPC, S3, SNS, SQS, Lambda, EMR, CDN, and RDS.
- Good practical knowledge of effective ways to ingest data into the Cloudera ecosystem.
- Handled large datasets of structured and unstructured data from various sources such as web server logs, AWS S3, and AWS RDS.
- Hands-on experience with technologies such as Core Java, XML, JDBC, JNDI, JMS, Servlets, JSP, web services (SOAP and RESTful), EJB3, Hibernate 3.0, iBatis 2.0.6, Spring (DI, AOP, JDBC, ORM, MVC, Web Services), and the Struts Framework.
- Proficient in functional languages such as Scala and NoSQL databases such as MongoDB for concurrency- and analytics-oriented application development.
- Involved extensively in all stages of the software development lifecycle (SDLC): requirements gathering, design, development, testing, implementation, and deployment.
- Guiding the teams on technical as well as business aspects.
- Team management: organized and supported the whole software development process (design, planning, estimation, development, bug fixing, refactoring, UAT, maintenance).
- Creative problem solver with keen attention to detail and quality. Able to interact with individuals at all levels and to work independently or in a team environment, with a high degree of initiative and motivation to serve client needs.
TECHNICAL SKILLS:
Operating Systems: Windows, Unix, Mac OS X, Ubuntu, CentOS.
Programming Languages: Java, Scala, C#.
J2EE Technologies: XML, JNDI, JDBC, Servlets, JSP, JMS, EJB, web services (JAX-WS, JAX-RS), JAXB, Jackson.
BigData Ecosystem: HDFS, MapReduce, Hive, Sqoop, Flume, Oozie, ZooKeeper, Cloudera Impala, Spark.
NoSQL: MongoDB, AWS DynamoDB.
RDBMS: Oracle 11g, Amazon RDS (Oracle, PostgreSQL, MySQL), PostgreSQL, MySQL.
Development Environment: Eclipse, Visual Studio, Web Matrix.
Frameworks: Spring Framework 3.x (Integration, Batch, AOP, Core, MVC, Security), Struts Framework 1.2 and 2, Java Seam, Play Framework (Scala and Java).
ORM Tools: Hibernate 3.0, iBatis 2.0.6, JPA, Anorm (Play Framework).
Application Servers: WebLogic 10, WebSphere 6.1, Tomcat 7.0, JBoss.
UML Tools: Star UML, Microsoft Visio, UMLet.
Cloud Services: AWS VPC, EC2, S3, RDS, SNS, SQS, SES, ELB, Redshift, CDN, ElastiCache, EMR, Auto Scaling.
Source Control: SVN, Git, and AWS CodeCommit.
Build and Continuous Integration Tools: JIRA, Atlassian Bamboo, AWS CodeDeploy, Maven.
PROFESSIONAL EXPERIENCE:
Confidential
Engineering and Development
Environment: Cloudera Impala, Hive, Hadoop, Sqoop, AWS CloudTrail logs, AWS RDS, AWS S3, Flume, Spring REST.
Responsibilities:
- Installed and configured a multi-node Hadoop cluster on Linux test boxes.
- Loaded end-of-day data from an AWS account's CloudTrail logs and S3 log sources, up to almost 1 TB, into HDFS.
- Performed association mining on the given dataset: processed the complete data using MapReduce to identify the items that are associated with each other.
- Processed the same data using HiveQL and analyzed the results to obtain critical outputs, such as the total number of activities per account, total activities by instance and by service, total number of API calls across all services, and per-department API utilization by system.
- Analyzed data using Cloudera Impala, HiveQL, and custom MapReduce programs in Java; extended Hive with custom UDFs.
- Developed MapReduce routines to summarize large volumes of machine-generated data and logs uploaded by the client.
- Uploaded generated semi-structured data to Hive using serialization and deserialization (SerDe) libraries.
- Loaded web log data into HDFS using Flume.
- Integrated data from various sources, such as flat files and databases, into HDFS.
- Involved in client calls for requirement gathering.
- Prepared HLD and LLD for the entire application.
- Involved in project workspace setup for Spring RESTful web services.
- Involved in end-to-end development of the project.
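The association-mining pass described above can be sketched as a plain-Java analogue of the MapReduce job. This is a minimal in-memory sketch, not the production code: the real job ran as Hadoop MapReduce over CloudTrail logs in HDFS, and the record layout and class/field names here are hypothetical.

```java
import java.util.*;
import java.util.stream.*;

// In-memory sketch of the co-occurrence ("association") counting step.
// Map phase analogue: emit every unordered pair of items seen together
// in one record. Reduce phase analogue: sum the counts per pair key.
class AssociationCount {

    static Map<String, Long> countPairs(List<List<String>> records) {
        List<String> pairs = new ArrayList<>();
        for (List<String> record : records) {
            List<String> items = record.stream().distinct().sorted()
                                       .collect(Collectors.toList());
            for (int i = 0; i < items.size(); i++)
                for (int j = i + 1; j < items.size(); j++)
                    pairs.add(items.get(i) + "," + items.get(j)); // canonical pair key
        }
        return pairs.stream()
                    .collect(Collectors.groupingBy(p -> p, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<List<String>> logs = Arrays.asList(
            Arrays.asList("s3", "ec2"),
            Arrays.asList("ec2", "s3", "rds"),
            Arrays.asList("s3", "rds"));
        // TreeMap for deterministic output: {ec2,rds=1, ec2,s3=2, rds,s3=2}
        System.out.println(new TreeMap<>(countPairs(logs)));
    }
}
```

In the real job the pair key would be the mapper's output key and the summed count the reducer's output, so the same logic distributes across the cluster.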
Confidential
Engineering and Development
Environment: AWS S3, AWS EMR, AWS RDS, AWS SDK, Scala, MongoDB, MySQL, Spring REST API, Hadoop, Hive, Impala, Jenkins.
Responsibilities:
- Carried out core development and added features to the application: data migration, a REST API serving the application, and the full deployment cycle for production and staging using Amazon Web Services (EC2, ELB, S3, RDS), under Agile development; accessed data releases from the investment research and management firm Morningstar.
- Built crawler programs that parse RSS feeds from Yahoo Finance and other sources, filtering and storing the news articles in Hadoop.
- Ran clustering algorithms on the news articles to generate peer groups.
- Combined peer-group information with client portfolios to generate news article recommendations.
- Combined product information acquired from the research firm with client portfolios to generate product recommendations.
- Developed JUnit test cases for all modules.
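The recommendation step above can be sketched in plain Java: articles tagged with tickers are matched against a client's portfolio. This is a hedged simplification — the real pipeline first clustered articles into peer groups, and the class name, tagging scheme, and data are hypothetical.

```java
import java.util.*;
import java.util.stream.*;

// Sketch of matching tagged news articles to a client portfolio.
// The production pipeline used peer groups from clustering; here we
// recommend any article tagged with a ticker the client holds.
class NewsRecommender {

    static List<String> recommend(Set<String> portfolio,
                                  Map<String, Set<String>> articleTags) {
        return articleTags.entrySet().stream()
            .filter(e -> e.getValue().stream().anyMatch(portfolio::contains))
            .map(Map.Entry::getKey)
            .sorted()                      // deterministic ordering
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Set<String>> tags = new HashMap<>();
        tags.put("fed-rates", new HashSet<>(Arrays.asList("AAPL", "MSFT")));
        tags.put("oil-report", new HashSet<>(Arrays.asList("XOM")));
        // Client holds AAPL only, so only the fed-rates article matches.
        System.out.println(recommend(new HashSet<>(Arrays.asList("AAPL")), tags));
    }
}
```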
Confidential
Engineering and Development
Environment: Spring Core, Spring REST, Spring Security, Spring Integration, AWS EC2, RDS, ELB, SNS, SQS, AngularJS, Spring MVC.
Responsibilities:
- Carried out core development and added features to the application: data migration, a REST API serving the application, and the full deployment cycle for production and staging using Amazon Web Services (EC2, ELB, S3, RDS), under Agile development; accessed data releases from gaming platforms such as Xbox One, Xbox 360, PlayStation 3, and PlayStation 4.
- Set up Hadoop on AWS infrastructure and ran Hive queries over large datasets covering players' skill information, matches played, statistics, and most probable opponents.
- Prepared leaderboard rankings through data analysis of players' historical performance across various games and platforms.
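The leaderboard-ranking step can be sketched as ranking players by their average score over historical matches. This is a minimal assumption-laden sketch: the real aggregation ran as Hive queries over HDFS, and the scoring metric and names here are hypothetical.

```java
import java.util.*;
import java.util.stream.*;

// Sketch of leaderboard ranking: order players best-first by the
// average score of their historical matches. The production version
// computed this with Hive over match data in HDFS.
class Leaderboard {

    static double avgScore(List<Integer> scores) {
        return scores.stream().mapToInt(Integer::intValue).average().orElse(0);
    }

    static List<String> rank(Map<String, List<Integer>> history) {
        return history.entrySet().stream()
            .sorted((a, b) -> Double.compare(avgScore(b.getValue()),
                                             avgScore(a.getValue())))
            .map(Map.Entry::getKey)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, List<Integer>> history = new HashMap<>();
        history.put("alice", Arrays.asList(10, 20)); // avg 15
        history.put("bob", Arrays.asList(30));       // avg 30
        System.out.println(rank(history));           // [bob, alice]
    }
}
```

The equivalent HiveQL would group matches by player, average the score, and `ORDER BY` the average descending.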
Confidential
Engineering and Development
Environment: Cloudera Impala, AWS java SDK, R framework.
Responsibilities:
- The aim was to run analytics on the Stack Overflow public dataset and visualize the results.
- Uploaded the Stack Overflow data for posts (23 GB), users (1 GB), comments (6 GB), and votes (5 GB) to Amazon S3.
- Initiated an EMR cluster bootstrapped with Cloudera Impala.
- Used the AWS Java SDK to synthesize and convert the data into a structured form to be loaded into HDFS via Hive SerDe libraries.
- Ran various queries using Impala and viewed various types of analytics on the data.
- Connected to the Impala endpoint from the R console and visualized the results as bar or pie charts.
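The "convert to structured form" step can be sketched as flattening a parsed dump row into the tab-delimited layout that Hive's default SerDe reads. The field names follow the public Stack Overflow posts schema, but the helper class is hypothetical, and the S3 reading/writing via the AWS Java SDK is omitted.

```java
import java.util.*;

// Sketch of converting a parsed Stack Overflow dump row into a
// tab-delimited record for a Hive table using the default SerDe.
// S3 I/O via the AWS Java SDK is omitted; names are hypothetical.
class PostRowFlattener {

    // Emit Id, Score, Title in a fixed column order, tab-separated;
    // \N is Hive's default marker for NULL fields.
    static String flatten(Map<String, String> attrs) {
        return String.join("\t",
            attrs.getOrDefault("Id", "\\N"),
            attrs.getOrDefault("Score", "\\N"),
            attrs.getOrDefault("Title", "\\N"));
    }

    public static void main(String[] args) {
        Map<String, String> row = new LinkedHashMap<>();
        row.put("Id", "42");
        row.put("Score", "17"); // Title missing -> emitted as \N
        System.out.println(flatten(row));
    }
}
```

A matching Hive table would be declared with `ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'` so Impala can query the loaded files directly.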
Confidential
Engineering and Development
Environment: Java/J2EE, Struts 2.0, Spring JDBC, jQuery, MySQL, JAXB, OpenSSO, OpenSAML, Web Services (Axis 2), AWS EC2.
Responsibilities:
- Also contributed to other in-house projects, including Common Identity Exchange 2.0 (a research product) and EZIAM, a cloud IAM product on Amazon Web Services, on both the cloud and development side.
Confidential
Engineering and Development
Environment: Web Forms, Oracle, jqGrid (jQuery), C#, CSS, ASP.NET, jQuery.
Responsibilities:
- Involved in developing the user interface using Web Forms.
- Involved in validating user inputs using ActionScript.
- Involved in writing business logic in the Service Classes.
- Involved in requirement definition, UI design, DB (Oracle), mail configuration, and web publishing.
- Involved in Unit Testing and Integration Testing.
Confidential
Engineering and Development
Environment: Java (Servlets & JSP), Oracle.
Responsibilities:
- Involved in validating user inputs using JavaScript.
- Involved in developing the code for JDBC and Servlets.
- Involved in developing the JSPs using Struts 1.2 TLD files.
- Involved in developing Action classes and Form Beans using Struts.
- Involved in generating Excel files using Apache POI.
- Involved in generating PDF files using iText.
- Used the JFreeChart API to display the data in stacked vertical bar and pie charts.
- Involved in Unit Testing and Integration Testing.