Full Stack Developer Resume
Mooresville, NC
PROFILE:
Java Engineer with back-end skills. Experience in Spring Boot, Hibernate, and Big Data tools.
PROFESSIONAL SUMMARY:
- 10+ years of experience in information technology, of which 8+ years are focused on major components in Java, Spring, and Hibernate, with additional experience in the Hadoop ecosystem on the back end.
- Front-end integration using Angular and Neo4j for the UI.
- Self-starter, lifelong learner, team player, and excellent communicator.
- Well organized with strong interpersonal skills.
- Unix shell scripting.
- Strong understanding of automation testing tools: JUnit, Selenium, Cucumber, Serenity, and Mockito.
- Experience importing and exporting data and developing with Spring and Kafka.
- Hands-on experience with Docker containers.
- Experience with CI/CD tools such as Jenkins, Puppet, and Chef.
- Python and Scala for Spark applications.
- Extensively worked with build and development tools such as Maven, Ant, Log4j, and JUnit.
- Designing and implementing secure Hadoop clusters using Kerberos.
TECHNICAL SKILLS:
Experience with the cloud space: cloud services using AWS EMR and S3, Azure, Anaconda, Elasticsearch, Lucene, Cloudera, Databricks, and Hortonworks; SQL, Apache Cassandra, Apache Hive, MongoDB, Oracle, SQL Server, DB2, and other RDBMS; MapReduce using formats such as Parquet, Avro, and JSON with Snappy compression.
Java frameworks and tooling: Java EE, Spring, Hibernate, microservices, automated testing; JavaScript with Node.js, Angular, and ReactJS; Apache Airflow, Apache Cassandra, Apache Hadoop, Apache Kafka, Apache Maven, Apache Spark, Cloudera, HDFS, Hortonworks, Elasticsearch, Elastic Cloud, Sqoop, Kibana, Tableau, AWS, Cloud Foundry, GitHub, Bitbucket, Oracle, Splunk, Unix, MS Office, and Teradata.
Experience with work methodologies: Agile, Kanban, Scrum, DevOps, continuous integration, test-driven development, unit testing, functional testing, Design Thinking, Lean, and Six Sigma. BI (Business Intelligence) reporting and designing flows that send data to reporting tools such as Tableau or QlikView.
PROFESSIONAL EXPERIENCE:
Full stack developer
Confidential, Mooresville, NC
Responsibilities:
- Design and development of applications using J2EE, Spring Boot, RESTful services and Angular.
- RESTful implementation and integration with Neo4j and Spring Boot (see the sketch after this list).
- Using REST and GraphQL to create Cassandra data services.
- Implementation of automated testing, creating test cases for Selenium and Cucumber.
- Implemented Oozie jobs to extract data periodically for the data engineering process.
- In-memory cache adoption using Hazelcast, while exploring other options such as Redis.
- EJB and Hibernate implementation.
- Working with SonarQube with a focus on code quality ratings.
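A minimal sketch of a Spring Boot REST service backed by Neo4j through Spring Data Neo4j, in the spirit of the integration above; the Product entity, endpoint paths, and class names are hypothetical, not taken from the actual project.

```java
// Hypothetical Spring Boot + Spring Data Neo4j sketch; assumes the
// spring-boot-starter-data-neo4j dependency and a reachable Neo4j instance.
import java.util.List;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.neo4j.core.schema.GeneratedValue;
import org.springframework.data.neo4j.core.schema.Id;
import org.springframework.data.neo4j.core.schema.Node;
import org.springframework.data.neo4j.repository.Neo4jRepository;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@Node
class Product {
    @Id @GeneratedValue
    private Long id;
    private String name;

    public Long getId() { return id; }
    public String getName() { return name; }
}

// Spring Data derives the Cypher query from the method name.
interface ProductRepository extends Neo4jRepository<Product, Long> {
    List<Product> findByName(String name);
}

@RestController
@RequestMapping("/api/products")
class ProductController {
    private final ProductRepository repository;

    ProductController(ProductRepository repository) {
        this.repository = repository;
    }

    @GetMapping
    List<Product> all() {
        return repository.findAll();
    }

    @GetMapping("/{name}")
    List<Product> byName(@PathVariable String name) {
        return repository.findByName(name);
    }
}

@SpringBootApplication
public class ProductServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(ProductServiceApplication.class, args);
    }
}
```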
Hadoop developer
Confidential, Atlanta, GA
Responsibilities:
- Design and development of applications using J2EE, Spring Boot, RESTful services and Angular.
- Using REST and GraphQL to create Cassandra data services.
- Implementation of automated testing, creating test cases for Selenium and Cucumber.
- Built a RESTful application using Spring Boot, linked to a responsive web application.
- Design and development of microservices using Spring Boot and REST APIs.
- DevOps toolchain utilization: Jenkins, JFrog Artifactory, Spinnaker.
- Creation of scripts in Unix.
- Hibernate integration.
- Cloud-based implementation, primarily in AWS for the production environment and GCP for development.
- Angular and Neo4j implementation.
- Kafka implementation on GCP.
- In-memory cache development using Redis, exploring other alternatives such as Hazelcast.
- AWS utilization for API Gateway configuration, CloudFormation, and AWS Lambda.
- Application integrations using SOAP/REST along with ESB and JMS for file/data transfer.
- Manipulating data files in HDFS and creating Spark jobs (a sketch follows this list).
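A minimal sketch of a Spark job over HDFS files like the ones described above; the paths, column names, and filter condition are illustrative placeholders.

```java
// Hypothetical Spark-on-HDFS job; assumes spark-sql on the classpath
// and submission via spark-submit to a cluster with HDFS access.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HdfsTransformJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("hdfs-transform")   // placeholder app name
                .getOrCreate();

        // Read Parquet files from HDFS (placeholder path).
        Dataset<Row> events = spark.read().parquet("hdfs:///data/events");

        // Filter and aggregate: count active events per customer.
        Dataset<Row> counts = events
                .filter("status = 'ACTIVE'")
                .groupBy("customer_id")
                .count();

        // Write the result back to HDFS (Parquet uses Snappy by default).
        counts.write().mode("overwrite").parquet("hdfs:///data/event_counts");

        spark.stop();
    }
}
```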
Java Back-End Developer with Hadoop
Confidential, Bentonville, AR
Responsibilities:
- Worked closely with the source-system analysts and architects to identify attributes, convert business requirements into technical requirements, and integrate with Jenkins.
- Implemented service-layer classes using Spring IoC and AOP.
- Proof of concept for real-time integration with Spring Kafka, feeding different data streams into Spark Structured Streaming on Microsoft Azure Databricks (see the sketch after this list).
- Implemented and maintained AJAX based rich client for improved customer experience.
- Developed Java Message Service (JMS) components with message-driven beans by configuring JMS queues, topics, and connection factories.
- Created Hive tables using the Java API and integrated with them per the design, using the ORC file format and Snappy compression.
- Developed critical application components, including Spring forms, Spring controllers, and JSP views, in parallel with the business-logic and data-logic components (Hibernate entities and Spring models), following the MVC architecture.
- Developed CRUD operations in Hibernate.
- Used Spark SQL to perform transformations and actions on data residing in Hive.
- Created UNIX shell scripts to automate the build process, and to perform regular jobs like file transfers between different hosts.
- Executed cluster-upgrade tasks on the staging platform before applying them to the production cluster, adding JAX-RS implementations.
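A minimal sketch of the consuming side of such a Spring Kafka proof of concept; the topic name, group id, and handling logic are hypothetical, and broker settings are assumed to come from standard Spring Boot configuration (spring.kafka.bootstrap-servers).

```java
// Hypothetical Spring Kafka consumer; assumes spring-kafka and
// Spring Boot auto-configuration for the broker connection.
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class StreamIngestListener {

    // Consumes raw records and forwards them toward the streaming
    // pipeline (e.g., a staging topic read by Spark Structured Streaming).
    @KafkaListener(topics = "raw-events", groupId = "ingest-poc")
    public void onMessage(String payload) {
        // Placeholder handling: validate and hand off the record here.
        System.out.println("received: " + payload);
    }
}
```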
Java Developer with BigData
Confidential, Phoenix, AZ
Responsibilities:
- Actively involved in setting up coding standards and prepared low- and high-level documentation. Involved in preparing the S2TM (source-to-target mapping) document per the business requirements and worked with source-system SMEs to understand the source data behavior.
- Spring Kafka implementation focused on Kafka Streams to bring in data for mining and storage activities.
- Implemented web service components (SOAP, WSDL, and UDDI) to interact with external systems under Spring.
- Developed a JUnit test framework and executed unit test cases with JUnit to verify fixes.
- Developed the server-side presentation layer using the Struts MVC2 framework.
- Used design patterns such as Business Delegate, Singleton, Factory, DAO, DTO, and Service Locator.
- Performed inheritance-based O/R mappings in tables to simplify the data in Hibernate.
- Wrote various UDFs in Java to convert date formats and to create hash values using the MD5 algorithm (a sketch follows this list), and used UDFs from Piggybank and other sources.
- Used Spring IoC and autowired POJOs and DAO classes with Spring controllers.
- Worked with EQM and UAT teams to fix defects promptly by understanding the issue. Involved in unit-level and integration-level testing with Selenium in Java and prepared supporting documents for proper deployment.
- Responsible for gathering requirements to determine needs and specifications to write a project plan and architecture schematic.
- Developed a procedural guide for implementation and coding to ensure quality standards and consistency.
- Involved in the elaboration and construction phases and in deployments of EAR files in the RUP process.
- Designed and created the domain model and schema using object-oriented design and UML diagrams.
- Used instance image files to create instances with Hadoop installed and running. Developed dynamic parameter files and environment variables to run jobs in different environments.
- Worked on installing clusters, commissioning and decommissioning data nodes, configuring slots, NameNode high availability, and capacity planning.
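A minimal sketch of a Pig-style Java UDF along the lines described above, hashing one field with MD5; the class name and null handling are illustrative, not the original code.

```java
// Hypothetical Apache Pig UDF; assumes the pig dependency on the
// classpath. Registered in a script with REGISTER/DEFINE and applied
// per record as Md5Hash(field).
import java.io.IOException;
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class Md5Hash extends EvalFunc<String> {

    @Override
    public String exec(Tuple input) throws IOException {
        // Pig convention: return null for null or empty input.
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;
        }
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(
                    input.get(0).toString().getBytes(StandardCharsets.UTF_8));
            // Render the 128-bit digest as a zero-padded hex string.
            return String.format("%032x", new BigInteger(1, digest));
        } catch (Exception e) {
            throw new IOException("MD5 hashing failed", e);
        }
    }
}
```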
Java Big Data Engineer
Confidential, Providence, RI
Responsibilities:
- Migrated the required data from Oracle and MySQL into HDFS using Sqoop and imported flat files of various formats into HDFS.
- Uploaded and processed more than 30 terabytes of data from various structured and unstructured sources into HDFS (AWS cloud) using Sqoop and Flume.
- Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and buckets.
- Used Solr indexing to enable searching on non-primary-key columns in Cassandra keyspaces.
- Developed custom processors in Java with Maven to add functionality to Apache NiFi for additional tasks.
- Developed Action classes, ActionForms, and the Struts configuration file to handle the required UI actions, with JSPs for the views.
- Used JavaScript for client-side validation in JSP pages.
- Integrated a payment gateway using SOAP web services for the auto-payment flow.
- Used Tableau for data visualization and generating reports.
- Developed Spark code using Java and Scala with Spark SQL for faster testing and data processing (see the sketch after this list). Worked on converting PL/SQL code into Scala code and converting PL/SQL queries into HQL queries.
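A minimal sketch of Spark SQL running HQL against Hive tables, as in the PL/SQL-to-HQL conversion work above; the database, table, and column names are hypothetical.

```java
// Hypothetical Spark SQL-on-Hive job; assumes Hive support is compiled in
// and the job is submitted to a cluster with a shared Hive metastore.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveSqlJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("hive-sql")
                .enableHiveSupport()   // talk to the shared Hive metastore
                .getOrCreate();

        // A PL/SQL-style aggregate rewritten as HQL over a partitioned table.
        Dataset<Row> totals = spark.sql(
                "SELECT customer_id, SUM(amount) AS total "
              + "FROM sales.transactions "
              + "WHERE dt = '2020-01-01' "
              + "GROUP BY customer_id");

        // Persist the result as a managed Hive table.
        totals.write().mode("overwrite").saveAsTable("sales.customer_totals");
        spark.stop();
    }
}
```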
Sr. Java Developer
Confidential, Littleton, CO
Responsibilities:
- Worked with the team to gather and analyze the client requirements. Analyzed large data sets distributed across a cluster of commodity hardware.
- Connected to the Hadoop cluster and Cassandra ring and executed sample programs on the server.
- Bulk-loaded data into Cassandra using sstableloader.
- Built a request builder, developed in Scala, to facilitate running scenarios using JSON configuration files.
- HDFS maintenance and loading of structured and unstructured data.
- Formatted data using Hive queries and stored it on HDFS.
- Coordinated with the Java team in creating MapReduce programs (a sketch follows this list).
- Implemented the project using the Spring Web MVC module.
- Responsible for managing and reviewing Hadoop log files.
- Cluster maintenance, as well as creation and removal of nodes, using tools such as Cloudera Manager Enterprise.
- Followed Agile methodology: interacted directly with the client, provided and received feedback on features, suggested and implemented optimal solutions, and tailored the application to customer needs. Worked on risk-management and data-processing applications for the banking industry.
- Implemented J2EE design patterns like MVC and Front Controller.
- Implemented static and dynamic web pages using JSP, JavaScript, and CSS. Involved in requirements analysis, design, and estimation.
- Designed and delivered web-based J2EE solutions. Used JavaScript for client-side validations.
- Wrote PL/SQL queries and stored procedures. Responsible for environment setup, including production environments, at the server and database level.
- Developed portlets and deployed them in WebLogic Portal Server.
- Wrote release notes for deployments to various environments, including production. Monitored server load averages and prepared status reports.
- Point of contact for the client on all technical aspects. Prepared status reports.
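A minimal sketch of a Hadoop MapReduce program of the kind coordinated with the Java team above; it is the classic word count, used purely as an illustrative stand-in for the actual jobs.

```java
// Hypothetical MapReduce example; assumes hadoop-client on the classpath
// and input/output HDFS paths passed as the two program arguments.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (token.isEmpty()) continue;
                word.set(token);
                ctx.write(word, ONE);   // emit (word, 1) per token
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values,
                              Context ctx) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            ctx.write(key, new IntWritable(sum));  // total per word
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word-count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```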
Jr. Java Developer
Confidential, Birmingham, AL
Responsibilities:
- Used the Hibernate ORM tool as the persistence layer, using database and configuration data to provide persistence services (and persistent objects) to the application.
- Implemented Oracle Advanced Queuing using JMS and message-driven beans.
- Responsible for developing the DAO layer using Spring Web MVC and XML configuration for Hibernate, and for managing CRUD operations (create, read, update, and delete); a sketch follows this list.
- Implemented dependency injection with the Spring Framework. Developed and implemented the DAO and service classes.
- Developed reusable services using BPEL to transfer data. Participated in analysis, interface design, and JSP development.
- Configured log4j to enable/disable logging in the application.
- Wrote single-page applications (SPAs) using RESTful web services. Developed rich user interfaces using HTML, JSP, AJAX, JSTL, JavaScript, jQuery, and CSS. Implemented PL/SQL queries and procedures to perform database operations.
- Wrote UNIX shell scripts and used the UNIX environment to deploy the EAR and read the logs. Implemented Log4j for logging in the application.
- Modified existing Java APIs in the performance and fault-management modules.
- Took ownership of implementing and unit-testing the APIs using Java, EasyMock, and JUnit.
- Involved in the build process to package and deploy the JARs to the production environment. Participated in peer code reviews and inspections. Followed an agile development methodology.
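A minimal sketch of a Hibernate-backed DAO handling CRUD, in the spirit of the DAO layer described above; the Customer entity and the method set are hypothetical, not the original code.

```java
// Hypothetical Hibernate DAO; assumes Hibernate 5+ (Session is
// AutoCloseable) and a SessionFactory wired in, e.g. via Spring.
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

@Entity
class Customer {
    @Id
    private Long id;      // assigned identifier for this sketch
    private String name;
}

public class CustomerDao {
    private final SessionFactory sessionFactory;

    public CustomerDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public Long save(Customer customer) {              // CREATE
        try (Session session = sessionFactory.openSession()) {
            Transaction tx = session.beginTransaction();
            Long id = (Long) session.save(customer);
            tx.commit();
            return id;
        }
    }

    public Customer find(Long id) {                    // READ
        try (Session session = sessionFactory.openSession()) {
            return session.get(Customer.class, id);
        }
    }

    public void update(Customer customer) {            // UPDATE
        try (Session session = sessionFactory.openSession()) {
            Transaction tx = session.beginTransaction();
            session.update(customer);
            tx.commit();
        }
    }

    public void delete(Customer customer) {            // DELETE
        try (Session session = sessionFactory.openSession()) {
            Transaction tx = session.beginTransaction();
            session.delete(customer);
            tx.commit();
        }
    }
}
```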