Lead Java Developer Resume
SUMMARY
- Lead Java Developer with nearly 15 years of experience designing and developing Java-based products and projects, combined with proven technical leadership.
- Expertise in driving product/project strategy and managing end-to-end software product development, from requirement analysis and system study through design, development, implementation, debugging, testing, documentation, and maintenance, using Agile/Scrum, Linear Sequential, and Waterfall methodologies.
- Seven years of design and development experience on the Commodity XL/SL product, including commodities implementations across multiple product classes such as Oil, Metals, Gas, US Power, and Ags.
- Extensive experience designing and developing products and projects in the Energy and Commodities, BFS, and Telecom domains.
- Skilled at grasping the big picture, developing solutions, and collaborating closely with business leaders, stakeholders, internal teams, and clients.
- Strong development experience with Object-Oriented Methodologies, N-Tier Architecture, Service-Oriented Architecture (SOA), Component-Oriented Architecture, Test-Driven Development (TDD), and Domain-Driven Design.
- Extensive experience developing and maintaining Java-based applications using Core Java, Spring, Spring Boot, Hibernate, RabbitMQ, ActiveMQ, Servlets, JSP, EJB, JDBC, JMS, Struts, JSF, SOAP and RESTful web services, Ant, Maven, Docker, and MVC architecture.
- Extensive experience in end-to-end system architecture, design, implementation, and integration, managing projects/products from inception to production; this spans system architecture through release documentation (writing and reviewing), source-system analysis, data consolidation, and data engineering, including data management, transformation, enrichment, mapping, modeling, and consumption via a semantic layer, reporting solutions, and downstream applications.
- Extensive experience developing enterprise integration software on the MuleSoft platform, applying Enterprise Integration Patterns and enterprise application architecture design.
- Exposure to developing C#, Python, and Scala applications.
- Exposure to Microsoft Azure technologies (Azure Data Factory, Azure DevOps, Azure Data Lake Storage Gen2, Azure Databricks, Azure SQL Data Warehouse, Azure SQL Database, Azure data architecture) and AWS (CloudFormation, Compute (EC2), Storage (S3), and RDS), as well as Apache Spark and PySpark.
- Experience with Azure cloud technologies: implemented build automation with Maven and release management with Azure DevOps Pipelines, and used Azure DevOps Boards for the Agile development process.
- Experience developing applications with Oracle, SQL Server, and MySQL relational databases and MongoDB (NoSQL).
- Experience mapping clients' business requirements to the best technical solutions; involved in defining project scope and finalizing project plans, keeping customers and stakeholders updated on project status and delivery timelines, and scheduling delivery management in coordination with client architecture teams and support groups.
- Experience implementing service standards that serve as benchmarks for excellent delivery per global standards and Service Level Agreements; led and monitored team members' performance to maintain excellence in project operations, managed compliance with specified timelines, and ensured adherence to quality process guidelines.
- Self-motivated team player with excellent interpersonal and communication skills, capable of performing in a fast-paced, results-driven environment.
- Able to handle multiple tasks and to work both independently and in a team; experienced in interacting with business and technology groups.
TECHNICAL SKILLS
Languages: Java, J2EE, PySpark, C#, .NET Core, Python, Scala, UML, SQL
Frameworks: Spring, Spring Boot, Spring Data, Hibernate and Struts
Web/J2EE Technologies: HTML/HTML5, CSS/CSS3, XML, JSON, Servlets, JSP, EJB, JDBC, JMS, JSF
JavaScript Libraries: ReactJS, Node.js, Angular, TypeScript
Scripting Languages: PL/SQL Scripting, JavaScript, RAML, YAML
Web/Application Servers: Apache Tomcat 8, WebLogic 12c, SAP NetWeaver 7.x AS, WildFly 10
Web Services: RESTful, SOAP
Enterprise Messaging: ActiveMQ, RabbitMQ, WebLogic JMS, JBoss JMS
Databases: Oracle 12c, SQL Server 2018, MySQL 5
NoSQL Databases: MongoDB, DynamoDB, Cosmos DB
Commodities - Power, Gas, Oil, Ags, Emissions: Trade Lifecycle Management, Risk Management, PhysOps, Cashflows, Confirmations, Invoices, Payments, Positions/P&L, Credit Risk, Hedging
Enterprise Application Integration: MuleSoft Platform, EIP, EAA
Cloud Platforms/Technologies: AWS, Azure, Azure Databricks, Azure Data Factory
SDLC Methodologies: Agile-Scrum, Linear Sequential Model, Waterfall Model
Build Tools: Ant, Maven, Gradle
Editors / IDE Tools: Eclipse, NetBeans, Adobe Dreamweaver, Notepad++, Sublime Text, Visual Studio, Visual Studio Code, IntelliJ, PyCharm
Source Control Tools: StarTeam, Git, CVS, Subversion
Database Tools: Toad, SQL Developer, SQL Server Management Studio, Studio 3T
Publishing Tools: MS Office.
Operating Systems: Windows, Unix/Linux.
PROFESSIONAL EXPERIENCE
Confidential
Lead Java Developer
Responsibilities:
- Complete ownership of GI Margin Tool production support, including triaging and analyzing ServiceNow incidents and assigning them to team members; designed, developed, and managed work and coordinated with clients and various stakeholders on requirements gathering.
- Prepared technical specification documents covering system architecture, design, and implementation details for all requirements; proposed solutions for new requirements, discussed them with the business product owner, and developed test-case notebooks for each requirement.
- Designed and developed trade data adapters that pull data from multiple trading systems on real-time notifications, using Spring, Hibernate, Java 8, and RabbitMQ.
- Worked on Mule ESB and developed Mule flows using connectors such as MongoDB, HTTP, HTTPS, JMS, and RabbitMQ for integration; created Mule flows, sub-flows, exception strategies, DataWeave transformations, DataMapper, and other activities.
- Assessed the existing on-premises system, discovering the existing ETL processes, databases, and reporting systems; designed and developed a migration plan for moving the existing dataset components and ETL processes from on-premises to Azure Data Warehouse.
- Designed and deployed Azure Databricks clusters, gauging the appropriate workload for cluster sizing; moved the heavy-lifting processes into Azure Databricks notebooks (ETL) and optimized the data loads in Azure Databricks.
- Refactored and rebuilt the existing ETL process to harness the Azure cloud MPP model; designed and developed an Azure Data Factory pipeline framework providing logging, notifications, and error handling to maintain consistency across the project.
- Designed batch-processing solutions using Azure Data Factory and Azure Databricks; checked data throughput from various source systems and tuned accordingly to meet the batch window.
- Maintained the balance between cost and performance of the ETL process; implemented a custom logging mechanism in Databricks notebooks and chose the correct data storage solution to meet technical and business requirements.
- Designed tables in Azure Data Warehouse using distribution models such as Hash, Round Robin, and Replicate to avoid data skew; identified the optimal data-ingestion method for batch-processing solutions.
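The distribution trade-off in the last bullet can be sketched in plain Java. This is illustrative only: Azure Synapse dedicated SQL pools spread every table across 60 distributions using an internal deterministic hash, so `hashCode()` below is a stand-in, and the row counts are made up.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of hash vs. round-robin table distribution across 60 distributions.
public class DistributionSketch {
    static final int DISTRIBUTIONS = 60;

    // Hash distribution: rows with the same key always land in the same
    // distribution, so a low-cardinality key causes data skew.
    static Map<Integer, Integer> hashDistribute(List<String> keys) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (String key : keys) {
            int d = Math.floorMod(key.hashCode(), DISTRIBUTIONS);
            counts.merge(d, 1, Integer::sum);
        }
        return counts;
    }

    // Round-robin: rows are dealt out evenly regardless of content,
    // avoiding skew at the cost of data movement on joins.
    static Map<Integer, Integer> roundRobinDistribute(int rowCount) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int i = 0; i < rowCount; i++) {
            counts.merge(i % DISTRIBUTIONS, 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // One dominant key ("GAS") collapses a hash-distributed table
        // into a single busy distribution...
        List<String> skewed = java.util.Collections.nCopies(600, "GAS");
        System.out.println("hash buckets used: " + hashDistribute(skewed).size());      // 1
        // ...while round-robin spreads the same 600 rows evenly.
        System.out.println("round-robin buckets: " + roundRobinDistribute(600).size()); // 60
    }
}
```

This is why a hash key is chosen on a high-cardinality, join-heavy column, while round-robin (or replication for small dimension tables) is the safer default when no such column exists.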
Lead Java Developer
Confidential
Responsibilities:
- Complete ownership of the TPT.DataAdapter project, including project scope and planning, development, architecture, and overall management; coordinated with clients and various stakeholders on requirements gathering, provided regular status updates, and prepared technical specification documents covering system architecture, design, and implementation details for all requirements.
- Designed and developed multiple adapters using Spring, Hibernate, Java 8, and RabbitMQ.
- Worked on Mule ESB, Mule API, Mule CloudHub, and RAML; developed Mule flows connecting to AWS S3 via the Mule AWS S3 connector, and created Mule flows, sub-flows, exception strategies, DataWeave transformations, DataMapper, and other activities.
- Created flows using Mule connectors such as MongoDB, HTTP, HTTPS, JMS, and RabbitMQ for integration, and developed various adapters in C# and Python for The HIVE system.
- Worked with AWS CloudFormation, Compute (EC2), and Storage (S3); scripted various CloudFormation tasks to automate AWS resources, including EC2 and S3.
- Helped the client architecture team with various AWS tasks during sprints as needed; complete ownership of creating and managing Azure DevOps Pipelines for all developed adapters and modules, automating build and release management.
- Complete ownership of Azure DevOps Boards: created and managed sprint planning and handled sprint backlogs for all owned projects and modules.
- Executed the TPT.DataAdapter project using the Agile Scrum methodology; managed development teams across multiple locations, distributed work among them, coordinated offshore and onshore teams, assisted the development team with subcomponents, and conducted code reviews to improve the performance of TPT.DataAdapter.
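A minimal sketch of the source-adapter pattern behind these bullets, in plain Java rather than the Spring/Hibernate/RabbitMQ stack the project actually used; all class names, field names, and message formats here are hypothetical.

```java
import java.util.Map;

// Each adapter converts one upstream trade feed's format into a single
// canonical record that downstream consumers share.
public class AdapterSketch {

    // Hypothetical canonical form all adapters converge on.
    record CanonicalTrade(String tradeId, String commodity, double quantity) {}

    interface TradeAdapter {
        CanonicalTrade adapt(String rawMessage);
    }

    // Adapter for a pipe-delimited feed: "id|commodity|quantity".
    static class PipeDelimitedAdapter implements TradeAdapter {
        public CanonicalTrade adapt(String rawMessage) {
            String[] f = rawMessage.split("\\|");
            return new CanonicalTrade(f[0], f[1], Double.parseDouble(f[2]));
        }
    }

    // Adapter for a key=value feed: "id=T1;cmdty=OIL;qty=5.0".
    static class KeyValueAdapter implements TradeAdapter {
        public CanonicalTrade adapt(String rawMessage) {
            Map<String, String> kv = new java.util.HashMap<>();
            for (String pair : rawMessage.split(";")) {
                String[] p = pair.split("=", 2);
                kv.put(p[0], p[1]);
            }
            return new CanonicalTrade(kv.get("id"), kv.get("cmdty"),
                    Double.parseDouble(kv.get("qty")));
        }
    }

    public static void main(String[] args) {
        // Two different source formats, one canonical output.
        TradeAdapter pipe = new PipeDelimitedAdapter();
        TradeAdapter kv = new KeyValueAdapter();
        System.out.println(pipe.adapt("T1|OIL|5.0"));
        System.out.println(kv.adapt("id=T1;cmdty=OIL;qty=5.0"));
    }
}
```

In the real project each adapter would sit behind a message listener and persist through the ORM layer; the point of the pattern is that adding a new trading system means adding one adapter, not touching consumers.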