Senior Software Developer Resume
Southlake, TX
SUMMARY
- Over 14 years of experience delivering enterprise solutions on medium and large multi-location Java/J2EE projects.
- In-depth exposure to OOAD: architecting, modeling and designing artifacts to satisfy complex and unusual business problem statements.
- Expertise in evaluating, selecting and integrating innovative technologies.
- Saved $2 million for Client by re-modeling expensive middleware written in C to Java.
- Pioneered the use of tools / plugins that helped the team achieve higher productivity / velocity.
- Proven history of identifying project risks and implementing remediation strategies to ensure project success.
- Strong interpersonal skills and ability to project manage and work with cross-functional teams
- Strong experience with class diagrams, sequence diagrams, component diagrams and activity diagrams.
- Strong Programming Skills in designing and implementation of multi-tier applications using Java, J2EE, JDBC, JSP, JSTL, HTML, JSF, Struts, Spring Boot, JavaScript, Servlets, JavaBeans, CSS, EJB, XSLT, JAXB.
- Specialized in frameworks like Struts, Spring, Spring batch and Spring Boot.
- Extensive experience in developing Microservices using Spring Boot
- Experience in developing Web Services - JSON, SOAP, WSDL, REST.
- Developed DTDs, XSD schemas for XML (parsing, processing, and design).
- Experience in integration testing tools like JMeter.
- Experience in performance testing tools like AppDynamics and LoadRunner
- Experience in implementing Java EE design patterns such as MVC, Singleton, Session Facade, DAO, DTO, and Business Delegate in the development of Multi-Tier distributed Enterprise Applications.
- Wide experience in analyzing, designing and developing critical systems, experienced in web enabling of client/server applications.
- Strong knowledge of TDD (Mockito, spies, JUnitParams, etc.) and BDD (primarily Specification by Example for interaction with QA and BA teams)
- Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency and workload management
- Experienced Hadoop developer with strong knowledge of HDFS and MapReduce frameworks.
- Working experience in the following Big data frameworks/Tools:
- Apache Pulsar - Experience in topic management and Pulsar operations
- Kafka - Experience in writing producer and consumer layers
- Flume - Configured multiple agents in a cluster; worked on multiple source (MQ, Kafka, etc.) and sink (HDFS, MQ, Kafka) configurations
- Pig - Experience in writing Pig scripts for compressing data in HDFS
- Hive - Created Hive tables over JSON data in HDFS
- HBase - Created HBase tables from HDFS data
- Sqoop - Imported data from Oracle DB to HDFS
- Talend - Created Talend jobs to read from Hive, load Oracle tables and FTP files to a mainframe server
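As an illustrative sketch of the Sqoop usage above (the connection string, credentials, table and target directory are hypothetical, not from any actual engagement):

```shell
# Hypothetical Sqoop import: Oracle table -> HDFS directory
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user -P \
  --table ORDERS \
  --target-dir /data/orders \
  --num-mappers 4
```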
TECHNICAL SKILLS
Programming Languages: Java, Scala, C, VB, HTML, JavaScript, Korn Shell Script
Technologies & Frameworks: Apache Hadoop, Apache HBase, Apache Kafka, Apache Pulsar, Pig, Hive, Apache Flume, RabbitMQ, Azkaban, Redis, JSP, Servlets, EJB 2.0, Struts 1.1, Spring 3.0.7, Spring Batch, OJB, JMS, Ehcache, Apache Quartz, PL/SQL, Axis2, jQuery, Hibernate
Databases: MongoDB, Oracle, DB2
Operating System: UNIX, DOS, Windows 2003
Tools: Ant, Maven, VSS, CVS, SVN, TFS, Rational Rose, Amateurs UML, Microsoft Visio, PuTTY, WinSCP, Talend
Testing Tools: LoadRunner, JUnit, JProbe, JProfiler, JMeter, AppDynamics, SoapUI
Methodologies: Agile, Waterfall
IDE: RAD6.x/7.x, Eclipse 3.x, Dreamweaver, Toad
Agile Tools: Rally, Version One, JIRA
Application & Web servers: WAS 7.x, JBoss, Tomcat
PROFESSIONAL EXPERIENCE
Senior Software Developer
Confidential – Southlake, TX
Responsibilities:
- Actively participated in design and requirements discussions (always ready to whiteboard)
- Designed and developed key automation module which was the core component interacting with Apache Pulsar
- Migrated the whole Kafka environment to Apache Pulsar in AWS cloud platform.
- Designed and developed Spring Boot micro service to collect metrics from Apache Pulsar
- Developed backend components to support the users to onboard into Pulsar
- Integrated application with visualization / monitoring tools like Grafana and Prometheus
- Integrated with Filebeat and Logstash for log monitoring
- Developed Spring Boot microservices and REST APIs for a centralized portal to facilitate seamless producer/consumer onboarding and automate application management
- Developed server-side application to interact with database using Spring Boot and Hibernate.
- Troubleshot and implemented seamless producer/consumer onboarding and managed data from multiple VZ applications
- Monitored health metrics and key operational metrics related to producer/consumer applications
- Used Postman and Swagger to test the RESTful API for HTTP requests such as GET, POST, and PUT
- Implemented Log4j logging across the project to improve visibility and debugging.
- Developed test classes in JUnit for unit testing.
- Used Git as the version control tool for merging branches and resolving conflicts
- Ensured data governance and data streaming / aggregation from applications, after review with security / legal teams
Environment: Java 8, Spring Boot, Microservices, Apache Pulsar, Kafka, Grafana, Prometheus, Filebeat, Logstash, AWS, Jenkins, Git
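The Grafana/Prometheus integration above generally relies on a Prometheus scrape configuration of roughly this shape (the job name, metrics path and target are hypothetical placeholders, not values from the actual project):

```yaml
# Hypothetical Prometheus scrape config for a Spring Boot metrics service
scrape_configs:
  - job_name: pulsar-metrics                  # hypothetical job name
    metrics_path: /actuator/prometheus        # typical Spring Boot Actuator endpoint
    static_configs:
      - targets: ['metrics-service:8080']     # hypothetical host:port
```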
Big Data Developer
Confidential
Responsibilities:
- Analyzed the requirements and participated in high-level (HL) and low-level (LL) design discussions
- Involved in capturing the business requirements, design, development and testing of the application
- Configured CI/CD for the project: code management using Git, code migration using Jenkins
- Designed and developed multiple jobs / processes to ingest, consume, transform and integrate data between pipelines.
- Developed scheduling events and processes using ZENA.
- Developed ETL framework using Shell Script, Python and Hive (including daily runs, error handling, and logging) to glean useful data and improve vendor negotiations
- Conducted in-depth research on Hive to analyze partitioned and bucketed data
- Developed HiveQL queries to read from and insert into multiple tables
- Performed root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Designed and developed Spring Batch processing with remote chunking
- In remote chunking, the controller layer is deployed on one node and interacts with agents deployed on separate worker nodes
- Configured and wrote scripts to invoke the controller and agent jobs on different nodes
- Configured Spring XML configurations for the controller and agents
- Developed the SQL query backing the Spring Batch ItemReader
- Converted Java objects to JSON for further processing
- Designed and developed Kafka Producer layer
- Designed a fallback table to store records that were not successfully written downstream
- Used Spring Batch job repository tables to store job-related metrics
- Developed daily and historic job flows; created Spring Batch steps that could be reused across jobs
- Suggested and implemented the use of property files and enums, which reduced lines of code and improved maintainability
- Verified MQ messages through MQ Visual Browser
- Configured Flume in the Hadoop cluster with a Kerberized Kafka source and an HDFS sink
- Created Hive tables over JSON data using a SerDe
- Created a Pig script to compress data (Snappy) in HDFS
- Configured JAAS to produce/consume messages from Kerberized Kafka
- Optimized the Flume configuration to process 1M records within 15 minutes
- Wrote JUnit test cases with Mockito and Spy
Environment: Java 8, MyEclipse, Spring Batch, JMS MQ, Apache Kafka, Apache Flume, Hadoop
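The Kafka-source-to-HDFS-sink Flume setup described above follows the standard Flume agent layout; a minimal sketch (agent, channel, broker and path names are hypothetical) might look like:

```properties
# Hypothetical Flume agent: Kafka source -> memory channel -> HDFS sink
agent1.sources = kafkaSrc
agent1.channels = memCh
agent1.sinks = hdfsSink

agent1.sources.kafkaSrc.type = org.apache.flume.source.kafka.KafkaSource
agent1.sources.kafkaSrc.kafka.bootstrap.servers = broker1:9092
agent1.sources.kafkaSrc.kafka.topics = events
agent1.sources.kafkaSrc.channels = memCh

agent1.channels.memCh.type = memory
agent1.channels.memCh.capacity = 10000

agent1.sinks.hdfsSink.type = hdfs
agent1.sinks.hdfsSink.hdfs.path = /data/events/%Y-%m-%d
agent1.sinks.hdfsSink.hdfs.fileType = DataStream
agent1.sinks.hdfsSink.channel = memCh
```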
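The property-file-plus-enum pattern mentioned above (replacing scattered string literals with enums and externalized properties) can be sketched roughly as follows; the class, step names and keys are hypothetical, stdlib only:

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

public class BatchConfig {
    // Step names as an enum instead of scattered string literals
    // (names here are hypothetical, for illustration only).
    public enum Step { DAILY_LOAD, HISTORIC_LOAD, FAILBACK_RETRY }

    private final Properties props = new Properties();

    // In the real job this would load from a .properties file on the classpath;
    // a String source keeps the sketch self-contained.
    public BatchConfig(String propertySource) {
        try {
            props.load(new StringReader(propertySource));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Look up a per-step setting, e.g. "DAILY_LOAD.chunk.size"
    public String settingFor(Step step, String key) {
        return props.getProperty(step.name() + "." + key);
    }

    public static void main(String[] args) {
        String source = "DAILY_LOAD.chunk.size=500\nHISTORIC_LOAD.chunk.size=5000\n";
        BatchConfig cfg = new BatchConfig(source);
        System.out.println(cfg.settingFor(Step.DAILY_LOAD, "chunk.size")); // prints 500
    }
}
```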
Lead Java Developer
Confidential
Responsibilities:
- Analyzed the requirements and participated in high-level (HL) and low-level (LL) design discussions
- Involved in capturing the business requirements, design, development and testing of the application
- Designed and developed the presentation layer, including standards-compliant, cross-browser pages built with OpenLaszlo, JSP and HTML
- Designed and developed the business layer, including Spring configuration XML, the action layer, DTOs and the service layer
- Designed and developed the database layer, including Hibernate configuration XML, Hibernate entity classes and DAO classes
- Worked on optimization and performance tuning of a critical aircraft renumbering flow that was causing deadlock issues; reduced end-to-end time from 2 minutes to 4 seconds
- Used AppDynamics to monitor and record system performance
- Developed Message receiving framework to accept JMS messages from queue.
- Worked on bug fixing and enhancements on change requests
- Used the Spring Web MVC framework to apply the MVC pattern to the web application.
- Used Spring ORM to integrate the Spring Framework with Hibernate and JPA.
- Used Spring AOP module to handle transaction management services for objects in any Spring-based application.
- In TDD, used JUnitParams in addition to Mockito and spies
Environment: Eclipse Kepler, Oracle, JMS, Spring, Hibernate, Jenkins, JMeter, AppDynamics, DynaTrace, Maven, SQL Developer, Tomcat, WAS 7, OpenLaszlo
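The Spring ORM / Hibernate wiring described above usually takes a shape like the following Spring 3-era XML fragment (bean ids, the `dataSource` bean and the mapping file name are hypothetical, for illustration only):

```xml
<!-- Hypothetical Spring ORM + Hibernate wiring; assumes a "dataSource" bean is defined elsewhere -->
<bean id="sessionFactory"
      class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
    <property name="dataSource" ref="dataSource"/>
    <property name="mappingResources">
        <list><value>Aircraft.hbm.xml</value></list>
    </property>
</bean>

<bean id="transactionManager"
      class="org.springframework.orm.hibernate3.HibernateTransactionManager">
    <property name="sessionFactory" ref="sessionFactory"/>
</bean>
```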