Lead Engineer Resume
SUMMARY
- Seasoned engineer with experience designing and developing enterprise applications. Implemented all phases of the software life cycle and worked with users at all levels, in project environments ranging from small to highly complex.
- Strong object-oriented design skills with complete software development life cycle (SDLC) experience: requirements gathering, conceptual design, analysis, detailed design, development, mentoring, system and user acceptance testing, and documentation.
- Extensive hands-on experience with Big Data technologies: Hadoop, Spark, PySpark, Kafka, HDFS, Hive, Impala, Kudu, Sqoop, Cassandra, DynamoDB, HBase.
- Designed and architected end-to-end real-time analytics solutions using Big Data technologies, including Spark and Kafka.
- Hands-on experience implementing microservices and REST APIs using Spring Boot.
- Hands-on experience with continuous integration and continuous deployment using Jenkins, Git, Maven, Docker, and Kubernetes.
- Hands-on experience with AWS services including ECS, S3, Fargate, and DynamoDB.
- Experience tuning Hive, Spark, and Java applications.
- Extensive experience in JVM memory profiling, garbage collection tuning, and object reference analysis.
- Good exposure to SQL and Confidential PL/SQL; accomplished in testing frameworks using JUnit.
- Application development using Agile/Scrum and Test-Driven Development (TDD) methodologies.
- Developed enterprise applications using Java EE and Spring technologies (Integration, XD, Data, GemFire), Hibernate, JMS, MQ Series, Ajax, jQuery, AngularJS, JavaScript, JSF, Struts, JSP, HTML, DHTML, CSS, JBoss, WebLogic, Confidential App Server, and Confidential WebSphere App Server.
- Over 6 years in the financial domain, with hands-on experience working with Middle and Back Office Operations, Compliance, Legal, Product Control, and Finance groups.
- Experience designing and implementing scalable solutions using caching technologies such as GemFire and Confidential Coherence.
- Skilled in performance tuning and in developing high-throughput backend applications using messaging.
- Excellent client-handling and team-coordination skills, leading teams with Agile development practices including Scrum, TDD, BDD, and Extreme Programming.
- Excellent interpersonal skills, including the ability to coordinate and motivate team members.
- Working knowledge of Azure Data Lake, Azure Databricks, and GCP.
TECHNICAL SKILLS
Cloud Platform: AWS, Azure
Messaging: Kafka, MQ Series
DevOps Tools: Docker, Kubernetes, Ansible, Jenkins, SonarQube
Languages & Frameworks: Java, Python, Scala, Spark, Spring, Spring Boot
File Formats: Parquet, Avro, Protobuf, XML, JSON, CSV
Databases: Confidential, PostgreSQL, SQL, Impala, Hive, HDFS, Elasticsearch, InfluxDB
NoSQL: GemFire, DynamoDB, MongoDB
Domains: Finance (Back Office), Healthcare, and Engineering
PROFESSIONAL EXPERIENCE
Confidential
Lead Engineer
Responsibilities:
- Designed and implemented a pipeline to ingest data from ERP source systems using Kafka.
- Curated the ingested data from multiple source systems using Spark to bring it into a common data layer.
- Demonstrated product and delivery experience across all tiers of information management solutions, from data sourcing to end-state delivery and consumption.
- Designed and implemented microservices to deliver and publish data to consumers.
- Implemented Spring Boot microservices to process messages to and from the Kafka cluster (see the sketch below).
- Implemented a high-availability solution across the various microservices using ZooKeeper.
- DaaS: data is made available to consumers as REST APIs using Spring Boot and AWS API Gateway.
- Integrated with OAuth to provide secured access to the APIs.
- Created various CloudWatch alarms that send an Amazon Simple Notification Service (SNS) message when triggered.
- Created a versioned S3 bucket for backups, moved objects to Amazon Glacier for archiving, and used S3 for regular database backups and data snapshots.
- Designed DynamoDB tables and event processing using Lambda functions.
Environment: Hadoop, Hive, Impala, PySpark, Parquet, Java, Spring Boot, Spring REST, XML/JSON, Maven, Bitbucket, Kafka, SonarQube, Cucumber, Jenkins, JUnit, SAP, Salesforce, OAuth, AWS, S3, Fargate, ECS, API Gateway, DynamoDB
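For illustration, a minimal sketch of a Spring Boot Kafka consumer of the kind described above, assuming the spring-kafka dependency; the topic name, group id, and CommonLayerWriter interface are hypothetical, not details from this engagement.

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Service;

    /** Hypothetical component that writes curated records into the common data layer. */
    interface CommonLayerWriter {
        void write(String key, String payload);
    }

    @Service
    public class ErpIngestListener {

        private final CommonLayerWriter writer;

        public ErpIngestListener(CommonLayerWriter writer) {
            this.writer = writer;
        }

        // Topic and group id are illustrative assumptions, not from the resume.
        @KafkaListener(topics = "erp-source-events", groupId = "ingest-pipeline")
        public void onMessage(ConsumerRecord<String, String> record) {
            // Hand the raw ERP payload to the curation/common-data-layer component.
            writer.write(record.key(), record.value());
        }
    }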
Confidential
Architect/Lead
Responsibilities:
- Built an enterprise-wide data platform ingesting and aggregating data from various sources using Spark and HDFS.
- Designed and implemented microservices for adjustment, reconciliation, and validation.
- Implemented REST microservices using Spring Boot; generated metrics with method-level granularity and persisted them using Spring Boot Actuator (see the sketch below).
- Consumed messages from source systems using Spark Streaming and ingested them into the Data Hub.
- DaaS: data is made available to consumers through the Apigee Gateway.
- Implemented a high-availability solution across the various microservices using ZooKeeper.
- Implemented a CI/CD pipeline using Jenkins, Git, and Docker.
Environment: Hadoop, HDFS, Spark Streaming, Java, Spring, Spring Cloud, Spring Boot, Kafka, ZooKeeper, MongoDB, Elasticsearch, Docker, Ansible, Fluentd, Mirus Replicator, Maven, Jenkins, Git, SonarQube
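As an illustration, a minimal sketch of a Spring Boot REST endpoint with method-level metrics of the kind mentioned above, assuming Spring Boot Actuator with Micrometer on the classpath; the endpoint path, metric name, and placeholder lookup are hypothetical.

    import io.micrometer.core.instrument.MeterRegistry;
    import io.micrometer.core.instrument.Timer;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class ReconciliationController {

        private final MeterRegistry registry;

        public ReconciliationController(MeterRegistry registry) {
            this.registry = registry;
        }

        // Endpoint path and metric name are illustrative assumptions.
        @GetMapping("/reconciliations/{id}")
        public String getReconciliation(@PathVariable String id) {
            Timer.Sample sample = Timer.start(registry);
            try {
                return "reconciliation " + id; // placeholder for the real lookup
            } finally {
                // Per-method timing, visible through the Actuator /actuator/metrics endpoint.
                sample.stop(registry.timer("reconciliation.get"));
            }
        }
    }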
Confidential
Sr. Java Developer
Responsibilities:
- Managed cross-border teams to deliver solutions globally.
- Acted as a liaison with leadership, project teams, QA, and compliance to support integrations.
- Worked as a technical architect for the Enterprise Data Grid, supporting integrations with consuming applications.
- Ingested master data and transactional data from various ERP systems.
- Harmonized and curated data from 20 ERP systems (SAP, JD Edwards, and BPICS) into one standard enterprise model.
- DaaS: data is made available to consumers through the Apigee Gateway.
- Used statistical forecast models for demand and production planning, and sent the results back to the ERPs.
- Implemented a CI/CD pipeline.
Environment: Java, Spring, Spring XD, Spring REST, GemFire, Kafka, Maven, Bitbucket, JIRA, JDE, SAP, Azure, Apigee, XML/JSON, Jenkins, Git, SonarQube, Cucumber, JUnit
Confidential
Sr. Java Developer
Responsibilities:
- Designed an ETL process to load trade, position, finance, account, balance, and journal data.
- Designed and developed the server-side data model using Spring Data GemFire.
- Loaded batch data into GemFire through Spring XD and applied updates from the UI through Spring Data.
- Developed JSON RESTful web services for sending and consuming relationship information from Confidential CRM systems.
- Emphasized continuous integration using SVN, Jenkins, Maven, and shell scripts; automated post-release deployment to servers; prepared images using Docker with a Nexus repository; and used monitoring tools such as Nagios and the ELK stack.
- Performed end-to-end testing of new trading products: commercial paper and MBS trading.
- Designed and developed a service layer for users and processes to access data and applications.
- Designed and developed web APIs.
- Built a centralized logging system with Logstash, Elasticsearch, and Kibana.
- Cached authorizations per user using Spring Data GemFire (see the sketch below).
- Participated in all phases of the development life cycle: functional and technical design, development, unit testing, technical documentation, usage documentation, and maintenance.
- Led a "Continuous Delivery" project, streamlining the development workflow, integrating automated QE validation, and delivering standardized, Docker-based releases to Operations for deployment.
- Designed and developed a process for creating accounts in external systems such as BPS, ICI, and Gloss, as well as internal systems, through JMS and MQ using enterprise integration patterns.
- Designed the workflow for creating counterparty and trading account information, from Sales through account opening, notifying all groups and applications and integrating all systems.
- Built a process to maintain counterparty KYC, legal entity information, and eligibility documents and keep them up to date.
- Developed an interactive user dashboard using AngularJS, tightly integrated with the application server model.
- Integrated with the Confidential Alert system for global SSIs using XML, with event processing for alert changes.
Environment: Spring, Spring Boot, RESTful APIs, jQuery, JBoss 7, K2, Spring Data, Apache POI, Jenkins, Git, Docker, Maven, MQ, JMS, Eclipse, Perl, Confidential, SQL, PL/SQL, Confidential Warehouse Builder, and SSIS.
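A minimal sketch of per-user authorization caching of the kind described above, using Spring's cache abstraction; the cache name, service, and EntitlementRepository interface are hypothetical, and in the actual project the cache would be backed by a GemFire region via Spring Data GemFire with caching enabled in the application configuration.

    import java.util.Set;
    import org.springframework.cache.annotation.Cacheable;
    import org.springframework.stereotype.Service;

    /** Hypothetical lookup against the entitlement store. */
    interface EntitlementRepository {
        Set<String> loadEntitlements(String userId);
    }

    @Service
    public class AuthorizationService {

        private final EntitlementRepository repository;

        public AuthorizationService(EntitlementRepository repository) {
            this.repository = repository;
        }

        // Executed only on a cache miss; later calls for the same user are served from the cache.
        @Cacheable(cacheNames = "userAuthorizations", key = "#userId")
        public Set<String> authorizationsFor(String userId) {
            return repository.loadEntitlements(userId);
        }
    }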
Confidential, NY
Sr. Java Developer
Responsibilities:
- Liaised and coordinated across the organization with key business owners, executives, and subject matter experts to explore and redesign key business processes.
- Developed ADF pages for real-time posting of selected vacancies to various job sites using MQ.
- Defined and executed business workflows using JBoss Business Process Management (jBPM).
- Involved in workflow development and customization, including escalation of notifications and redirection of approvals using AME and workflow.
- Analyzed reporting requirements and drew up specifications for them.
Environment: JDeveloper, Java, Confidential ADF, JBPM, MQ Messaging, Workflow, AME, HR, AP, GL, Confidential 9i, SQL, PL/SQL, Toad, SQL Loader