- A dynamic and goal-oriented Solution Architect and Technology Lead with over 17 years of solid experience in requirement gathering, estimation, analysis, design, development, and testing of web-based portal applications using Salesforce CRM, Java SE/Java EE, Big Data (Hadoop), cloud infrastructure, Amazon Web Services, MuleSoft ESB, and SailPoint IIQ.
- Experienced in leading Agile development teams to deliver consistently on time with low defect rates.
- Able to work with a team of accomplished developers to deliver a robust, high-performance set of enterprise integration services.
- Interface with internal QA and PM teams to coordinate releases of thoroughly tested new and re-engineered components into production.
- Excellent understanding of the Hadoop ecosystem, including HDFS, MapReduce, Hive, Pig, Storm, Kafka, YARN, HBase, Oozie, ZooKeeper, Flume, and Sqoop-based Big Data platforms.
- Expertise in designing and implementing Big Data solutions in the Banking, Retail, and E-commerce domains.
- Experience in creating Use Case, Class, and Sequence diagrams using Microsoft Visio and Rational Rose during the design phase of the SDLC.
- Experience with multiple databases, including Oracle 10g Express, MySQL, and MongoDB, and in writing stored procedures using PL/SQL.
- Experience in using IBM WebSphere Application Server 7.0 (WAS), IBM WebSphere Portal Server 8.0, Tomcat Web Server 6.0, JBoss Application Server 3.2.7, BEA WebLogic 9.0/10.0, Oracle WebLogic Server 11g (10.3.6), IBM PIM, IBM DataPower, Service-Oriented Architecture (SOA), and RESTful and SOAP web services.
- Experience in using MVC frameworks and libraries: Spring, Spring Batch, JSF, Portlets, Angular.js, Node.js, and React.js.
- Hands-on experience with ETL (Extract-Transform-Load) tools
- Good understanding of middleware: TIBCO, DataPower, and .NET.
- MuleSoft and Salesforce Integration Architect - LPL Financials
- Designed and developed the bidirectional sync between Salesforce CRM and the ClientWorks database.
- Designed and developed batch processing of data from Salesforce to the ClientWorks database and from ClientWorks back to Salesforce.
- Designed and developed the invocation of the .NET REST Create Client ID service from Mule ESB using Transformers and Set Payload.
- Designed the Mule integration between LPL ClientWorks and Redtail CRM for syncing Prospect, Client, and Contact data.
- Designed the Mule integration between LPL ClientWorks and Microsoft Dynamics CRM for syncing Prospect, Client, and Contact data.
- Installed the Salesforce bidirectional sync package and enabled the bi-di flag for CRM users.
- Configured the page layouts for the Salesforce Client and Contact objects.
- Configured the Azure Service Bus queue in Salesforce.
- Developed Salesforce SOQL queries to retrieve a user's bi-di access flag.
- Acted as a Salesforce Admin, creating users in the dev, int, and prod orgs.
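The SOQL lookup of the bi-di flag can be sketched as follows; this is a minimal illustration, and the custom field name Bi_Di_Access__c is an assumed placeholder, not the actual org schema:

```java
// Hypothetical sketch: build the SOQL used to read a user's bi-di flag.
// The custom field name Bi_Di_Access__c is an assumption for illustration.
public class BidiFlagQuery {
    /** Builds the SOQL for one user's bi-di access flag, escaping the id. */
    public static String forUser(String userId) {
        // Escape backslashes and quotes so the id is safe inside the SOQL literal.
        String safeId = userId.replace("\\", "\\\\").replace("'", "\\'");
        return "SELECT Id, Bi_Di_Access__c FROM User WHERE Id = '" + safeId + "'";
    }
}
```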
Senior Software Architect
- Worked on the Mule ESB integration design to fetch Yale Workday data into the Yale Master Data Repository.
- Worked on the design of the NetID Management System in Spring and Java 1.8.
- Involved in the design of Identity Access Management systems using SailPoint IdentityIQ.
- Involved in the design of Talend ETL jobs for switching databases.
Confidential, Charlotte, NC
- Responsible for the design and development of the Scorecard System for obligors using Java technologies.
- Responsible for the design of the Scorecard database.
- Provided technical leadership to other software developers; specified, designed, and implemented modest changes to the existing software architecture to meet changing needs.
- Proposed, implemented, and maintained standards for administration, maintenance, and support of the Salesforce, Contact Center, and Billing platforms.
- Collaborated with clients and business analysts to translate business requirements into technical requirements.
- Led the development of the VSIMS system for property appraisers.
- Led the development of batch processing using Spring Batch.
- Responsible for loading Agricultural customer's Confidential warehouse data into HBase.
- Created HBase tables to store variable data formats of input data coming from different portfolios.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data preprocessing and loading into HDFS.
- Migrated from the Cloudera CDH 4 distribution to CDH 5.
- Involved in preparing ad hoc reports per end users' requirements by running Hive queries.
- Implemented advanced Spark procedures, such as text analytics and processing, using its in-memory computing capabilities.
- Implemented the Apache YARN configuration for managing applications in a distributed manner.
- Installed and configured SailPoint IdentityIQ systems for Confidential.
- Designed the SailPoint workflows for Joiners, Movers, and Leavers at Confidential.
- Set up Azure Active Directory in the cloud.
- Used REST calls to activate a user in Azure AD and to reset passwords in Azure AD.
- Integrated the Azure AD with Sailpoint IIQ.
- Designed the Talend ETL job to move data from the Workday database to the Enterprise Identity Database.
- Designed MuleSoft integration flows to integrate the Scorecard System and the Property Appraiser System, sending credit score details to the Property Appraiser System over MuleSoft queues.
- Designed MuleSoft integration flows to generate reports in the Property Appraiser System from multiple data sources.
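The Azure AD REST calls above can be sketched with the JDK HTTP client. This assumes the Microsoft Graph shape of PATCH /users/{id} with a passwordProfile body; the user id and token are placeholders, and the real integration ran through SailPoint IIQ:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch of the kind of REST call used to reset an Azure AD password via
// Microsoft Graph. The user id and bearer token are placeholders; only the
// request is built here (no network call is made).
public class AzureAdPasswordReset {
    public static HttpRequest buildResetRequest(String userId, String token, String newPassword) {
        String body = "{\"passwordProfile\":{"
                + "\"password\":\"" + newPassword + "\","
                + "\"forceChangePasswordNextSignIn\":true}}";
        return HttpRequest.newBuilder()
                .uri(URI.create("https://graph.microsoft.com/v1.0/users/" + userId))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .method("PATCH", HttpRequest.BodyPublishers.ofString(body))
                .build();
    }
}
```

Building the request separately from sending it keeps the integration logic testable without hitting the live directory.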
Environment: Java 1.7, Eclipse, Maven Plugins, Ajax-jQuery, Spring Framework, Hibernate 5.0, HTML5, CSS3, JSTL Tags, Teradata, Oracle DB, PL/SQL, Apache Hadoop Big Data, MapReduce, HBase, ETL, SailPoint IIQ, Talend ETL, Apache YARN, Apache Spark, MuleSoft Studio 6.4
Confidential, Charlotte, NC
- Developed technical solutions to build the Java architectural framework for the PIM Portal.
- Managed architect team to develop architectural components and services.
- Worked with management to determine cost estimates.
- Provided support in developing project roadmaps.
- Enforced best practices and utilized code quality tools.
- Saved the firm thousands of dollars and man-hours by automating almost 80% of manual processes using declarative (Workflows, Processes, etc.) and programmatic (20 Apex triggers/classes and 12 Visualforce pages) tools.
- Setting up the Salesforce org and users; identifying the steps to set up and maintain a user.
- Setting up role hierarchies, profiles, permission sets to grant appropriate permissions and create sharing models.
- Importing, exporting and migrating data to/from Salesforce.
- Automating manual processes using workflows, visual workflows, approval processes, Apex and Visualforce.
- Creating validation rules, formulas to solve business requirements.
- Determining the appropriate reporting/analytic tool to meet complex reporting requirements.
- Used Dynatrace to monitor memory use and garbage-collection activity to enhance application performance.
- Designed the Pet Locking feature in a multi-threaded environment.
- Designed and developed the server-side logic to fetch data using XQuery from the IBM MDM data set and display it in the worklist pet display portlet.
- Designed and developed the Content Pet Portlet, a data-entry screen for the Item Primary Hierarchy attributes and Blue Martini attributes of an item.
- Designed and developed the JSON RESTful web service to save data in the IBM PIM database and to approve the content pet in the IBM PIM database.
- Designed and developed the JSON RESTful web service to perform IPH mapping for an item in the IBM PIM database.
- Designed simple and complex MapReduce programs in Java for data analysis on different data formats.
- Successfully migrated a legacy application to a Big Data application using Hive/Pig/HBase at the production level.
- Developed a MapReduce pipeline for feature extraction and tested the modules using MRUnit.
- Optimized MapReduce jobs to use HDFS efficiently via various compression mechanisms.
- Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
- Responsible for performing extensive data validation using Hive.
- Populated HDFS and Cassandra with huge amounts of data using Apache Kafka
- Designed the integration workflows for integrating the PIM database and Content Acquisition Systems using the Mule ESB framework.
- Extensively used Mule components, which includes File, SMTP, FTP, SFTP, JDBC Connector, and Transaction Manager.
- Experienced with the MuleSoft Anypoint API platform for designing and implementing Mule APIs.
- Responsible for developing RESTful/SOAP web services in Mule ESB based on SOA architecture.
- Created the Mule ESB artifact, configured the Mule configuration files, and deployed the application.
- Deployed Mule ESB applications to MMC (Mule Management Console).
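The MapReduce work above was tested with MRUnit; that style is simplest when the mapper's parsing logic is a pure function the real Mapper.map() wraps. A hypothetical sketch, assuming a "portfolio,amount" CSV record format:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

// Core of a hypothetical mapper: parse a "portfolio,amount" CSV line into a
// (portfolio, amount) pair. Keeping this as a pure function lets it be
// unit-tested without a Hadoop cluster (the idea behind MRUnit-style tests);
// the real Mapper.map() would wrap the pair in Text/LongWritable types.
public class PortfolioMapperCore {
    /** Returns the (key, value) pair for one input line, or null for bad rows. */
    public static Map.Entry<String, Long> parse(String line) {
        String[] parts = line.split(",");
        if (parts.length != 2) return null;       // skip malformed records
        try {
            return new SimpleEntry<>(parts[0].trim(), Long.parseLong(parts[1].trim()));
        } catch (NumberFormatException e) {
            return null;                          // skip non-numeric amounts
        }
    }
}
```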
Environment: Java 1.7, RAD 8.0, IBM WebSphere Portal Server 8.5 Express, JAutoDoc, Maven Plugins, Ajax-jQuery, Spring Portlets (JSR 286), Spring Framework 4.0, Hibernate 5.0, HTML5, CSS3, JSTL Tags, IBM MDM-PIM, Apache Hadoop Big Data 2.7, Mule ESB
- Prepared the project plan, performed effort estimation for the development of software components, and mentored and groomed team members.
- Providing the technical vision and guidance to the organization.
- Designed the Pistol Permit Re Website and the Pistol Permit Re Portlet domain objects using RSA.
- Designed the Pistol Permit Processing Portlet.
- Developed the Pistol Permit Re System using JSF.
- Developed the back-end logic for the Pistol Permit Re Portlet using the object-relational mapping framework JPA 1.2.
- Developed the Pistol Permit Processing System.
- Designed and developed the Soap webservice for DMV.
- Coded shell scripts to delete pistol permit data older than ninety days from the database using JDBC.
- Coded batch scripts to delete pistol permit data older than ninety days from the database using JDBC.
- Designed and developed the Review Letter PDF Generation Servlet.
- Developed audit framework to audit the request and response objects
- Developed the Assault Weapon Registration system using Spring 3.1, deployed on Oracle WebLogic Application Server 11g (10.3.6).
- Coded a named query to search for pistol permits in the database.
- Developed the Audit Interceptor.
- Developed the Activity Log Interceptor.
- Developed JUnit tests and incorporated test-driven development.
- Developed the Pistol Permit Search Engine using Apache Lucene
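The ninety-day purge scripts above boil down to a cutoff-date computation plus a parameterized DELETE. A sketch; the table and column names (PISTOL_PERMIT, CREATED_DATE) are illustrative, not the real schema:

```java
import java.time.LocalDate;

// Sketch of the ninety-day purge: compute the cutoff date and the
// parameterized DELETE used by the JDBC cleanup scripts. Table and column
// names are invented for illustration.
public class PermitPurge {
    public static final String DELETE_SQL =
            "DELETE FROM PISTOL_PERMIT WHERE CREATED_DATE < ?";

    /** Cutoff: anything strictly older than ninety days before 'today' is purged. */
    public static LocalDate cutoff(LocalDate today) {
        return today.minusDays(90);
    }
    // In the real script this would run as:
    //   try (PreparedStatement ps = conn.prepareStatement(DELETE_SQL)) {
    //       ps.setDate(1, java.sql.Date.valueOf(cutoff(LocalDate.now())));
    //       ps.executeUpdate();
    //   }
}
```

Using a bind parameter rather than string concatenation keeps the purge safe and lets the same statement serve both the shell and batch script variants.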
Confidential, Tampa, Florida
Senior Software Engineer
- Set up the environment for portal development, including installation and configuration of Maven, RAD, and IBM WebSphere Portal Server.
- Designed and developed the GUI for the Notification Portlets using HTML, the IJ tag library, Ajax-jQuery, and the JSF framework.
- Developed the Page Code and Master Data Delegate Component
- Designed and developed the Data Bean and Request Bean, Service Name Bean for notification Portlets
- Designed and developed the servlet to invoke the business service layer.
- Developed the Assembler for Notification Portlets.
- Configured the web.xml, faces-config.xml and Portlet.xml
- Developed the Women Health Facility Notification Portlet using JSF components.
- Developed the Highway Drug Interdiction Portlet.
- Developed the Weather Update Portlet for NYSP Troops.
- Developed the WSDL for inter-portlet communication.
- Developed the Hit Confirmation Request and Hit Confirmation Response Portlets using inter-portlet communication.
- Developed the inbox stylesheets for the Notification Portlets.
- Developed Ajax-jQuery methods for hiding and showing portlet fields using jQuery's blind/trigger feature, and for generating the dynamic header section on dropdown changes.
- Troubleshot the request and response XML generated by the portal application and the Business Service Layer.
- Refactored the Notification Assemblers.
- Effectively used Maven for automated builds in Windows Environment
- Deployed the Notification Portlets In Portal Server in Sandbox Environment
- Set up the environment for business service development, including installation and configuration of Maven, RAD, and IBM WebSphere Application Server.
- Configured WAS 6.1 for Queue Connection Factories, Queue Managers, JMS Queues, WebSphere MQ, and Channels.
- Configured TBNotificationProcessor MDB with Queue and IJPInboxMDB with Inbox Queue using Activation Specification
- Configured WAS JVM properties and shared libraries
- Configured datasources on WAS
- Developed the XSLT for transforming the coarse-grained request into the fine-grained request.
- Designed the schema for the DMV Photo Request and Response XML.
- Developed the XSLT for transforming NIEM XML to SOAP XML and vice versa.
- Configured the WAS server for calling SOAP over HTTPS.
- Built an interoperable SOAP IJPRegistrationDomain web service with JSR 109, which is based on JAX-RPC, so that DataPower can communicate with the business services deployed in WAS using SOAP APIs.
- Developed the client-side API to call the DMV photo web service deployed on DataPower X150 from the Business Service deployed in WAS using SOAP APIs.
- Made extensive use of RFHUtil for analyzing the message content present in the queues.
- Developed the NYAlert business logic and the NYAlert Message-Driven Bean.
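The XSLT transformations above (coarse-grained to fine-grained requests, NIEM XML to SOAP XML) follow a standard stylesheet pattern. A minimal runnable sketch using the JDK's built-in XSLT processor; the element names (coarseRequest/fineRequest) are invented for illustration:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// Minimal XSLT sketch using the JDK's built-in processor. The element names
// are invented; the real stylesheets mapped NIEM XML to SOAP XML and
// coarse-grained requests to fine-grained ones.
public class XsltSketch {
    private static final String XSLT =
        "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:output method='xml' omit-xml-declaration='yes'/>"
      + "<xsl:template match='/coarseRequest'>"
      + "<fineRequest><id><xsl:value-of select='requestId'/></id></fineRequest>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    /** Applies the stylesheet to the given XML and returns the result. */
    public static String transform(String xml) {
        try {
            StringWriter out = new StringWriter();
            TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(new StringReader(XSLT)))
                    .transform(new StreamSource(new StringReader(xml)),
                               new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```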
Environment: Java 1.6, RAD 7.5, IBM WebSphere Portal Server, Single Sign-On (LDAP), Mylyn Plugins, JAutoDoc, Maven Plugins, Ajax-jQuery, Hudson Server, Apache Maven 2.2.1, HTML, Mozilla Firebug, JUnit, Selenium IDE, Hibernate 3.0, Tortoise SVN, Oracle ADF, Oracle BPM, Windows XP, Oracle 10.2.0 Express, Toad, Nexus Repository, JSF, UNIX, Altova XMLSpy 2011, IBM WebSphere Application Server 6.1/7.0, XSLT 2.0, IBM WebSphere Broker, IBM WebSphere MQ, Broker Toolkit, RFHUtil, Message Driven Beans, JMS, SOAP Web Services (JSR 109), SOAP UI 3.5, IBM DataPower X150, NIEM XML, TDD, SOAP XML, XSD, Servlet, JDBC, Oracle WebLogic Application Server 11g