Sr Mulesoft Integration Developer Resume
Salisbury, NC
SUMMARY:
- 10 years of extensive experience in the complete Software Development Life Cycle (SDLC), including requirements gathering, system analysis, design, coding, testing, deployment, and maintenance of software applications.
- 4+ years of experience with Mulesoft ESB Enterprise edition, including experience working with Mulesoft API Manager and RAML.
- 6+ years of experience on IBM Mainframe applications using COBOL, JCL/Procs, VSAM, DB2, IMS DB/DC, CICS, DB2 Stored Procedures, TSO/ISPF, MQ Series, TELON, REXX, File-AID & File Manager, Xpeditor, Endevor/Changeman, Debug Facility, CA7 Scheduler, IBM utilities (IEBGENER, IEFBR14, SYNCSORT), DFSORT, ICETOOL, SPUFI and QMF.
- Experience with implementing service-oriented architecture (SOA) and enterprise application integration (EAI) using Mulesoft.
- Experience in analysis, design, development, testing, deployment and monitoring of EAI & ESB projects.
- Experience creating flows, sub-flows, exception strategies, DataWeave transformations, and other activities.
- Experience using Mule connectors such as DB, HTTP, HTTPS, FTP, File, SFTP, JMS, and Salesforce as part of integrations.
- Experienced in technologies such as Mule ESB, ActiveMQ, XML, and Log4j, with good experience developing and deploying applications on CloudHub.
- Integrated different systems, including file uploads, databases, web services, and Spring services, using Mule ESB.
- Good exposure to MUnit test cases for validating Mule flows.
- Experience in implementing Mulesoft Batch Processing and Scatter-Gather.
- Experienced in designing and testing custom APIs using RAML.
- Knowledge of SOA design patterns for building middleware systems from the ground up using message routing, message filtering, message transformation, batch message processing, and error handling.
- Ability to write complex SQL queries with strong knowledge of DB2.
- In-depth experience in the design and development of online (CICS-DB2)/ (IMS-DB2) and batch (COBOL-DB2) database systems.
- Proficient in working with both IMS and DB2 systems.
- Experience in Retail, Manufacturing, Banking (Credit Card) domains.
- Good working knowledge of SPUFI and QMF for pre-testing SQL statements. Received appreciation from senior management for tuning SQL queries.
TECHNICAL SKILLS:
Operating Systems: OS/390, MVS/ESA and Windows
Programming Languages: COBOL, JCL/Procs, SQL, CICS, MQ, DB2 Stored Procedures, REXX, TELON, Java
Middleware: Mule ESB, ActiveMQ, Amazon SQS, WSO2 API Manager, Salesforce
Web services : WSDL, SOAP, REST, JAX-WS, JAX-RS, UDDI, AXIS, CXF, JERSEY
Web Technologies: HTML 5, JavaScript, AJAX, CSS 3, Applets/Swing, JSTL, JSON, jQuery
Databases: DB2, IMS DB/DC, VSAM
Tools & Utilities: TSO/ISPF, ENDEVOR, IBM Debug, CHANGEMAN, XPEDITOR, SORT, IDCAMS, FILE-AID, File Manager, QMF, NDM, REXX, SPUFI, FTP/SFTP, CA7 Scheduler, HP ALM/Quality Center, JIRA
Development Tools: Eclipse, IBM MQ Series, SQL Developer, TOAD, ANT, Maven, EditPlus, Anypoint Studio, Jenkins
PROFESSIONAL EXPERIENCE:
Confidential - Salisbury, NC
Sr Mulesoft Integration Developer
Responsibilities:
- Developed applications using Mule ESB 4 and Anypoint Studio 7.
- Developed Mule flows using various connectors such as HTTP, IBM MQ, X12 EDI, Object Store, and Until Successful.
- Worked on a Mule domain project to share connector configurations such as HTTPS and IBM MQ. Also used the secure properties configuration and a TLS context with a JKS certificate in the trust store for secure connections and data transfer.
- Worked with the Mulesoft Anypoint API Platform to design RAML for implementing REST APIs, and used API Gateway as a proxy service.
- Implemented JWT over HTTPS in Mule flows for secure information transfer between two RESTful services.
- Configured and developed YAML property files for different environments such as DEV, SIT, and UAT.
- Worked with DataWeave 2.0 for data transformation using functions such as map, mapObject, splitBy, replace, and groupBy.
- Worked on data transformations such as EDI to JSON and copybook to JSON, and vice versa.
- Extensively used metadata-driven development in transformations and schemas (flat file, COBOL copybook, JSON, XML) in Mule 4.
- Implemented JMS with IBM MQ to publish payload and error data to queues.
- Deployed APIs to CloudHub using the Anypoint Platform.
- Implemented exception handling globally using Try, Error Handler, On Error Continue, and On Error Propagate scopes.
- Used Maven and Jenkins for continuous integration and deployments.
- Configured VM queues to improve application performance.
- Deployed applications on CloudHub and configured and managed them using Mule MMC.
- Created MUnit test cases and mocked test data to validate positive and negative scenarios.
- Worked with Postman and SoapUI to test requests and responses for both SOAP and REST services.
- Used Git as the source code repository and version control tool.
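The DataWeave functions named above (map, groupBy, etc.) express grouping-and-mapping transformations over payloads. As an illustration only, a minimal Java-streams sketch of the same grouping idea, using hypothetical order data that is not from the project:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupByDemo {
    // Hypothetical record standing in for a parsed JSON payload item.
    record Order(String customer, double amount) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
            new Order("acme", 10.0),
            new Order("acme", 5.0),
            new Order("globex", 7.5));

        // Aggregate total amount per customer, analogous to a
        // DataWeave `groupBy` followed by a `map` over each group.
        Map<String, Double> totals = orders.stream()
            .collect(Collectors.groupingBy(Order::customer,
                     Collectors.summingDouble(Order::amount)));

        System.out.println(totals.get("acme"));   // 15.0
        System.out.println(totals.get("globex")); // 7.5
    }
}
```

DataWeave operates on the message payload declaratively; the streams version shows the same shape of computation in a general-purpose language.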
Environment: Mule ESB 4.1.3, Anypoint Studio 7.2.3, DataWeave 2.0, CloudHub, IBM MQ, X12 EDI, copybooks, JSON, XML, Anypoint Platform, SOAP, RESTful APIs, RFHUtil, ELK, Maven, Jenkins, Git, Postman, SoapUI, RAML.
Confidential - Chicago, IL
Mulesoft Integration Developer
Responsibilities:
- Developed applications using Mule ESB 4 and Anypoint Studio 7.
- Developed Mule flows using various connectors such as HTTP, IBM MQ, X12 EDI, Object Store, and Until Successful.
- Worked on a Mule domain project to share connector configurations such as HTTPS and IBM MQ. Also used the secure properties configuration and a TLS context with a JKS certificate in the trust store for secure connections and data transfer.
- Worked with the Mulesoft Anypoint API Platform to design RAML for implementing REST APIs, and used API Gateway as a proxy service.
- Implemented JWT over HTTPS in Mule flows for secure information transfer between two RESTful services.
- Configured and developed YAML property files for different environments such as DEV, SIT, and UAT.
- Worked with DataWeave 2.0 for data transformation using functions such as map, mapObject, splitBy, replace, and groupBy.
- Worked on data transformations such as EDI to JSON and copybook to JSON, and vice versa.
- Transformed EDI (Electronic Data Interchange) data to send health-related information to RESTful and SOAP services.
- Implemented 5010X217 HIPAA validation using the X12 EDI connector for 278 EDI requests and responses, and generated AAA responses on errors.
- Extensively used metadata-driven development in transformations and schemas (flat file, COBOL copybook, JSON, XML) in Mule 4.
- Implemented JMS with IBM MQ to publish payload and error data to queues.
- Deployed APIs to CloudHub using the Anypoint Platform.
- Implemented exception handling globally using Try, Error Handler, On Error Continue, and On Error Propagate scopes.
- Used Maven and Jenkins for continuous integration and deployments.
- Worked with Postman and SoapUI to test requests and responses for both SOAP and REST services.
- Used Git as the source code repository and version control tool.
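One bullet above mentions JWTs over HTTPS between RESTful services. As a hedged illustration only (not the project's actual code), a minimal HS256 signature check in plain Java, using a hypothetical shared secret:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class JwtVerifyDemo {
    // Verify the HMAC-SHA256 signature of a compact JWT (header.payload.sig).
    // Real services would also validate claims (exp, iss, aud) and use a
    // vetted JWT library; this only checks the signature.
    static boolean verify(String jwt, byte[] secret) throws Exception {
        String[] parts = jwt.split("\\.");
        if (parts.length != 3) return false;
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret, "HmacSHA256"));
        byte[] sig = mac.doFinal((parts[0] + "." + parts[1])
                .getBytes(StandardCharsets.US_ASCII));
        String expected = Base64.getUrlEncoder().withoutPadding()
                .encodeToString(sig);
        return expected.equals(parts[2]);
    }

    public static void main(String[] args) throws Exception {
        byte[] secret = "demo-secret".getBytes(StandardCharsets.UTF_8); // hypothetical
        var enc = Base64.getUrlEncoder().withoutPadding();
        String header = enc.encodeToString(
                "{\"alg\":\"HS256\",\"typ\":\"JWT\"}".getBytes(StandardCharsets.UTF_8));
        String payload = enc.encodeToString(
                "{\"sub\":\"test\"}".getBytes(StandardCharsets.UTF_8));
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret, "HmacSHA256"));
        String sig = enc.encodeToString(
                mac.doFinal((header + "." + payload).getBytes(StandardCharsets.US_ASCII)));
        String token = header + "." + payload + "." + sig;

        System.out.println(verify(token, secret));        // true
        System.out.println(verify(token + "x", secret));  // false (tampered)
    }
}
```

In a Mule flow this kind of check would typically be delegated to a JWT validation policy on the API gateway rather than hand-rolled.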
Environment: Mule ESB 4.1.3, Anypoint Studio 7.2.3, DataWeave 2.0, CloudHub, IBM MQ, X12 EDI, copybooks, JSON, XML, Anypoint Platform, SOAP, RESTful APIs, RFHUtil, ELK, Maven, Jenkins, Git, Postman, SoapUI, RAML.
Confidential - Reston, VA
Mulesoft Developer
Responsibilities:
- Assisted the ongoing interface delivery process for new and existing projects and developed integration interfaces.
- Performed application integration using an Enterprise Service Bus (ESB) and other middleware technologies.
- Deployed middleware applications following the change management process.
- Actively involved in requirements understanding and analysis.
- Designed and developed ESB/SOA/BPM middleware layers using Mulesoft ESB.
- Interacted with other technical leads such as architects, testers, analysts, and release managers as a lead member of the development team to accomplish business goals.
- Implemented SOAP requests to post data to SAP FICO.
- Configured the Database connector and used the Select operation to connect to SQL Server and extract data.
- Configured Choice routers, global exception handling, splitters and aggregators, VM queues, and ActiveMQ.
- Designed and developed RAML files in API Designer and applied API proxies and policies.
- Updated the status of a collection of DB records by enabling bulk mode.
- Configured Mule batch steps within the batch job's Load and Dispatch, Process, and On Complete phases to notify customers of pending transactions by email. Filtered the payload using an accept expression before processing the request.
- Developed a status tracker service to receive the SAP document number and message from SAP FICO for submitted goods receipts.
- Used flow variables, flow references, properties, and CXF web services in Mule flows.
- Implemented a RESTful web service that invoked the Workday system to receive demographic data as either a full file or a change file.
- Implemented JSON Schema validation to validate JSON payloads.
- Implemented Scatter-Gather, For Each, the Async scope, the queued-asynchronous processing strategy, private flows, and sub-flows.
- Worked with Mule Requester to request data from a folder in the middle of a flow.
- Developed a Groovy script to create LDAP requests.
- Worked on Add Entry, Modify Entry, Lookup, and Delete Entry operations in LDAP.
- Synchronized demographic data into Active Directory on an hourly basis.
- Configured ActiveMQ to receive updates from SAP FICO.
- Implemented Splitter-Aggregator, Web Service Consumer, CXF web services, message filters, method entry point resolvers, and MEL scripts.
- Implemented groupBy, filter, and map functionality in DataWeave.
- Partitioned exception handling into three types (business, validation, and system exceptions) using Choice Exception Handling with multiple catch exception strategies configured as global elements.
- Implemented Mule batch processing to handle large volumes of Swiss Airlines flat-file data.
- Configured VM queues to improve application performance.
- Deployed applications on CloudHub and configured and managed them using Mule MMC.
- Created MUnit test cases and mocked test data to validate positive and negative scenarios.
- Tested SOAP and RESTful web services using SoapUI and Postman.
- Configured Amazon SQS to send data to PAC+ systems.
- Deployed and scheduled Mule projects.
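Scatter-Gather, mentioned above, fans one message out to multiple routes in parallel and aggregates the results. A minimal plain-Java sketch of the same pattern using CompletableFuture, with hypothetical route names standing in for real service calls:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

public class ScatterGatherDemo {
    public static void main(String[] args) {
        // Scatter: each future is a stand-in for one parallel route;
        // real routes would call downstream services.
        List<CompletableFuture<String>> routes = List.of(
            CompletableFuture.supplyAsync(() -> "inventory:ok"),
            CompletableFuture.supplyAsync(() -> "pricing:ok"),
            CompletableFuture.supplyAsync(() -> "shipping:ok"));

        // Gather: wait for every route, then collect all results,
        // analogous to Scatter-Gather's aggregated payload.
        CompletableFuture.allOf(routes.toArray(CompletableFuture[]::new)).join();
        List<String> results = routes.stream()
            .map(CompletableFuture::join)
            .collect(Collectors.toList());

        System.out.println(results.size()); // 3
    }
}
```

In Mule the router also propagates a failure from any route into its error handling; a fuller sketch would add exception handling around each join.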
Environment: Mule Batch, JMS, CXF web services, SOAP and REST web services, Java, MySQL, Mule ESB, Anypoint Studio, Mule Server 3.8.0, MMC, Jenkins, JIRA, SVN, Spring, MUnit, ActiveMQ, Workday, LDAP, Amazon SQS.
Confidential - Dearborn, MI
Mainframe Developer (Software Engineer)
Responsibilities:
- Performed analysis, coding, and debugging; prepared technical specifications, test conditions, test cases, and test data for unit, regression, integration, user acceptance, and enterprise testing, as well as deployment checkouts.
- Developed mainframe code using COBOL, DB2, JCL, VSAM, IMS DB/DC, and SQL, along with DB2 utilities (LOAD, UNLOAD, EXPLAIN) and JCL utilities (SORT).
- Coded new DB2 stored procedures for Vehicle suspension notification process.
- Design, Coding and testing of new DB2 stored procedures for VOPS reporting.
- Developed highly complex SQL queries to fetch data from the database for test validation.
- Used MQ Series message queues to send and retrieve messages from local queues using the MQPUT and MQGET functions.
- Used SPUFI to extract data from DB2 tables and validated the data against each client's requirements. Created batch JCL to NDM/FTP the converted files and results to various clients.
- Provided estimates for development projects, set detailed schedules and expectations, delegated tasks among team members, monitored overall progress, and provided status reports to upper management.
- Interacted with business and testing teams to discuss business requirements, functional specifications, release meetings, deployment meetings, and walkthroughs of test artifacts.
- Triaged issues and provided effective resolutions.
- Understanding the business & technology of the client system for providing them with quality deliverables.
- Implemented standard methodologies such as the peer review process, test readiness process, and test case standards.
- Conducted peer reviews and group reviews.
- Led client walkthroughs for all artifacts, including technical specifications and test conditions.
- Creating a new test environment including all online and backend processes along with connectivity to external systems for data exchange.
- Fixing the batch job abends reported by Production support.
- Preparing new jobs and Data & Prose forms to schedule in production.
- Extracted ad hoc reports using COBOL programs and SQL queries per business requirements.
- Coordinating the activities with other team members and reporting to the client Manager.
- Prepared the metrics for the application to evaluate the quality standards.
- Prepared clear documentation on all processes, such as program specifications, bug-fix logs, and change requests, for future reference.
- Communicated with vendors, business users, and operations via phone, email, and chat.
Environment: COBOL, JCL/Procs, DB2, IMS DB/DC, DB2 Stored Procedures, VSAM, TELON, IBM Debug, FILE-AID, File Manager, IBM utilities, MQ Series, NDM, CA7 Scheduler, REXX, TSO/ISPF and SPUFI.
Confidential
Mainframe Developer (Software Engineer)
Responsibilities:
- Involved in gathering business requirements, analysis, design and development of the application.
- Involved in preparing Program Specification docs and unit test cases.
- Created, viewed, copied, and compared datasets using ISPF panels.
- Monitored all the production jobs running on CA7 Scheduler.
- Resolved user and system abends that occurred in the SPOOL area for submitted jobs.
- Wrote new programs and enhanced existing programs using COBOL, DB2 stored procedures, CICS, and JCL.
- Involved in performance tuning for DB2 queries and stored procedures.
- Used SPUFI and QMF for pre-testing the SQL statements embedded in COBOL programs.
- Created the LLD and technical specs for the new programs.
- Transferred the output files of daily jobs into Excel documents using FTP, then loaded these documents into Oracle tables using SQL Loader.
- Deployed built components into the integration and production systems using CHANGEMAN.
- Used XPEDITOR tool for testing and debugging the source code.
- Prepared the metrics for the application to evaluate the quality standards.
- Primary contact for production support in the application.
Environment: COBOL, JCL/Procs, DB2, CICS, MQ, DB2 Stored Procedures, VSAM, XPEDITOR, CHANGEMAN, FILE-AID, FILE MANAGER, IBM utilities, NDM, CA7 Scheduler, REXX, TSO/ISPF, QMF and SPUFI.
Confidential
Mainframe Developer
Responsibilities:
- Analyzed business requirements and transformed them into functional design specifications.
- Designed, coded, unit tested, and debugged programs, producing high-quality deliverables with minimal defects.
- Developed both online and batch programs with database connectivity, including CICS-DB2 and COBOL-DB2.
- Queried data from the database (SQL) using QMF and SPUFI to create unit test data, and created unit test scenarios covering all aspects of the business requirements.
- Prepared Unit Test scenarios, Unit test case/scripts for testing.
- Developed effective technical and user documentation.
- Developed tools using REXX to minimize routine/redundant activities, e.g., a job scheduler tool, dataset compare, and spool check.
Environment: COBOL, JCL/Procs, DB2, CICS, REXX, TSO/ISPF, CHANGEMAN, FILE-AID, SPUFI and QMF.