MuleSoft Developer Resume
Philadelphia, PA
SUMMARY
- 9+ years of experience in Software Development Life Cycle (SDLC) including requirement analysis, design specification, code development, code integration, system testing and deployment.
- Strong experience with Mule Runtime (3.9 & 4.x), Anypoint Platform, Anypoint Studio, and CloudHub.
- Expertise in integrating systems that include RESTful APIs, databases, and third-party applications using the MuleSoft Enterprise Service Bus.
- Experience in designing and developing RAML-based APIs with policies such as OAuth 2.0, rate limiting, and basic authentication.
- Proficient in Anypoint API Governance and API management through SLAs and Proxies.
- Experience in creating flows, sub-flows, error handling, DataWeave transformations, message enrichment, debugging, and other activities in Mule 3 and Mule 4.
- Worked on developing batch integrations to transfer data in bulk between enterprise applications using MuleSoft.
- Strong understanding of the MuleSoft API Framework (system, process, experience) and API life cycle.
- Well versed in Mule scopes, error handling, message filters, validation, transformation, messaging queues, and flow control through HTTP, VM, JMS, Database, Amazon S3, SQS, SNS, File, and SFTP connectors.
- Good exposure to Continuous Integration and Deployment tools such as Jenkins, Bitbucket, and Maven to deploy projects automatically to CloudHub.
- Dedicated professional with a strong work ethic; a quick learner, adaptable, and an excellent team player.
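As an illustration of the RAML-based API design work described above, a minimal sketch of an API specification (the resource, type, and base URI are hypothetical, not taken from an actual project):

```raml
#%RAML 1.0
title: Orders API
version: v1
baseUri: https://api.example.com/orders/{version}
mediaType: application/json

types:
  Order:
    properties:
      id: string
      status:
        enum: [NEW, SHIPPED, DELIVERED]

/orders:
  get:
    description: List orders, paged.
    queryParameters:
      page?: integer
    responses:
      200:
        body: Order[]
  post:
    body: Order
    responses:
      201:
        body: Order
```

Policies such as OAuth 2.0 and rate limiting would then be applied to this API through API Manager rather than in the RAML itself.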
TECHNICAL SKILLS
MuleSoft: Mule Runtime (3.9 & 4.x), Anypoint Studio, RAML 1.0, API Manager, Runtime Manager, Exchange
CI/CD & Testing Tools: Jenkins, Maven, GitHub, Postman
Statistics & Analytics: Tableau, SAP Business Analyzer, SAP BPC, IBM Analytics, SPSS, SAS
Utilities & Tools: SAGE, Oracle Hyperion, Excel advanced functions (macros, index, conditional list, arrays, pivots, lookups), Word, Cognos, SQL, Infor M3
PROFESSIONAL EXPERIENCE
MuleSoft Developer
Confidential - Philadelphia, PA
Responsibilities:
- Understood the client's functional and business requirements and created integration plans and design documents.
- Developed Mule ESB applications with synchronous and asynchronous Mule flows, and designed APIs using a multi-layered architecture for API-led connectivity.
- Deployed Mule ESB applications on CloudHub and integrated with various third-party solutions.
- Participated in API design sessions, using RAML to decide the resources within each API, message schemas, message formats, and authentication.
- Extensively used DataWeave transformations in MuleSoft to propagate data between systems with different schemas.
- Configured API Gateway for APIs deployed on CloudHub and enforced policies and SLA tiers.
- Used API Manager and API Analytics, and monitored applications by creating alerts and notifications in the API dashboard.
- Involved in unit testing and wrote MUnit test cases for the Mule flows.
- Used CI/CD tools like Jenkins, Maven and GitHub for the development and deployment.
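A minimal sketch of the kind of Mule 4 DataWeave transformation used above, shown inside a Transform Message component (the field names and mapping are illustrative, not from an actual project):

```xml
<!-- Hypothetical Mule 4 transform: maps a source record to a target schema. -->
<ee:transform doc:name="Map customer record">
  <ee:message>
    <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
{
  customerId: payload.cust_id,
  fullName:   (payload.first_name default "") ++ " " ++ (payload.last_name default ""),
  createdOn:  payload.created_date as Date {format: "yyyy-MM-dd"}
}]]></ee:set-payload>
  </ee:message>
</ee:transform>
```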
Environment: Mule Runtime 4.2, Anypoint Studio 7.4, MySQL, RAML 1.0, SOAP, REST, JSON, XML, Jenkins, Maven, JIRA, Bitbucket
MuleSoft Developer
Confidential - Philadelphia, PA
Responsibilities:
- Involved in requirements gathering session to understand the project requirements.
- Prepared the Integration architecture, detailed design, and deployment documents.
- Designed and implemented RESTful web services using various data formats (JSON, XML) to provide an interface to third-party applications.
- Created multiple system/process APIs to connect various third-party systems and databases including MySQL.
- Developed the flows/orchestrations for integrating the components written on top of different internal platforms using Mule ESB.
- Extensively used Mule transports and connectors such as File, HTTP, SMTP, FTP/SFTP, the JDBC connector, and VM queues.
- Implemented Mule flows for each entity with retry mechanisms and private, secured flows.
- Deployed and managed applications on CloudHub using Anypoint Runtime Manager to ensure efficient data processing.
- Used Maven and Jenkins to achieve continuous deployment and integration.
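A sketch of how such Maven/Jenkins-driven CloudHub deployments are typically configured with the mule-maven-plugin; the application name, environment, and property names are placeholders, with credentials assumed to be injected by the CI server:

```xml
<!-- Hypothetical mule-maven-plugin section of pom.xml for CloudHub deployment. -->
<plugin>
  <groupId>org.mule.tools.maven</groupId>
  <artifactId>mule-maven-plugin</artifactId>
  <version>3.5.4</version>
  <extensions>true</extensions>
  <configuration>
    <cloudHubDeployment>
      <uri>https://anypoint.mulesoft.com</uri>
      <muleVersion>4.1.5</muleVersion>
      <username>${anypoint.username}</username>
      <password>${anypoint.password}</password>
      <applicationName>orders-sys-api</applicationName>
      <environment>Sandbox</environment>
      <workers>1</workers>
      <workerType>MICRO</workerType>
    </cloudHubDeployment>
  </configuration>
</plugin>
```

The Jenkins pipeline would then invoke `mvn clean deploy -DmuleDeploy` to build and push the artifact to CloudHub.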
Environment: Mule Runtime 4.1, Anypoint Studio 7.2, Anypoint Platform, RAML 1.0, JSON, Jenkins, Maven, Postman
MuleSoft Developer
Confidential - Atlanta, GA
Responsibilities:
- Involved in the analysis, definition, prototyping, implementation, and deployment phases of the full software development life cycle (SDLC) of the project.
- Designed applications using Mule ESB as middleware between third-party systems and the customer's systems.
- Prepared design document specifications and performed troubleshooting and testing.
- Developed the integration flows using Mule ESB 3.7.3 framework.
- Performed integrations using connectors such as Salesforce, Database, SAP, HTTP, SFTP, FTP, and File.
- Built integrations to transform large payloads from Salesforce to the database using scheduled batch processing jobs.
- Involved in designing and documenting REST APIs using RAML.
- Created HTTP inbound and outbound flows and orchestration using XPath in Mule ESB.
- Implemented asynchronous messaging using ActiveMQ and AWS services.
- Worked on various Mule connectors/adapters, API development, APM, and developing services on CloudHub.
- Used Mule ESB to route, filter, and aggregate messages between applications based on content and rules.
- Wrote batch jobs to run at specific scheduled times and transformations using DataWeave.
- Formulated build and deployment using Maven to deploy artifacts to CloudHub.
- Used GitHub for version control.
- Configured Mule processes to fetch data from topics, make web service calls to the middle-tier Mule ESB for processing, and publish the data to CloudHub.
- Implemented data transformations using XPath, XSLT, DataWeave, and custom Java classes.
- Developed Mule flows to integrate data from various sources, including ActiveMQ topics and queues, into the database; some transformations were also done at the integration layer.
- Extensively used Mule components including DataWeave, JAXB, File Transport, SMTP Transport, FTP/SFTP Transport, and the JDBC Connector.
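The scheduled Salesforce-to-database batch processing described above can be sketched in Mule 3 style as follows; the connector configs, query, and table are placeholders, not taken from an actual project:

```xml
<!-- Hypothetical Mule 3 batch job: a cron-scheduled poll queries Salesforce
     and bulk-inserts records into a database in commit blocks of 200. -->
<batch:job name="sfdcToDbBatch">
  <batch:input>
    <poll doc:name="Poll">
      <schedulers:cron-scheduler expression="0 0 2 * * ?"/>
      <sfdc:query config-ref="Salesforce" query="SELECT Id, Name FROM Account"/>
    </poll>
  </batch:input>
  <batch:process-records>
    <batch:step name="loadStep">
      <batch:commit size="200">
        <db:insert config-ref="MySQL_Config">
          <db:parameterized-query><![CDATA[
            INSERT INTO accounts (sfdc_id, name) VALUES (#[payload.Id], #[payload.Name])
          ]]></db:parameterized-query>
        </db:insert>
      </batch:commit>
    </batch:step>
  </batch:process-records>
</batch:job>
```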
Environment: Mule ESB, Anypoint Studio, Java, Servlets, JDBC, RAML, RabbitMQ, APM, Nexus, Apache Maven, CloudHub, XML, WebLogic Application Server 10.3, SQL, Web Services (WSDL, SOAP), Jenkins
Reporting Analyst
Confidential
Responsibilities:
- Analyzed data and delivered executive-level reports, dashboards, and presentations to provide business insights and actionable recommendations to the leadership of different DPDHL business units.
- Managed and delivered all reporting requirements and data needs for the Americas region using Cognos.
- Advised on and supported process improvement and loss mitigation initiatives based on data and trend analysis.
- Designed and built KPI and productivity tools used to support internal teams' activities and processes.
- Ensured data quality within the database and implemented appropriate safeguards and audits to achieve the highest levels of data integrity, including data reconciliation and validation of records.
- Gathered user requirements from business partners and delivered high-quality, analytics-driven BI reports and dashboards that provide insights.
- Supported the cross-segment pricing analytics process through relevant KPI development, reporting, and dashboard creation and maintenance.
- Provided clear guidance, support, leadership, and change management in the execution of the pricing process.