
Mulesoft Architect Resume


ND

SUMMARY

  • More than 19 years of IT experience across business domains including Media, Home Security, IoT, Healthcare, and Cargo, covering requirements gathering, analysis, design, development, documentation, integration, deployment, production/customer support, and maintenance.
  • Utilized MuleSoft features such as DataWeave, API Designer, and various connectors to build a robust, loosely coupled integration layer.
  • Developed Mule flows to integrate data from ActiveMQ topics and queues into databases.
  • Developed integration components for third-party applications using Mule ESB; experienced in integrating cloud applications such as Salesforce (SFDC) using Mule ESB.
  • Expertise in using the DataWeave language to transform data to and from any format (XML, CSV, JSON, POJOs, Maps, etc.); see the transformation sketch after this list.
  • Expertise in the following Mule technologies: MuleSoft Anypoint Platform, Anypoint Studio, Mule ESB, Mule connectors, Mule Expression Language, Mule routers, the Anypoint Salesforce connector, Mule scopes, Mule components, Mule flow control, Mule transformers, Mule filters, Mule Runtime Engine, CloudHub, and MUnit.
  • Specialized in deployments using CloudHub.
  • Experienced with the Anypoint Connector for Workday.
  • Experience with RAML, the MuleSoft API platform, and RESTful API programming and design.
  • Proficient in designing and executing integration projects using API-led architecture and building connectors using Mule ESB and the Anypoint Platform.
  • Extensively worked on architecting APIs using RAML and deploying them to the iPaaS API runtime.
  • Good hands-on experience designing API-led connectivity and the three-layered (System/Process/Experience) architecture.
  • Strong knowledge of Service-Oriented Architecture (SOA) concepts and SOAP- and REST-based web services.
  • Worked on MuleSoft Anypoint Platform versions 3.7.x, 3.8.x, and 4.x, Mule MMC, iPaaS (CloudHub), and DevKit.

PROFESSIONAL EXPERIENCE

Confidential, ND

MuleSoft Architect

Responsibilities:

  • Set up Mule ESB for the development environment and implemented transformations on the Mule payload.
  • Deployed Mule ESB applications using CloudHub; strong experience integrating various third-party solutions using Mule ESB.
  • Worked with SAP connectors, integrating different applications and technologies with SAP systems via open standards.
  • Worked on integrations in Mule 3 and Mule 4, and deployed and used Runtime Fabric.
  • Worked on data integration using SAP connectors.
  • Implemented data transformations using DataWeave and custom Java classes.
  • Implemented flow control using Choice and Scatter-Gather routers as each use case required; see the sketch after this list.
  • Developed applications using the Agile methodology.
  • Practiced test-driven development using frameworks such as MUnit.
  • Used SVN for version control.
  • Configured the API portal and set security policies for each API.
  • Created and followed MuleSoft API error-handling and logging standards while building APIs.
  • Built, deployed, and tested applications and flows using Mule ESB.
  • Designed and developed enterprise-service API specifications using RAML and REST schemas.
  • Created Mule applications using the SFDC connector, message transformers, Choice routers, exception strategies, and batch processing.
  • Created HTTP inbound and outbound flows and custom Java and XSLT transformers, and secured Mule endpoints.
  • Developed flows/orchestrations to integrate components written on top of different internal platforms using Mule ESB and ActiveMQ.
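Mule's Scatter-Gather router sends a copy of the payload down several routes concurrently and aggregates the results. A minimal Python analogue of that pattern (the two enrichment routes and field names are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical "routes": each receives the same payload, like Scatter-Gather legs.
def enrich_from_crm(payload):
    return {**payload, "crm": "ok"}

def enrich_from_billing(payload):
    return {**payload, "billing": "ok"}

def scatter_gather(payload, routes):
    """Fan the payload out to every route in parallel and gather the
    results, analogous to Mule's Scatter-Gather router."""
    with ThreadPoolExecutor(max_workers=len(routes)) as pool:
        futures = [pool.submit(route, payload) for route in routes]
        # As in Mule, a failure in any leg fails the whole scope.
        return [f.result() for f in futures]

results = scatter_gather({"id": 42}, [enrich_from_crm, enrich_from_billing])
print(results)
```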

Environment: Mule ESB 3.7.3, Anypoint Studio 5.4.0/5.4.3, Java 1.8, Oracle, SOAP, REST, Salesforce, GitHub, ActiveMQ 5.3, RAML, JAXB, MUnit, MS Access, CloudHub, Log4j 1.2.14, Git, JIRA, API Gateway 2.1, Bitbucket

Confidential, ND

MuleSoft Developer

Responsibilities:

  • Understood client requirements and converted them into technical specifications.
  • Prepared use-case and high-level design documentation.
  • Actively participated in analyzing and designing modules of the application.
  • Designed APIs using RAML in API Manager; involved in API design sessions to decide the resources within each API, message schemas, message formats, and authentication.
  • Created design and mapping documents based on requirement-analysis discussions. Designed and developed RAML-based APIs using API Manager; the APIs were created to insert, update, delete, and upsert freight-related data from SAP to Salesforce.
  • Worked with the Salesforce team to design the APIs needed for front-end development.
  • Worked with the Gateway and Enterprise API teams to resolve integration issues.
  • Performed code walkthroughs using breakpoints, Mule debugging, and error fixing.
  • Used Postman and SoapUI to test the services. Experienced with API, web service, and messaging security standards, protocols, and technologies, including TLS/SSL, OAuth 2.0, WS-Security, SFTP, Basic Authentication, and client ID/secret authentication in Mule; see the OAuth sketch after this list.
  • Created JSON Schemas and XSDs based on the generated mapping documents. Created MUnit tests for the API, Salesforce, SAP, and exception-handling functionality, mocking responses to test each flow.
  • Integrated with DevOps using Jenkins for MuleSoft deployments across environments.
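As a sketch of the client ID/secret flow mentioned above, here is a minimal Python example of the OAuth 2.0 client-credentials grant followed by a bearer-token API call; the endpoint URLs are hypothetical placeholders:

```python
import requests

TOKEN_URL = "https://auth.example.com/oauth/token"      # hypothetical endpoint
API_URL = "https://api.example.com/freight/v1/orders"   # hypothetical endpoint

def fetch_token(client_id: str, client_secret: str) -> str:
    """Obtain an access token via the OAuth 2.0 client-credentials grant."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),  # client ID/secret sent over TLS
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def call_api(token: str) -> dict:
    """Call the protected API with the bearer token."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```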

Environment: MuleSoft 3.8, Anypoint Studio, Anypoint Platform, CloudHub, API Manager, Runtime Manager, JSON, OAuth 2.0, SoapUI, Postman, MUnit, Jenkins, JIRA, and Groovy scripting.

Confidential, Fargo, ND

Data Analyst

Responsibilities:

  • Hands-on experience using various stages such as Join, Merge, Lookup, Remove Duplicates, Sort, Filter, Dataset, Modify, and Aggregator.
  • Excellent knowledge of extracting, cleansing, and modifying data from/to various data sources such as flat files, sequential files, comma-delimited (.csv) files, and XML, and databases such as Oracle, ODBC, Teradata, SQL Server, and MySQL.
  • Involved in analyzing, designing, and documenting business reports such as summary reports and exception reports.
  • Worked in an Agile environment, with the ability to accommodate and test newly proposed changes at any point during the release. Filtered inaccurate data from the legacy system using complex T-SQL statements, and implemented various constraints and triggers for data consistency.
  • Responsible for ETL through SSIS, loading data into the database from different input sources.
  • Accessed data, created tables from files, filtered tables, and created analytics in SAS.
  • Generated graphical reports using Python packages such as NumPy and matplotlib. Wrote and optimized diverse SQL queries; working knowledge of RDBMSs such as SQL Server, MySQL, and Oracle.
  • Worked with the business team to test the reports developed in OLAP. Interacted with business analysts to understand data requirements and ensure high-quality data is provided to the customers.
  • Designed, developed, and maintained complex SQL code for extracting and feeding data into repositories from various sources, including flat files, JSON files, Oracle tables, and SQL Server tables.
  • Designed and developed end-to-end agent exception reporting, analysis, and testing. Read Parquet and CSV files from S3, applied the business logic, and uploaded the output back to S3 using pandas; see the sketch after this list.
  • Extensively used the open-source tools Anaconda (Python) and Jupyter notebooks for statistical analysis.
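A minimal pandas sketch of the S3 read/transform/write step described above; the bucket, keys, column names, and business rule are hypothetical, and reading s3:// paths assumes the s3fs package is installed:

```python
import pandas as pd

# Hypothetical bucket and keys; s3:// paths require the s3fs package.
IN_PARQUET = "s3://my-bucket/exceptions/agents.parquet"
IN_CSV = "s3://my-bucket/reference/regions.csv"
OUT_PARQUET = "s3://my-bucket/reports/agent_exceptions.parquet"

agents = pd.read_parquet(IN_PARQUET)
regions = pd.read_csv(IN_CSV)

# Illustrative business logic: join on region and keep open exceptions only.
report = (
    agents.merge(regions, on="region_id", how="left")
          .query("status == 'OPEN'")
)

report.to_parquet(OUT_PARQUET, index=False)
```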

Confidential, Fargo, ND

Data Analyst

Responsibilities:

  • Analyzed functional and non-functional data elements for data profiling and mapping from the source to the target data environment. Developed working documents to support findings and assign specific tasks.
  • Worked on claims data and extracted data from various sources such as flat files, Oracle, and mainframes.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Wrote several UNIX Korn shell scripts for file transfers, error logging, data archiving, log-file checks, and cleanup processes.
  • Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using Access. Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software and tools such as Perl, Toad, MS Access, Excel, and SQL.
  • Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring any change control in requirements leads to test-case updates.
  • Performed extensive data validation by writing several complex SQL queries, carried out back-end testing, and worked through data quality issues; see the validation sketch after this list.
  • Worked with end users to gain an understanding of the information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specification documents, along with mapping documents, to assist the developers in their coding.
  • Identified and recorded defects with the information required for the development team to reproduce each issue. Designed and developed database models for the operational data store, data warehouse, and federated databases to support the client's enterprise information management strategy.
  • Worked flexible late hours to coordinate with the offshore team.
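A minimal sketch of the SQL-based source-versus-target validation mentioned above, using an in-memory SQLite database as a stand-in for the real Oracle/Teradata systems; the table and column names are hypothetical:

```python
import sqlite3

# Stand-in connections; the real work compared Oracle/Teradata systems,
# so the claims tables and columns here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_claims (claim_id INTEGER, amount REAL);
    CREATE TABLE target_claims (claim_id INTEGER, amount REAL);
    INSERT INTO source_claims VALUES (1, 100.0), (2, 250.5);
    INSERT INTO target_claims VALUES (1, 100.0), (2, 250.5);
""")

def validate(conn, source, target):
    """Compare row counts and amount totals between source and target."""
    for sql_template in (
        "SELECT COUNT(*) FROM {t}",
        "SELECT ROUND(SUM(amount), 2) FROM {t}",
    ):
        src = conn.execute(sql_template.format(t=source)).fetchone()[0]
        tgt = conn.execute(sql_template.format(t=target)).fetchone()[0]
        assert src == tgt, f"mismatch: {src} != {tgt}"

validate(conn, "source_claims", "target_claims")
print("source and target agree")
```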

Environment: MS SQL Server 2008 (client and server), MS Office, legacy mainframes, Titanium, Rational ClearQuest, ClearCase.

Confidential, ND

Data Analyst

Responsibilities:

  • Created and analyzed business requirements to compose functional and implementable technical data solutions.
  • Identified integration impact, data flows and data stewardship.
  • Created new data constraints and/or leveraged existing constraints for reuse. Created the data dictionary, data mappings for ETL and application support, DFDs, ERDs, mapping documents, metadata, DDL, and DML as required.
  • Participated in JAD sessions as the primary modeler, expanding existing databases and developing new ones.
  • Evaluated and enhanced current data models to reflect business requirements. Generated, wrote, and ran SQL scripts to implement database changes, including table updates, index additions or updates, and creation of views and stored procedures; see the sketch after this list.
  • Consolidated and updated various data models through reverse and forward engineering.
  • Compared different WFHM DB environments; determined, resolved, and documented discrepancies. Analyzed DB discrepancies and synchronized the Staging, Development, UAT, and Production DB environments with the data models.
  • Reviewed and revised data models for soundness of data structures and adherence to client standards. Restructured logical and physical data models to respond to changing business needs and to assure data integrity, using PowerDesigner.
  • Created naming-convention files and coordinated with DBAs to apply the data model changes.
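A minimal sketch of the kind of DDL change script described above, run against an in-memory SQLite scratch database standing in for the real Oracle environments; the table, index, and view names are hypothetical:

```python
import sqlite3

# Scratch database standing in for the real environments; names are
# hypothetical illustrations of the DDL changes described above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (id INTEGER PRIMARY KEY, region TEXT, active INTEGER)"
)

# Add an index to support a common lookup path.
conn.execute("CREATE INDEX ix_customer_region ON customer (region)")

# Create a view so reports only see active customers.
conn.execute("""
    CREATE VIEW v_active_customer AS
    SELECT id, region FROM customer WHERE active = 1
""")
conn.commit()

# Confirm the schema changes were applied.
print(conn.execute(
    "SELECT name FROM sqlite_master WHERE type IN ('index', 'view')"
).fetchall())
```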

Environment: PowerDesigner 15.0, Oracle 10g, Oracle SQL Developer, Rational ClearQuest, MS Excel and Access.
