Wisej.AI Enhanced: Accessing MCP Servers

While an LLM can usually work out what you want, it cannot always carry out the steps to get the task done. You can ask ChatGPT, Gemini, or any other AI system to send an email. It will understand what an email is and will even draft one for you, but without something like the tools provided through the Model Context Protocol, it cannot actually send it.

What Is the Model Context Protocol (MCP)?

The Model Context Protocol is a universal interface that lets LLMs interact with external tools, systems, and data sources. There is a wealth of MCP servers out there providing tools for everything from fetching information on the web to handling payments. With the McpToolsClient in Wisej.AI, you can make those tools available to the LLM with a single line of code: find the MCP server's endpoint URL, hand it to the McpToolsClient, and the LLM can call the server's tools whenever it needs them.
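As a rough illustration, the snippet below sketches what wiring up an MCP server could look like in a Wisej.AI page. The smartHub1 component, the McpToolsClient constructor argument, the RegisterTools call, and the endpoint URL are all assumptions made for this example, not the documented API; check the Wisej.AI documentation for the exact names and signatures.

    using Wisej.AI;        // assumed namespace for the SmartHub component
    using Wisej.AI.Tools;  // assumed namespace for McpToolsClient

    public partial class Page1 : Wisej.Web.Page
    {
        public Page1()
        {
            InitializeComponent();

            // Illustrative only: point an McpToolsClient at an MCP server's
            // endpoint URL and register it with the SmartHub so the LLM can
            // call the tools that the server exposes. Names are assumptions.
            var mcpTools = new McpToolsClient("https://mcp.example.com/sse");
            this.smartHub1.RegisterTools(mcpTools);
        }
    }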

Beyond simply adding tools to Wisej.AI, using an MCP server brings other benefits. The tools run on the MCP server itself, so your Wisej.AI application stays lightweight and gains the new capabilities with little performance cost. And the most obvious benefit is the sheer number of plug-and-play options available.

Wisej.AI McpToolsClient

Creating a single tool yourself is easy, but when the work has already been done for you, adding one line for the McpToolsClient is even easier, which makes for extremely rapid development. You no longer have to write hundreds of lines to talk to an API when one line does the same job. Many websites maintain directories of publicly hosted MCP servers offering nearly any tool you could imagine.
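To make that comparison concrete, here is a hedged sketch that puts a small hand-written tool class next to the single McpToolsClient line that replaces an entire integration. The tool-class pattern, the Description attribute usage, and the RegisterTools overloads shown are assumptions for illustration only; the Wisej.AI documentation describes the actual mechanism.

    using Wisej.AI.Tools;   // assumed namespace for McpToolsClient

    // A hand-written tool class: each public method is a tool the LLM can call.
    // Even a simple integration like this means writing the mail/API calls,
    // authentication, and error handling yourself.
    public class EmailTools
    {
        [System.ComponentModel.Description("Sends an email to the given address.")]
        public void SendEmail(string to, string subject, string body)
        {
            // ... call your SMTP server or mail API here ...
        }
    }

    public partial class Page1 : Wisej.Web.Page
    {
        public Page1()
        {
            InitializeComponent();

            // Hand-written route: register your own tool class (names assumed).
            this.smartHub1.RegisterTools(new EmailTools());

            // MCP route: one line, and every tool hosted by that server
            // becomes available to the LLM.
            this.smartHub1.RegisterTools(new McpToolsClient("https://mcp.example.com/sse"));
        }
    }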

Want to learn more? See the Wisej.AI documentation for full details on using the McpToolsClient.