Tome + Smithery = The Easiest Way to Play with MCP

If you’re curious about the Model Context Protocol (MCP), there’s now a magical way to get started. Thanks to our new Smithery integration, you can connect your Ollama-powered local LLMs to thousands of MCP servers with just a click!

What’s MCP?

The Model Context Protocol (MCP) is a standard that lets large language models interact with external tools like search engines, browsers, file systems, or APIs. It’s how your LLM can do magical things like open a browser and click buttons, or create and edit files, without needing a custom integration for each tool.
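
To make that concrete, here’s a minimal sketch of an MCP server written with the official Python SDK (the `mcp` package). The server name and the `add` tool are illustrative examples only, not something shipped by Tome or Smithery; the point is that any MCP client can discover and call tools exposed this way without a bespoke integration.

```python
# Minimal MCP server sketch using the official Python SDK (`pip install mcp`).
# The server name and the `add` tool are illustrative examples only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b

if __name__ == "__main__":
    # Runs over stdio by default, so an MCP client can launch it and call its tools.
    mcp.run()
```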

Whether you want your LLM to search the web, drive a browser, or create and edit files on your machine, you can now do all of that and more with just a few clicks in Tome, powered by our Smithery integration.

What This Integration Enables

With this integration, Tome becomes a magical launchpad for over 4,000 MCP servers from the Smithery Registry, extending what your local models can do without writing a line of code.

Whether you’re a developer experimenting with new tools, a researcher running complex workflows, or an AI enthusiast tinkering away, Tome and Smithery make it dead simple to enchant your local LLMs with superpowers.

How to Get Started

First, grab the latest release of Tome.

Within Tome:
Open the MCP tab -> Click Smithery -> Pick an MCP server -> Click Install

From Smithery:
Find an MCP server -> Choose Tome from the Install menu

We’d Love Your Feedback

We’re eager to hear what you build, break and dream up with Tome! Join us on Discord and share your favorite workflows, weird experiments, or ideas for improving the experience. Be sure to follow along and star/watch the project on GitHub!