Supermemory

This is Supermemory’s official MCP server, which gives you a universal, persistent memory for all your Large Language Models (LLMs). Your “memories” (preferences, project details, and conversation history) travel with you to any compatible LLM client.

Features

  • 🚀 Universal Memory: Your context and history become available to any MCP-compatible LLM client you use.
  • 🔑 No Login Required: Access is granted via a unique, private URL, eliminating the need for user accounts or passwords.
  • 😱 Completely Free: The hosted service is free to use.
  • ⚙️ Simple Setup: An integrated command-line tool simplifies installation into various clients with a single command.
  • 🔧 Self-Hostable: For users who want full control, the server can be self-hosted with a personal API key.

Use Cases

  • Seamlessly Switching Between AIs: If you use ChatGPT for coding and Claude for creative writing, Supermemory MCP ensures Claude knows the context of the code you were just working on, and vice versa. You don’t have to copy-paste information between conversations.
  • Onboarding New LLM Tools: When you try a new LLM application, you can point it to your Supermemory MCP URL. The new tool will instantly have access to your established context and preferences, bypassing the typical “getting to know you” phase.
  • Maintaining Long-Term Project Knowledge: For projects spanning weeks or months, the server can retain key facts, decisions, and data points. You can ask a question about a decision made a month ago, and the LLM can retrieve the context from its memory.
  • Creating a Personalized Assistant: You can store personal facts like “My primary programming language is Python” or “I’m currently working on a project named ‘Apollo’”. The AI will then recall these details in future interactions for a more personalized experience.
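
As a rough mental model (not Supermemory’s actual implementation or API), a memory server boils down to two operations: store a fact, and retrieve facts relevant to a query. A minimal sketch in Python, with keyword overlap standing in for real semantic search:

```python
# Illustrative sketch only -- Supermemory's real server uses embeddings
# and a persistent database, not an in-memory list with keyword matching.

class MemoryStore:
    """Naive stand-in for a persistent memory service."""

    def __init__(self):
        self.memories: list[str] = []

    def add_memory(self, text: str) -> None:
        """Store one fact verbatim."""
        self.memories.append(text)

    def search_memories(self, query: str) -> list[str]:
        """Return memories sharing at least one word with the query."""
        terms = set(query.lower().split())
        return [m for m in self.memories if terms & set(m.lower().split())]


store = MemoryStore()
store.add_memory("My primary programming language is Python")
store.add_memory("I'm currently working on a project named Apollo")

# A later conversation, in any client, can recall the stored fact:
print(store.search_memories("language"))
# -> ["My primary programming language is Python"]
```

The point of MCP is that these two operations are exposed as standard tools, so every compatible client reads and writes the same store.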

How to Use It

For the Hosted Service:

  1. Navigate to https://mcp.supermemory.ai.
  2. The website will automatically generate a unique, secret URL for your memory session. Copy this URL and save it somewhere secure, as it is the key to your memory store.
  3. On the same page, select your LLM client (e.g., Claude, Cursor, Windsurf) from the dropdown menu.
  4. The site will generate a specific npx install-mcp command. Copy this command.
  5. Open your terminal, paste the command, and press Enter. The tool configures your client to use the Supermemory MCP server.
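
The exact file and schema that install-mcp writes vary by client, but for clients with native SSE support the resulting entry in the client’s MCP configuration is roughly of this shape (the server name and URL below are placeholders, not your real values):

```json
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/<your-unique-id>/sse"
    }
  }
}
```

If the generated command fails, checking your client’s MCP config file for an entry like this is a reasonable first debugging step.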

For Self-Hosting:

  1. Obtain an API key from the Supermemory console.
  2. Create a .env file for the project.
  3. Add your API key to the file on a line of the form SUPERMEMORY_API_KEY=<your key>.
  4. Run the server instance. It will now use your personal API key.
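
Assuming the project reads standard dotenv-style configuration, the .env file is a single line (the key shown is a placeholder, not a real credential):

```
SUPERMEMORY_API_KEY=sm_your_key_here
```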

FAQs

Q: What happens if I lose my unique URL or clear my cookies?
A: Your unique URL is your access key. If you lose it and your browser session is cleared, you will lose access to that specific memory store. It is best to save the URL in a password manager. The site does have a “Restore previous session” feature that may work if your cookies are still present.

Q: Is my data secure?
A: The system uses a long, randomly generated URL as a secret key, which is a tradeoff for convenience over traditional authentication. Anyone who has your URL can access your memories. For handling highly sensitive information, self-hosting the server is the recommended approach.
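
The “URL as secret” model is only as strong as the randomness of the token in the URL. As an illustration of why a long random token is practically unguessable (this is not Supermemory’s actual generation code), Python’s standard library can produce a URL-safe secret like so:

```python
import secrets

# 32 random bytes, about 256 bits of entropy, encoded URL-safe
# (43 characters from the A-Z a-z 0-9 - _ alphabet, no padding).
token = secrets.token_urlsafe(32)
print(f"https://mcp.supermemory.ai/{token}/sse")

# Brute-forcing 2**256 possibilities is infeasible, but the token is a
# bearer credential: anyone who sees the URL (logs, screenshots, synced
# browser history) can use it. Treat it like a password.
```

This is also why the FAQ above recommends a password manager: the URL is the credential, and there is no account recovery behind it.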

Q: Which LLM clients are compatible?
A: The server works with clients that support MCP and can connect via Server-Sent Events (SSE). The setup page includes a dropdown list of tested, compatible clients.


General MCP FAQs

Q: What exactly is the Model Context Protocol (MCP)?

A: MCP is an open standard, like a common language, that lets AI applications (clients) and external data sources or tools (servers) talk to each other. It helps AI models get the context (data, instructions, tools) they need from outside systems to give more accurate and relevant responses. Think of it as a universal adapter for AI connections.

Q: How is MCP different from OpenAI's function calling or plugins?

A: While OpenAI's tools allow models to use specific external functions, MCP is a broader, open standard. It covers not just tool use, but also providing structured data (Resources) and instruction templates (Prompts) as context. Being an open standard means it's not tied to one company's models or platform. OpenAI has even started adopting MCP in its Agents SDK.

Q: Can I use MCP with frameworks like LangChain?

A: Yes, MCP is designed to complement frameworks like LangChain or LlamaIndex. Instead of relying solely on custom connectors within these frameworks, you can use MCP as a standardized bridge to connect to various tools and data sources. There's potential for interoperability, like converting MCP tools into LangChain tools.

Q: Why was MCP created? What problem does it solve?

A: It was created because large language models often lack real-time information and connecting them to external data/tools required custom, complex integrations for each pair. MCP solves this by providing a standard way to connect, reducing development time, complexity, and cost, and enabling better interoperability between different AI models and tools.

Q: Is MCP secure? What are the main risks?

A: Security is a major consideration. While MCP includes principles like user consent and control, risks exist. These include potential server compromises leading to token theft, indirect prompt injection attacks, excessive permissions, context data leakage, session hijacking, and vulnerabilities in server implementations. Implementing robust security measures like OAuth 2.1, TLS, strict permissions, and monitoring is crucial.

Q: Who is behind MCP?

A: MCP was initially developed and open-sourced by Anthropic. However, it's an open standard with active contributions from the community, including companies like Microsoft and VMware Tanzu, which maintain official SDKs.
