MCP Servers

A directory of curated, open-source Model Context Protocol (MCP) servers. Search and discover MCP servers to enhance your AI capabilities.

Kiwi.com Flight Search

Use the Kiwi.com Flight Search MCP Server to find and book flights directly from your AI assistant using natural language.

Gemini Bridge

Connect your AI coding assistant to Google Gemini AI with this lightweight MCP server. Direct CLI integration, file analysis, and zero API costs.

Linked API

Install the Linked API MCP server to run LinkedIn searches, send messages, and analyze profiles directly from your AI assistant.

JMAP

A Deno-based MCP server with tools to search, send, and manage emails on any JMAP-compliant server.

Playwrightess

An MCP server for persistent Playwright evaluation. Keep browser sessions alive across agent requests with JavaScript-based playwright_eval.

Reddit

Install the MCP Reddit Server to connect your AI assistants to Reddit. Fetch hot threads, post content, and comments with simple commands.

Jina AI

Access Jina's Reader, Embeddings, and Reranker APIs through a single MCP interface for clean web content and semantic search.

Context Optimizer

The Context Optimizer MCP Server gives AI assistants tools to extract targeted info from files and terminals, preventing context overload.

Spec Workflow

An open-source MCP server for spec-driven development workflows with real-time dashboard monitoring and approval systems.

Shadcn UI

Use this MCP server to connect your AI assistants to the shadcn/ui library. Get components, blocks, and metadata without leaving your editor.

Imagician

Set up the Imagician MCP server to handle image manipulation. Supports batch resizing, format conversion (WebP, AVIF), and cropping.

Auto Favicon

An MCP server that generates a complete favicon set, including PWA manifest and Apple touch icons, from a single PNG or URL.

Vercel

The official Vercel MCP server provides a secure endpoint for AI tools like Claude to access your project logs, docs, and team data.

Spec-Driven Development

An MCP server for spec-driven development. Generate requirements, design docs, and code in a structured workflow using the EARS format.

Gemini CLI MCP

Enterprise MCP server connecting Gemini CLI with 400+ AI models through OpenRouter integration. Features 33 specialized tools and Redis-backed conversations.

Memory Service

Install a local-first, self-organizing memory for your AI assistant. Features dream-inspired consolidation, dual storage backends, and multi-client support.

Memorizer

Build lasting AI agent memory with Memorizer MCP server. Vector embeddings, semantic search, and relationship mapping for enhanced AI context retention.

Apple Health

Query Apple Health data using SQL or natural language with this MCP server. Includes automated reports and efficient data processing via DuckDB.

FAQs

Q: What exactly is the Model Context Protocol (MCP)?

A: MCP is an open standard, like a common language, that lets AI applications (clients) and external data sources or tools (servers) talk to each other. It helps AI models get the context (data, instructions, tools) they need from outside systems to give more accurate and relevant responses. Think of it as a universal adapter for AI connections.
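Under the hood, MCP messages are JSON-RPC 2.0, and tool invocations use the protocol's tools/call method. A minimal sketch of the request a client sends to a server (the tool name and arguments here are hypothetical, loosely modeled on a flight-search server like Kiwi.com's):

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool and arguments for illustration only:
msg = make_tool_call(1, "search_flights", {"from": "PRG", "to": "LHR"})
print(msg)
```

Because every client and server speaks this same message shape, any MCP client can drive any MCP server without bespoke glue code.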

Q: How is MCP different from OpenAI's function calling or plugins?

A: While OpenAI's tools allow models to use specific external functions, MCP is a broader, open standard. It covers not just tool use, but also providing structured data (Resources) and instruction templates (Prompts) as context. Being an open standard means it's not tied to one company's models or platform. OpenAI has even started adopting MCP in its Agents SDK.
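Those three context types (Tools, Resources, Prompts) each map onto their own request method in the protocol. A rough sketch, with method names as defined in the MCP spec and the payloads (tool name, resource URI, prompt name) invented for illustration:

```python
# Each MCP context primitive has its own JSON-RPC method family.
# The identifiers in the params below are made up for this example.
primitives = {
    "Tools": {"method": "tools/call",
              "params": {"name": "search_flights", "arguments": {"to": "LHR"}}},
    "Resources": {"method": "resources/read",
                  "params": {"uri": "file:///docs/policy.md"}},
    "Prompts": {"method": "prompts/get",
                "params": {"name": "summarize", "arguments": {"style": "brief"}}},
}

for kind, req in primitives.items():
    print(f"{kind}: {req['method']}")
```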

Q: Can I use MCP with frameworks like LangChain?

A: Yes, MCP is designed to complement frameworks like LangChain or LlamaIndex. Instead of relying solely on custom connectors within these frameworks, you can use MCP as a standardized bridge to connect to various tools and data sources. There's potential for interoperability, like converting MCP tools into LangChain tools.
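One reason such a bridge is straightforward: an MCP tool definition (as returned by a server's tools/list response) carries a name, a description, and a JSON Schema for its inputs, which is essentially what frameworks like LangChain need to register a tool. A minimal, framework-agnostic sketch of that mapping; the tool definition and the adapter record's field names are invented for illustration:

```python
def mcp_tool_to_framework_tool(mcp_tool: dict, call_fn) -> dict:
    """Adapt an MCP tool definition into a generic framework tool record.

    `call_fn` stands in for whatever sends a tools/call request
    to the MCP server and returns its result.
    """
    return {
        "name": mcp_tool["name"],
        "description": mcp_tool.get("description", ""),
        "args_schema": mcp_tool["inputSchema"],  # JSON Schema from tools/list
        "func": lambda **kwargs: call_fn(mcp_tool["name"], kwargs),
    }

# Hypothetical definition, shaped like a tools/list entry:
definition = {
    "name": "fetch_hot_threads",
    "description": "Fetch hot threads from a subreddit.",
    "inputSchema": {"type": "object",
                    "properties": {"subreddit": {"type": "string"}}},
}
tool = mcp_tool_to_framework_tool(definition, lambda name, args: (name, args))
print(tool["func"](subreddit="python"))
```

Real adapters (for example, LangChain's MCP adapter package) do the same mapping, plus transport handling and schema validation.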

Q: Why was MCP created? What problem does it solve?

A: Large language models often lack real-time information, and connecting them to external data or tools used to require a custom, complex integration for every model-tool pair. MCP solves this by providing a standard way to connect, reducing development time, complexity, and cost, and enabling interoperability between different AI models and tools.
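The integration savings can be made concrete: connecting M clients to N tools pairwise needs M × N custom integrations, while a shared protocol needs only M + N adapters (one per client, one per tool). A quick illustration:

```python
def integrations_needed(clients: int, tools: int) -> tuple[int, int]:
    """Custom pairwise integrations vs. adapters under a shared protocol."""
    return clients * tools, clients + tools

pairwise, with_protocol = integrations_needed(5, 20)
print(pairwise, with_protocol)  # 100 vs 25
```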

Q: Is MCP secure? What are the main risks?

A: Security is a major consideration. While MCP includes principles like user consent and control, risks exist. These include potential server compromises leading to token theft, indirect prompt injection attacks, excessive permissions, context data leakage, session hijacking, and vulnerabilities in server implementations. Implementing robust security measures like OAuth 2.1, TLS, strict permissions, and monitoring is crucial.

Q: Who is behind MCP?

A: MCP was initially developed and open-sourced by Anthropic. However, it's an open standard with active contributions from the community, including companies like Microsoft and VMware Tanzu, which maintain official SDKs.
