SearXNG Mul MCP
SearXNG Mul MCP is an MCP server that connects MCP clients like Claude Desktop to the privacy-focused SearXNG metasearch engine.
It handles search requests through the Model Context Protocol, giving your AI assistants direct access to real-time web results without compromising your privacy.
Features
- 🌐 Multi-Query Parallel Search: Run several search queries at once instead of waiting for each to finish sequentially
- 🔌 Dual Transport Support: Works with both stdio (for direct MCP client connections) and HTTP protocols
- ⚙️ SearXNG API Integration: Communicates directly with SearXNG’s REST API without browser automation
- 🔒 Basic Authentication: Connects to protected SearXNG instances requiring username/password
- 🐳 Docker Deployment: Ready-to-run container images for easy deployment
- 🛠️ Environment Configuration: Customize behavior through simple environment variables
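The multi-query feature can be illustrated with a short TypeScript sketch (assuming Node 18+ for built-in fetch; buildSearchUrl, searchOne, and searchMany are illustrative names, not the server's actual internals). Each query hits SearXNG's JSON API, and Promise.all dispatches them concurrently:

```typescript
// Illustrative sketch of multi-query parallel search against SearXNG's JSON API.
// Helper names are hypothetical; the real server's internals may differ.

// Build the /search URL using the JSON output format SearXNG supports.
function buildSearchUrl(base: string, query: string): string {
  const url = new URL("/search", base);
  url.searchParams.set("q", query);
  url.searchParams.set("format", "json");
  return url.toString();
}

interface SearchOutcome {
  query: string;
  results: Array<{ title: string; url: string; content?: string }>;
}

async function searchOne(base: string, query: string): Promise<SearchOutcome> {
  const res = await fetch(buildSearchUrl(base, query));
  if (!res.ok) throw new Error(`SearXNG returned ${res.status} for "${query}"`);
  const body = (await res.json()) as { results?: SearchOutcome["results"] };
  return { query, results: body.results ?? [] };
}

// All queries are dispatched at once rather than awaited one by one.
function searchMany(base: string, queries: string[]): Promise<SearchOutcome[]> {
  return Promise.all(queries.map((q) => searchOne(base, q)));
}
```

Because Promise.all starts every request before awaiting any of them, total latency is roughly that of the slowest query rather than the sum of all queries.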
Use Cases
- AI Research Automation: An AI agent can investigate a topic from multiple angles at once. Instead of searching for “topic history,” then “topic applications,” then “topic controversies” sequentially, it can execute all queries in a single, parallel request to gather comprehensive data much faster.
- Private Data Enrichment for Internal Tools: If you run a private, self-hosted SearXNG instance with curated search engines, you can use this server to let your internal AI tools access that specific information without touching the public web. The Basic Auth support keeps it secure.
- Building Hybrid AI Applications: You could have a core AI service that uses the server’s HTTP endpoint for its web front-end, while a companion desktop application for power users connects to the same server instance via the stdio protocol. This provides flexibility without needing separate backends.
How To Use It
1. For a direct connection with an MCP client (the default stdio mode), run this in your terminal:

```shell
SEARXNG_URL=https://your.searxng.com npx -y searxng-mul-mcp
```

2. If you need to access it over the network, start it in HTTP mode:

```shell
SEARXNG_URL=https://your.searxng.com npx -y searxng-mul-mcp --transport=http --host=0.0.0.0 --port=3000
```

3. Configuration is handled through environment variables:

- SEARXNG_URL: The URL of your public or private SearXNG instance. (Required)
- USERNAME: The username for Basic Authentication. (Optional)
- PASSWORD: The password for Basic Authentication. (Optional)
- TRANSPORT: Set to stdio or http. (Optional)
- HOST: The host address for HTTP mode, typically 0.0.0.0. (Optional)
- PORT: The port for HTTP mode, like 3000. (Optional)
- DEBUG: Set to true to enable detailed logging. (Optional)
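As a rough sketch of how these variables could map to a runtime configuration (loadConfig is a hypothetical helper; the package's actual parsing may differ):

```typescript
// Hypothetical mapping of the documented environment variables to a config
// object, with the defaults implied by the documentation above.
interface Config {
  searxngUrl: string;
  username?: string;
  password?: string;
  transport: "stdio" | "http";
  host: string;
  port: number;
  debug: boolean;
}

function loadConfig(env: Record<string, string | undefined>): Config {
  const searxngUrl = env.SEARXNG_URL;
  if (!searxngUrl) throw new Error("SEARXNG_URL is required");
  return {
    searxngUrl,
    username: env.USERNAME,
    password: env.PASSWORD,
    transport: env.TRANSPORT === "http" ? "http" : "stdio", // stdio is the default
    host: env.HOST ?? "0.0.0.0",
    port: Number(env.PORT ?? 3000),
    debug: env.DEBUG === "true",
  };
}
```

Calling loadConfig(process.env) at startup would fail fast when the one required variable is missing and fall back to sensible defaults for the rest.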
4. To integrate it with Claude Desktop, you edit the claude_desktop_config.json file. Add the following server configuration:

```json
{
  "mcpServers": {
    "searxng-mul-mcp": {
      "command": "npx",
      "args": ["-y", "searxng-mul-mcp"],
      "env": {
        "SEARXNG_URL": "https://your.searxng.com",
        "USERNAME": "your_username",
        "PASSWORD": "your_password"
      }
    }
  }
}
```

5. For a more permanent setup, Docker is the way to go. Create a docker-compose.yml file:

```yaml
services:
  searxng-mul-mcp:
    image: ghcr.io/jae-jae/searxng-mul-mcp:latest
    environment:
      - SEARXNG_URL=https://your.searxng.com
      # - USERNAME=your_username
      # - PASSWORD=your_password
    ports:
      - "3000:3000"
```

Then, launch it with docker-compose up -d.
6. The server exposes a single tool named search. It accepts several parameters to refine the search:
- queries: An array of search strings. This is the only required parameter.
- engines: A list of specific search engines to use (e.g., “google”, “bing”).
- categories: Filter results by categories like “general” or “news”.
- safesearch: Set the safe search level (0 for off, 1 for moderate, 2 for strict).
- language: Specify a language code, such as “en” for English.
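The parameter list above can be captured as a typed argument shape with a small validator (an illustrative sketch; the authoritative schema is defined by the server itself):

```typescript
// Illustrative shape of the search tool's arguments, mirroring the documented
// parameter list; the server defines the real schema.
interface SearchArgs {
  queries: string[];        // required: the batch of search strings
  engines?: string[];       // e.g. ["google", "bing"]
  categories?: string[];    // e.g. ["general", "news"]
  safesearch?: 0 | 1 | 2;   // off / moderate / strict
  language?: string;        // e.g. "en"
}

// Collect human-readable problems instead of failing on the first one.
function validateSearchArgs(args: SearchArgs): string[] {
  const errors: string[] = [];
  if (args.queries.length === 0) errors.push("queries must not be empty");
  // Guard against untyped input at runtime, where the type annotation no longer helps.
  if (args.safesearch !== undefined && ![0, 1, 2].includes(args.safesearch)) {
    errors.push("safesearch must be 0, 1, or 2");
  }
  return errors;
}
```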
FAQs
Q: What’s the main benefit of this server compared to other search tools?
A: The key advantage is parallel query execution. Most search tools handle one query at a time, but this one can process a whole batch at once, which dramatically speeds up information gathering for AI agents.
Q: Do I have to run my own SearXNG server?
A: No, you can point it to a public SearXNG instance. However, for privacy, reliability, and customization, deploying your own SearXNG server is the recommended approach.
Q: What happens if my SearXNG instance is down?
A: The server has built-in error handling, including an automatic retry mechanism for network errors and clear messages for authentication failures. If the connection fails persistently, you should check your SEARXNG_URL and ensure the instance is running correctly.
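The retry behavior described above can be approximated with a generic exponential-backoff wrapper (a sketch only; the package's actual retry policy is not documented in detail):

```typescript
// Generic retry-with-exponential-backoff sketch; the server's real policy
// (attempt count, delays, which errors retry) may differ.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 200,
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (i < attempts - 1) {
        // Exponential backoff: 200ms, 400ms, 800ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastErr; // all attempts exhausted
}
```

Wrapping each SearXNG request in withRetry smooths over transient network failures while still surfacing a persistent outage as an error.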