Microsoft Clarity
The Microsoft Clarity MCP Server lets you query your Clarity data using natural language through assistants like Claude.
Think of it as a direct line for AI to access and understand your site’s performance metrics—things like scroll depth, engagement time, and traffic sources.
You can use it to quickly get answers about user behavior without manually digging through dashboards or writing complex API calls.
Features
- 🗣️ Natural Language Queries: Ask for data in plain English.
- 📊 Dimension Filtering: Slice data by browser, OS, country, device, and more.
- 📈 Key Metric Retrieval: Access scroll depth, engagement time, total traffic, and other core stats.
- 🤖 AI Assistant Integration: Works smoothly with tools like Claude for Desktop.
Use Cases
- Quick Insights for Marketing Folks: Imagine your marketing team needs to see how mobile users engage with content versus desktop users. Instead of waiting for a data pull, they could ask an AI assistant connected to this server: “Show me scroll depth from the last two days, broken down by device.” The server then fetches and presents this data, often with useful observations, like tablets having the highest scroll depth or mobile needing optimization. This gets them answers fast.
- Devs Troubleshooting Site Issues: If you’re a developer and suspect a new feature is causing problems on a specific browser, you could prompt: “What’s the traffic and engagement time by browser for yesterday?” This can quickly highlight if, say, Firefox users are experiencing unusually low engagement time, pointing you toward a potential browser-specific bug. It cuts down the time spent manually cross-referencing analytics.
How to Use It
1. Prerequisites
- Node.js (version 16 or newer)
- A Microsoft Clarity project already set up
- A Data Export API Token from your Clarity project
- An MCP client, like Claude for Desktop
2. Installation and Running. Replace YOUR_TOKEN_HERE with your actual Clarity API token.

```shell
npx @microsoft/clarity-mcp-server --clarity_api_token=YOUR_TOKEN_HERE
```

3. Configure your MCP clients, such as Claude or Cursor.
```json
{
  "mcpServers": {
    "@microsoft/clarity-mcp-server": {
      "command": "npx",
      "args": [
        "@microsoft/clarity-mcp-server",
        "--clarity_api_token=your-api-token-here"
      ]
    }
  }
}
```

4. Getting Your API Token:
- Head to your Microsoft Clarity project.
- Go to Settings, then find the Data Export section.
- Click the button to Generate new API token.
- Copy this token and keep it somewhere safe. Treat it like a password!
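Once you have a token, it can help to confirm it works before wiring it into an MCP client. The sketch below builds an authenticated request against Clarity's Data Export API; the endpoint URL and the `numOfDays` parameter are taken from Clarity's Data Export documentation and should be treated as assumptions to verify against the current docs.

```python
import urllib.parse
import urllib.request

# Assumed endpoint for Clarity's Data Export API (check Clarity's docs).
EXPORT_URL = "https://www.clarity.ms/export-data/api/v1/project-live-insights"

def build_export_request(token: str, num_of_days: int = 1) -> urllib.request.Request:
    """Build an authenticated GET request for the last num_of_days days (1-3)."""
    if not 1 <= num_of_days <= 3:
        raise ValueError("Clarity only exports data for the last 1-3 days")
    query = urllib.parse.urlencode({"numOfDays": num_of_days})
    return urllib.request.Request(
        f"{EXPORT_URL}?{query}",
        # Token from Settings > Data Export in your Clarity project.
        headers={"Authorization": f"Bearer {token}"},
    )

if __name__ == "__main__":
    req = build_export_request("YOUR_TOKEN_HERE", num_of_days=1)
    print(req.full_url)
    # Uncomment to actually call the API (counts against the daily request limit):
    # with urllib.request.urlopen(req) as resp:
    #     print(resp.read().decode())
```

The live call is left commented out because each request counts against the Data Export API's daily quota; run it once to confirm you get a 200 rather than a 401.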
FAQs
Q: Is the Microsoft Clarity MCP Server free to use?
A: Yes, the MCP server itself and Microsoft Clarity’s analytics platform are free. Keep in mind the API request limits for the data export feature, though.
Q: How fresh is the data I can query?
A: You can request data for up to the last 3 days with each query.
Q: What are “dimensions” in the context of API limits?
A: Dimensions are how you segment your data. For example, if you ask for “traffic by country and device,” ‘country’ and ‘device’ are two dimensions. You can use up to three of these in a single API request.
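As a sketch of how this limit shows up at the API level: Clarity's Data Export docs describe dimensions passed as numbered query parameters (`dimension1` through `dimension3`). The parameter names and dimension values below follow that documentation but are assumptions to verify.

```python
import urllib.parse

def build_export_query(num_of_days: int, dimensions: list[str]) -> str:
    """Build the query string for a Clarity data export request.

    Clarity accepts at most three dimensions per request, passed as
    dimension1, dimension2, dimension3 (names assumed from Clarity's docs).
    """
    if len(dimensions) > 3:
        raise ValueError("Clarity allows at most 3 dimensions per request")
    params = {"numOfDays": num_of_days}
    for i, dim in enumerate(dimensions, start=1):
        params[f"dimension{i}"] = dim
    return urllib.parse.urlencode(params)

# "traffic by country and device" uses two of the three dimension slots:
print(build_export_query(2, ["Country", "Device"]))
# numOfDays=2&dimension1=Country&dimension2=Device
```

Asking for a fourth dimension in one request fails; if you need more cuts, split the question into multiple queries.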