Figma Dev Mode
The Dev Mode MCP Server is Figma’s official Model Context Protocol (MCP) server for connecting your design files directly to AI assistants, such as VS Code with Copilot or Claude Code.
It runs a local server on your machine that exposes design information, like component details, variables, and layout specs, through the Model Context Protocol.
Features
- Direct frame-to-code generation – Select any Figma frame and convert it to production-ready code
- Design token extraction – Pull variables, colors, typography, and spacing directly into your IDE
- Component library access – Access your Figma components and their specifications programmatically
- Code Connect integration – Generate code that uses your actual component library instead of generic HTML/CSS
- Multi-editor support – Works with VS Code, Cursor, Windsurf, and Claude Code
- Local server architecture – Runs securely on your machine without external dependencies
- Selection-based workflow – Generate code from whatever you have selected in Figma
- URL-based access – Reference specific design nodes using Figma links
Use Cases
- Frontend developers building new features – Convert mockups directly to React, Vue, or vanilla JavaScript components without manual translation
- Design system maintainers – Extract design tokens and component specifications to keep code libraries synchronized with Figma designs
- Product teams doing rapid prototyping – Quickly generate functional code from design iterations to test user flows and interactions
Prerequisites
You need a Dev or Full seat on Professional, Organization, or Enterprise Figma plans. The server only works with the Figma desktop app and requires an AI-enabled code editor that supports MCP servers.
Enable the MCP Server in Figma
- Download and open the latest version of the Figma desktop app.
- Create or open any Figma Design file.
- Click the Figma menu in the upper-left corner.
- Navigate to Preferences and select Enable Dev Mode MCP Server.
- Confirm the server is running. You’ll see a message at the bottom of the screen.
- The server runs locally at http://127.0.0.1:3845/sse and remains active as long as Figma is open.
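Because the server is just a local HTTP endpoint, you can sanity-check that it is up before configuring any client. A minimal Python sketch, assuming Figma is running with the MCP server enabled (the host, port, and `/sse` path come from the URL above):

```python
import http.client

HOST, PORT, PATH = "127.0.0.1", 3845, "/sse"

def sse_url(host: str = HOST, port: int = PORT, path: str = PATH) -> str:
    """Build the local SSE endpoint URL used in the client configs."""
    return f"http://{host}:{port}{path}"

def server_reachable(timeout: float = 2.0) -> bool:
    """Return True if something answers on the MCP port (Figma must be open)."""
    conn = http.client.HTTPConnection(HOST, PORT, timeout=timeout)
    try:
        conn.request("GET", PATH, headers={"Accept": "text/event-stream"})
        resp = conn.getresponse()
        return resp.status < 500
    except OSError:
        return False
    finally:
        conn.close()

if __name__ == "__main__":
    print(sse_url())
    print("reachable:", server_reachable())
```

If this reports unreachable, restart the Figma desktop app and re-check the Preferences toggle before debugging your editor configuration.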
Configure Your MCP Clients
VS Code
- Open VS Code settings (⌘ + , on Mac)
- Search for “MCP” and select Edit in settings.json
- Add this configuration:
```json
{
  "chat.mcp.discovery.enabled": true,
  "mcp": {
    "servers": {
      "Figma Dev Mode MCP": {
        "type": "sse",
        "url": "http://127.0.0.1:3845/sse"
      }
    }
  },
  "chat.agent.enabled": true
}
```
- Open the chat toolbar (⌥⌘B) and switch to Agent mode
- Look for “MCP Server: Figma Dev Mode MCP” in the selection tool menu
Cursor
- Open Cursor → Settings → Cursor Settings
- Go to the MCP tab and click + Add new global MCP server
- Enter this configuration:
```json
{
  "mcpServers": {
    "Figma": {
      "url": "http://127.0.0.1:3845/sse"
    }
  }
}
```
Windsurf
- Open settings (⌘ + ,) and navigate to Cascade settings
- Select Open plugin store and search for Figma
- Install the plugin and open Cascade to see available tools
Claude Code
```shell
claude mcp add --transport sse figma-dev-mode-mcp-server http://127.0.0.1:3845/sse
```
Generate Code from Designs
Selection-based approach:
- Select any frame or component in your Figma file
- In your code editor, prompt your AI assistant: “Generate React code for my current Figma selection”
- The AI will access the selected design and generate appropriate code
Link-based approach:
- Right-click any Figma frame and copy its link
- Paste the link in your prompt: “Create a Vue component based on this Figma design: [paste link]”
- The server extracts the node ID and provides design context to your AI
Available Tools
- get_selection_info – Retrieves details about currently selected Figma objects
- get_node_info – Fetches information about specific design nodes using their IDs
- extract_design_tokens – Pulls variables, colors, and typography from your design system
- get_component_data – Accesses component definitions and their properties
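Under the hood, MCP tools are invoked over JSON-RPC 2.0 with a `tools/call` request; your editor normally builds these messages for you, but the shape is simple to construct by hand. A hedged sketch of that request format (the tool name and argument keys here are illustrative, taken from the list above rather than a verified schema):

```python
import json
from itertools import count

_ids = count(1)  # JSON-RPC requests need unique ids

def tools_call(tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call JSON-RPC 2.0 request for the given tool."""
    request = {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(request)

# Example: ask for details about a specific node (hypothetical argument key).
msg = tools_call("get_node_info", {"nodeId": "12:34"})
```

The client sends this over the SSE transport and receives the tool’s result as a JSON-RPC response; you should rarely need to do this manually, but it is useful when debugging a misbehaving connection.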
FAQs
Q: Why can’t my editor connect to the server?
A: First, confirm the Figma desktop app is running and the “Enable Dev Mode MCP Server” option is checked in preferences. Second, verify the URL in your client’s settings is exactly http://127.0.0.1:3845/sse. A restart of both Figma and your code editor often fixes connection issues.
Q: Do I need a paid Figma account for this?
A: Yes. The feature is available on Dev or Full seats within the Professional, Organization, or Enterprise plans. It is not available on the Free plan.
Q: Can I use this with the Figma web app?
A: No. The MCP server runs locally and can only be enabled through the Figma desktop application at this time.
Q: What is Code Connect and how is it different from normal code generation?
A: Standard code generation creates new HTML and CSS based on the design’s appearance. Code Connect is a more advanced feature where you map your Figma components to your actual code components (e.g., a “Primary Button” in Figma maps to your <PrimaryButton> React component). When you prompt the AI, it will generate code that uses your existing, production-ready components instead of creating new styles from scratch.
Q: Can I use this server with Figma files I don’t own?
A: You need appropriate permissions to access the Figma file. The server respects Figma’s existing permission system, so you can only access designs you can normally view in Figma.
Q: Does this work with FigJam files?
A: No, the Dev Mode MCP Server only works with Figma Design files. FigJam files don’t contain the structured design data needed for code generation.
Q: Is my design data sent to external servers?
A: No, the MCP server runs entirely locally. Design data flows directly from your Figma app to your code editor without external transmission.