Figma Dev Mode

The Dev Mode MCP Server is Figma’s official MCP server for connecting your design files directly to AI assistants, such as VS Code with Copilot or Claude Code.

It runs a local server on your machine that exposes design information, like component details, variables, and layout specs, through the Model Context Protocol.

Features

  • 🎨 Direct frame-to-code generation – Select any Figma frame and convert it to production-ready code
  • 🔧 Design token extraction – Pull variables, colors, typography, and spacing directly into your IDE
  • 📦 Component library access – Access your Figma components and their specifications programmatically
  • 🔗 Code Connect integration – Generate code that uses your actual component library instead of generic HTML/CSS
  • 💻 Multi-editor support – Works with VS Code, Cursor, Windsurf, and Claude Code
  • 🌐 Local server architecture – Runs securely on your machine without external dependencies
  • 📋 Selection-based workflow – Generate code from whatever you have selected in Figma
  • 🔗 URL-based access – Reference specific design nodes using Figma links

Use Cases

  • Frontend developers building new features – Convert mockups directly to React, Vue, or vanilla JavaScript components without manual translation
  • Design system maintainers – Extract design tokens and component specifications to keep code libraries synchronized with Figma designs
  • Product teams doing rapid prototyping – Quickly generate functional code from design iterations to test user flows and interactions

Prerequisites

You need a Dev or Full seat on Professional, Organization, or Enterprise Figma plans. The server only works with the Figma desktop app and requires an AI-enabled code editor that supports MCP servers.

Enable the MCP Server in Figma

  1. Download and open the latest version of the Figma desktop app.
  2. Create or open any Figma Design file.
  3. Click the Figma menu in the upper-left corner.
  4. Navigate to Preferences and select Enable Dev Mode MCP Server.
  5. Confirm the server is running. You’ll see a message at the bottom of the screen.
  6. The server runs locally at http://127.0.0.1:3845/sse and remains active as long as Figma is open.
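Before configuring a client, you can sanity-check that the endpoint is reachable. The probe below is a minimal Python sketch (the URL is the one Figma reports above; the probe itself is illustrative and not part of Figma’s tooling):

```python
import urllib.request
import urllib.error

def dev_mode_server_up(url: str = "http://127.0.0.1:3845/sse", timeout: float = 2.0) -> bool:
    """Return True if something is listening at the Dev Mode MCP endpoint."""
    try:
        # An SSE endpoint keeps the connection open after sending headers,
        # so a successful response object means the port is live; we close
        # it immediately without reading the event stream.
        urllib.request.urlopen(url, timeout=timeout).close()
        return True
    except urllib.error.HTTPError:
        # The server answered, just not with a 200 — still reachable.
        return True
    except (urllib.error.URLError, OSError, TimeoutError):
        return False

if __name__ == "__main__":
    print("running" if dev_mode_server_up() else "not reachable")
```

If this reports "not reachable", recheck that the desktop app is open and the Enable Dev Mode MCP Server preference is on before touching any editor settings.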

Configure Your MCP Clients

VS Code

  1. Open VS Code settings (⌘ + , on Mac)
  2. Search for “MCP” and select Edit in settings.json
  3. Add this configuration:
{
  "chat.mcp.discovery.enabled": true,
  "mcp": {
    "servers": {
      "Figma Dev Mode MCP": {
        "type": "sse", 
        "url": "http://127.0.0.1:3845/sse"
      }
    }
  },
  "chat.agent.enabled": true
}
  4. Open the chat toolbar (⌥⌘B) and switch to Agent mode
  5. Look for “MCP Server: Figma Dev Mode MCP” in the selection tool menu

Cursor

  1. Open Cursor → Settings → Cursor Settings
  2. Go to the MCP tab and click + Add new global MCP server
  3. Enter this configuration:
{
  "mcpServers": {
    "Figma": {
      "url": "http://127.0.0.1:3845/sse"
    }
  }
}

Windsurf

  1. Open settings (⌘ + ,) and navigate to Cascade settings
  2. Select Open plugin store and search for Figma
  3. Install the plugin and open Cascade to see available tools

Claude Code

claude mcp add --transport sse figma-dev-mode-mcp-server http://127.0.0.1:3845/sse

Generate Code from Designs

Selection-based approach:

  1. Select any frame or component in your Figma file
  2. In your code editor, prompt your AI assistant: “Generate React code for my current Figma selection”
  3. The AI will access the selected design and generate appropriate code

Link-based approach:

  1. Right-click any Figma frame and copy its link
  2. Paste the link in your prompt: “Create a Vue component based on this Figma design: [paste link]”
  3. The server extracts the node ID and provides design context to your AI

Available Tools

  • get_code – Generates code for the current selection or a node referenced by link
  • get_variable_defs – Lists the variables and styles (colors, spacing, typography) used in a selection
  • get_code_connect_map – Returns the mapping between Figma node IDs and your codebase’s components
  • get_image – Captures a screenshot of the selection for visual context

FAQs

Q: Why can’t my editor connect to the server?
A: First, confirm the Figma desktop app is running and the “Enable Dev Mode MCP Server” option is checked in preferences. Second, verify the URL in your client’s settings is exactly http://127.0.0.1:3845/sse. A restart of both Figma and your code editor often fixes connection issues.

Q: Do I need a paid Figma account for this?
A: Yes. The feature is available on Dev or Full seats within the Professional, Organization, or Enterprise plans. It is not available on the Free plan.

Q: Can I use this with the Figma web app?
A: No. The MCP server runs locally and can only be enabled through the Figma desktop application at this time.

Q: What is Code Connect and how is it different from normal code generation?
A: Standard code generation creates new HTML and CSS based on the design’s appearance. Code Connect is a more advanced feature where you map your Figma components to your actual code components (e.g., a “Primary Button” in Figma maps to your <PrimaryButton> React component). When you prompt the AI, it will generate code that uses your existing, production-ready components instead of creating new styles from scratch.

Q: Can I use this server with Figma files I don’t own?
A: You need appropriate permissions to access the Figma file. The server respects Figma’s existing permission system, so you can only access designs you can normally view in Figma.

Q: Does this work with FigJam files?
A: No, the Dev Mode MCP Server only works with Figma Design files. FigJam files don’t contain the structured design data needed for code generation.

Q: Is my design data sent to external servers?
A: No, the MCP server runs entirely locally. Design data flows directly from your Figma app to your code editor without external transmission.


