Linked API

The Linked API MCP Server connects your AI assistant, like Claude or Cursor, directly to your LinkedIn account.

It works by giving the AI a set of tools to control LinkedIn through a secure cloud browser.

You can ask your assistant to perform tasks like searching for people, sending messages, or analyzing company profiles, and it will execute them for you.

Features

  • 🔎 Advanced Search: Execute complex searches for people and companies with specific filters.
  • 🤖 AI-Powered Actions: Let your AI send connection requests, messages, and post reactions on your behalf.
  • 📊 Data Retrieval: Pull detailed information from personal profiles, company pages, and posts.
  • 📈 Sales Navigator Support: Includes a separate set of tools for users with Sales Navigator accounts.
  • 💬 Conversation Management: Your AI can read existing conversations and help you draft replies.
  • ⚙️ Custom Workflows: Chain multiple actions together for complex automation sequences.

Use Cases

  • Automated Lead Generation. You can give your AI a prompt like, “Find 10 software engineers at fintech companies in London with 50-200 employees.” The server will perform the search, and you can follow up by asking the AI to analyze their profiles and draft personalized connection notes for each one. This cuts down on hours of manual searching.
  • Efficient Recruiting Outreach. Instead of manually searching for candidates, you can have your assistant find profiles with specific skills and experience. It can then handle the initial outreach, freeing you up to focus on conversations with interested candidates.
  • Context-Aware Messaging. If you need to follow up with a list of contacts, you can have your AI read your past conversations with each person. It can then suggest a relevant follow-up message based on the existing context, which feels much more natural than a generic template.

How To Use It

1. Get your API tokens. The MCP server requires them to link to your LinkedIn account.

  • Sign up on the Linked API Platform.
  • Select a plan and finish the purchase process.
  • Follow the steps to connect your LinkedIn account.
  • From your dashboard, copy the LINKED_API_TOKEN and the IDENTIFICATION_TOKEN.

If you manage more than one LinkedIn account, you will receive a unique identification token for each. This lets you run a separate MCP server instance for every account.
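
For example, with two connected accounts you could register the server twice in the same client configuration, one entry per identification token. The sketch below is modeled on the single-account configurations that follow; the server names and placeholder values are illustrative, not official examples.

{
  "mcpServers": {
    "linkedapi-account-1": {
      "command": "npx",
      "args": ["-y", "linkedapi-mcp@latest"],
      "env": {
        "LINKED_API_TOKEN": "your-linked-api-token-here",
        "IDENTIFICATION_TOKEN": "identification-token-for-account-1",
        "HEALTH_CHECK_PERIOD": "180"
      }
    },
    "linkedapi-account-2": {
      "command": "npx",
      "args": ["-y", "linkedapi-mcp@latest"],
      "env": {
        "LINKED_API_TOKEN": "your-linked-api-token-here",
        "IDENTIFICATION_TOKEN": "identification-token-for-account-2",
        "HEALTH_CHECK_PERIOD": "180"
      }
    }
  }
}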

A quick note on the HEALTH_CHECK_PERIOD variable you’ll see in the configurations: some LinkedIn tasks take a few minutes, and AI assistants often have short timeout limits. This setting controls how often the server sends a keep-alive signal, so the connection stays open during longer jobs and the tool doesn’t disconnect before your workflow is complete.

2. Configure your MCP clients:

Claude Desktop

  • Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "linkedapi": {
      "command": "npx",
      "args": ["-y", "linkedapi-mcp@latest"],
      "env": {
        "LINKED_API_TOKEN": "your-linked-api-token-here",
        "IDENTIFICATION_TOKEN": "your-identification-token-here",
        "HEALTH_CHECK_PERIOD": "180"
      }
    }
  }
}

Cursor

{
  "mcpServers": {
    "linkedapi": {
      "command": "npx",
      "args": ["-y", "linkedapi-mcp@latest"],
      "env": {
        "LINKED_API_TOKEN": "your-linked-api-token-here",
        "IDENTIFICATION_TOKEN": "your-identification-token-here",
        "HEALTH_CHECK_PERIOD": "600"
      }
    }
  }
}

You can also do this in the UI: go to Cursor Settings > Tools & Integrations > MCP Tools, then click “New MCP Server” and paste the configuration.

VS Code

{
  "mcpServers": {
    "linkedapi": {
      "command": "npx",
      "args": ["-y", "linkedapi-mcp@latest"],
      "env": {
        "LINKED_API_TOKEN": "your-linked-api-token-here",
        "IDENTIFICATION_TOKEN": "your-identification-token-here",
        "HEALTH_CHECK_PERIOD": "600"
      }
    }
  }
}

Windsurf

Add the configuration to ~/.codeium/windsurf/mcp_config.json.

{
  "mcpServers": {
    "linkedapi": {
      "command": "npx",
      "args": ["-y", "linkedapi-mcp@latest"],
      "env": {
        "LINKED_API_TOKEN": "your-linked-api-token-here",
        "IDENTIFICATION_TOKEN": "your-identification-token-here",
        "HEALTH_CHECK_PERIOD": "600"
      }
    }
  }
}

This can also be configured through the UI in Windsurf Settings > Advanced Settings > Cascade section by clicking “Add Server”.

Available Tools

Standard

Tool | Description
send_message | Send message to person
sync_conversation | Sync conversation for polling
check_connection_status | Check connection status with person
send_connection_request | Send connection request with optional note
withdraw_connection_request | Withdraw pending connection request
retrieve_pending_requests | Get all pending connection requests
retrieve_connections | Get your connections with filtering
remove_connection | Remove person from connections
search_companies | Search for companies with advanced filtering
search_people | Search for people with advanced filtering
fetch_company | Get company information with optional employees, decision makers, posts
fetch_person | Get person page information with optional experience, education, skills, posts
fetch_post | Get post information and engagement metrics
react_to_post | React to post (like, love, support, celebrate, insightful, funny)
comment_on_post | Leave comment on post
retrieve_ssi | Get current SSI (Social Selling Index)
retrieve_performance | Get LinkedIn dashboard analytics

Sales Navigator

Tool | Description
nv_send_message | Send message to person via Sales Navigator
nv_sync_conversation | Sync Sales Navigator conversation for polling
nv_search_companies | Search for companies with advanced filtering via Sales Navigator
nv_search_people | Search for people with advanced filtering via Sales Navigator
nv_fetch_company | Get company information with optional employees and decision makers from Sales Navigator
nv_fetch_person | Get person page information from Sales Navigator

Other Tools

Tool | Description
poll_conversations | Monitor Standard and Sales Navigator conversations for new messages
execute_custom_workflow | Execute custom workflow definition
get_workflow_result | Get workflow result by ID
get_api_usage_stats | Get Linked API usage statistics
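
If you want to sanity-check the server outside of a chat client, you can connect to it directly with the MCP TypeScript SDK and list the tools above. This is a minimal sketch, not an official Linked API example; it assumes Node.js with ESM, the @modelcontextprotocol/sdk package, and the environment variables from the configurations earlier.

// Minimal sketch: start the Linked API MCP server over stdio and list
// the tools it advertises. Not an official example; assumes the tokens
// are available as environment variables.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "linkedapi-mcp@latest"],
  env: {
    ...(process.env as Record<string, string>),
    LINKED_API_TOKEN: process.env.LINKED_API_TOKEN ?? "",
    IDENTIFICATION_TOKEN: process.env.IDENTIFICATION_TOKEN ?? "",
    HEALTH_CHECK_PERIOD: "180",
  },
});

const client = new Client({ name: "linkedapi-smoke-test", version: "1.0.0" });
await client.connect(transport);

// Print every tool name and description the server reports.
const { tools } = await client.listTools();
for (const tool of tools) {
  console.log(`${tool.name}: ${tool.description ?? ""}`);
}

await client.close();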

Usage Examples

Once configured, you can make requests in natural language:

  • “Find all decision makers at Acme Corp and send them connection requests.”
  • “Search for product managers at fintech companies in New York with 50-200 employees.”
  • “Tell me about ‘https://linkedin.com/in/jane-doe’, including their work history and experience.”
  • “Send a connection request to ‘https://linkedin.com/in/jane-doe’ and mention their recent article about AI in healthcare.”
  • “Get all my pending connection requests and withdraw each one.”

Your AI can also combine these tools to execute more complex, multi-step workflows.
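
As a rough illustration of what such a chain looks like at the tool-call level, the sketch below reuses the client from the previous sketch to run search_people and then fetch_person. The argument names (filters, personUrl) are illustrative assumptions, not documented input schemas; inspect the listTools() output for the real ones.

// Hypothetical two-step chain, reusing `client` from the sketch above.
// Argument shapes are guesses for illustration only.
const search = await client.callTool({
  name: "search_people",
  arguments: { filters: { title: "Product Manager", location: "New York" } },
});
console.log("search_people:", JSON.stringify(search.content, null, 2));

// Follow up on one profile, using the placeholder URL from the examples above.
const person = await client.callTool({
  name: "fetch_person",
  arguments: { personUrl: "https://linkedin.com/in/jane-doe" },
});
console.log("fetch_person:", JSON.stringify(person.content, null, 2));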

FAQs

Q: Is a paid Linked API account required?
A: Yes, the MCP server uses tokens that you get from a paid Linked API plan. These tokens authorize the server to act on your behalf.

Q: What is the purpose of the HEALTH_CHECK_PERIOD variable?
A: AI assistants often disconnect from a tool if it takes too long to respond. Some LinkedIn searches or workflows are slow. This setting keeps the connection active during these long-running tasks to prevent them from failing midway.

Q: Can I use this with more than one LinkedIn account?
A: Yes. Each LinkedIn account you connect to the Linked API platform gets its own unique IDENTIFICATION_TOKEN. You can run a separate MCP server configuration for each token.

Q: How does this interact with my LinkedIn account?
A: The service operates through a cloud browser to perform actions, which is a common method for automation. You should still be aware of LinkedIn’s usage policies to avoid sending too many requests in a short period.
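
If you script these tools yourself rather than driving them through a chat assistant, a simple way to respect that advice is to pace your calls. The sketch below is a generic throttling pattern reusing the client from the earlier sketches; the 30-second interval and the personUrl argument name are arbitrary assumptions, not Linked API recommendations.

// Generic pacing pattern: wait between automated actions so requests
// are not sent in rapid bursts. Interval and argument name are assumptions.
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

const profileUrls = ["https://linkedin.com/in/jane-doe"]; // placeholder list

for (const personUrl of profileUrls) {
  await client.callTool({ name: "check_connection_status", arguments: { personUrl } });
  await sleep(30_000); // pause about 30 seconds between calls
}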


General MCP FAQs

Q: What exactly is the Model Context Protocol (MCP)?

A: MCP is an open standard, like a common language, that lets AI applications (clients) and external data sources or tools (servers) talk to each other. It helps AI models get the context (data, instructions, tools) they need from outside systems to give more accurate and relevant responses. Think of it as a universal adapter for AI connections.

Q: How is MCP different from OpenAI's function calling or plugins?

A: While OpenAI's tools allow models to use specific external functions, MCP is a broader, open standard. It covers not just tool use, but also providing structured data (Resources) and instruction templates (Prompts) as context. Being an open standard means it's not tied to one company's models or platform. OpenAI has even started adopting MCP in its Agents SDK.

Q: Can I use MCP with frameworks like LangChain?

A: Yes, MCP is designed to complement frameworks like LangChain or LlamaIndex. Instead of relying solely on custom connectors within these frameworks, you can use MCP as a standardized bridge to connect to various tools and data sources. There's potential for interoperability, like converting MCP tools into LangChain tools.

Q: Why was MCP created? What problem does it solve?

A: It was created because large language models often lack real-time information and connecting them to external data/tools required custom, complex integrations for each pair. MCP solves this by providing a standard way to connect, reducing development time, complexity, and cost, and enabling better interoperability between different AI models and tools.

Q: Is MCP secure? What are the main risks?

A: Security is a major consideration. While MCP includes principles like user consent and control, risks exist. These include potential server compromises leading to token theft, indirect prompt injection attacks, excessive permissions, context data leakage, session hijacking, and vulnerabilities in server implementations. Implementing robust security measures like OAuth 2.1, TLS, strict permissions, and monitoring is crucial.

Q: Who is behind MCP?

A: MCP was initially developed and open-sourced by Anthropic. However, it's an open standard with active contributions from the community, including companies like Microsoft and VMware Tanzu, which maintain official SDKs.
