YouTube Transcript
youtube-transcript-mcp is an MCP server that hands any LLM the full text of a YouTube video in the same breath you ask a question.
Features
- 🎬 Instant transcript fetch from any public YouTube URL.
- 🔧 Zero-config once the server is wired into your MCP client.
- 🧪 Works with both Bun (bunx) and Node (npx) runtimes.
- 📝 Plain-text output. Easy for any LLM to ingest and reason over.
- 🛡️ Runs locally. No tokens leave your machine.
How to use it:
1. Add the MCP server to your MCP client’s config. Pick the runtime you already have.
With Bun

```json
{
  "mcpServers": {
    "youtube-transcript": {
      "command": "bunx",
      "args": ["--bun", "-y", "youtube-transcript-mcp"]
    }
  }
}
```

With Node

```json
{
  "mcpServers": {
    "youtube-transcript": {
      "command": "npx",
      "args": ["-y", "youtube-transcript-mcp"]
    }
  }
}
```

2. Once the MCP client shows the tool as available, any prompt that includes a YouTube URL is fair game. Examples:
- “Summarize this keynote: https://youtu.be/dQw4w9WgXcQ”
- “List the three main criticisms in this video: https://youtu.be/xyz123”
- “Extract code snippets from this tutorial: https://youtu.be/abc789”
3. The server returns the transcript as plain text and your LLM does the rest.
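Under the hood, MCP clients and servers speak JSON-RPC 2.0 (typically over stdio). The request below is a rough sketch of the tools/call message a client might issue; the tool name get_transcript and its argument shape are assumptions here, so check what the server actually advertises via tools/list in your client.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_transcript",
    "arguments": { "url": "https://youtu.be/dQw4w9WgXcQ" }
  }
}
```

Your MCP client builds and sends this for you; you never have to write it by hand.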
FAQs
Q: Does it work with age-restricted or private videos?
A: Only public videos are supported. If the transcript isn’t visible to a logged-out user, the call fails gracefully with a clear message.
Q: Any rate limits?
A: Google’s public endpoints are throttled like any normal browser request. If you’re batch-processing hundreds of URLs, add a small delay between calls to avoid 429s.
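For batch jobs, that delay is easy to add on the client side. A minimal sketch in Node, assuming a fetchTranscript callback that stands in for however your MCP client invokes the tool (it is a placeholder, not part of youtube-transcript-mcp's API):

```javascript
// Space out batched transcript requests to avoid HTTP 429s.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchAllWithDelay(urls, fetchTranscript, delayMs = 1000) {
  const results = [];
  for (const url of urls) {
    // Fetch one transcript, then pause before issuing the next request.
    results.push(await fetchTranscript(url));
    await sleep(delayMs);
  }
  return results;
}
```

A sequential loop with a fixed pause is deliberately simple; for hundreds of URLs it is usually safer than firing requests in parallel and retrying on 429s.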
Q: Captions vs. auto-generated transcripts. Does it matter?
A: The server pulls whatever YouTube returns first. If a creator has uploaded professional captions, you’ll get those; otherwise, you’ll get the auto transcript. The text still works fine for summarization.
Q: Can I run this inside Docker or on CI?
A: Absolutely. Just make sure the container has network egress to youtube.com. No extra secrets or API keys are needed.
General MCP FAQs
Q: What exactly is the Model Context Protocol (MCP)?
A: MCP is an open standard, like a common language, that lets AI applications (clients) and external data sources or tools (servers) talk to each other. It helps AI models get the context (data, instructions, tools) they need from outside systems to give more accurate and relevant responses. Think of it as a universal adapter for AI connections.
Q: How is MCP different from OpenAI's function calling or plugins?
A: While OpenAI's tools allow models to use specific external functions, MCP is a broader, open standard. It covers not just tool use, but also providing structured data (Resources) and instruction templates (Prompts) as context. Being an open standard means it's not tied to one company's models or platform. OpenAI has even started adopting MCP in its Agents SDK.
Q: Can I use MCP with frameworks like LangChain?
A: Yes, MCP is designed to complement frameworks like LangChain or LlamaIndex. Instead of relying solely on custom connectors within these frameworks, you can use MCP as a standardized bridge to connect to various tools and data sources. There's potential for interoperability, like converting MCP tools into LangChain tools.
Q: Why was MCP created? What problem does it solve?
A: It was created because large language models often lack real-time information and connecting them to external data/tools required custom, complex integrations for each pair. MCP solves this by providing a standard way to connect, reducing development time, complexity, and cost, and enabling better interoperability between different AI models and tools.
Q: Is MCP secure? What are the main risks?
A: Security is a major consideration. While MCP includes principles like user consent and control, risks exist. These include potential server compromises leading to token theft, indirect prompt injection attacks, excessive permissions, context data leakage, session hijacking, and vulnerabilities in server implementations. Implementing robust security measures like OAuth 2.1, TLS, strict permissions, and monitoring is crucial.
Q: Who is behind MCP?
A: MCP was initially developed and open-sourced by Anthropic. However, it's an open standard with active contributions from the community, including companies like Microsoft and VMware Tanzu that maintain official SDKs.



