TeslaMate MCP Server

The TeslaMate MCP Server connects the fantastic data-logging power of TeslaMate directly to your AI assistant.

It lets you ask plain-English questions about your car’s data and get immediate answers.

You can query driving stats, check battery health, or review charging history without needing to write a single line of SQL or build a custom dashboard.

Features

  • 🚗 Vehicle Information Access – Retrieve basic Tesla details, software updates, and current status
  • 🔋 Battery Health Monitoring – Track battery degradation, daily usage patterns, and health metrics
  • 📊 Driving Analytics – Access monthly summaries, distance tracking, and driving pattern analysis
  • ⚡ Charging Data Integration – Query charging sessions, location preferences, and charging habits
  • 🌡️ Efficiency Analysis – Analyze temperature impact on efficiency and power consumption patterns
  • 📍 Location Intelligence – Identify most visited locations and charging preferences
  • 💬 Natural Language Queries – Ask questions in conversational English instead of writing SQL
  • 🔌 MCP Protocol Support – Compatible with Claude Desktop and other MCP-enabled AI assistants

Use Cases

  • Fleet Management Analysis – Business owners with multiple Teslas can analyze usage patterns, efficiency metrics, and maintenance needs across their fleet without manual data compilation
  • Personal Driving Optimization – Tesla owners can identify inefficient driving habits, optimal charging locations, and battery health trends to improve their vehicle’s performance and longevity
  • Research and Data Analysis – Researchers studying electric vehicle usage patterns can query comprehensive datasets through natural language instead of complex SQL operations
  • Maintenance Planning – Vehicle owners can track tire pressure trends, battery degradation rates, and software update history to plan maintenance schedules and identify potential issues early

How to Use It

1. You need a working TeslaMate instance logging data to a PostgreSQL database, plus Python 3.11 or newer on the machine where you'll run the server.

2. Clone the repository to a local directory:

git clone https://github.com/yourusername/teslamate-mcp.git
cd teslamate-mcp

3. Install the necessary Python packages, either with uv:

uv sync

or with pip:

pip install -r requirements.txt

4. Create a file named .env in the root of the project folder (teslamate-mcp).

5. Add your PostgreSQL database connection string to this file. It should look like this:

DATABASE_URL=postgresql://username:password@hostname:port/teslamate

Replace username, password, hostname, and port with your actual database credentials. If you’re running the server on the same machine as your database, the hostname is often localhost.
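Before pointing anything at the database, it can help to confirm the connection string is well formed. A minimal sketch using only the Python standard library (the function name check_database_url is just for illustration, not part of the project):

```python
from urllib.parse import urlsplit

def check_database_url(url: str) -> dict:
    """Parse a DATABASE_URL and report the pieces the server needs.

    Raises ValueError if a required component is missing.
    """
    parts = urlsplit(url)
    if parts.scheme not in ("postgresql", "postgres"):
        raise ValueError(f"unexpected scheme: {parts.scheme!r}")
    fields = {
        "username": parts.username,
        "password": parts.password,
        "hostname": parts.hostname,
        "port": parts.port,
        "database": parts.path.lstrip("/"),
    }
    missing = [name for name, value in fields.items() if not value]
    if missing:
        raise ValueError(f"DATABASE_URL is missing: {', '.join(missing)}")
    return fields

if __name__ == "__main__":
    print(check_database_url("postgresql://teslamate:secret@localhost:5432/teslamate"))
```

If this raises, fix the .env entry before going any further; a malformed URL is the most common cause of startup failures.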

6. To make the server available to your AI assistant, tell the client how to launch it. For Claude Desktop, add an entry like the one below to the configuration file:

  • macOS Path: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows Path: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "teslamate": {
      "command": "uv",
      "args": ["run", "python", "/path/to/teslamate-mcp/main.py"],
      "env": {
        "DATABASE_URL": "postgresql://username:password@hostname:port/teslamate"
      }
    }
  }
}
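If you installed the dependencies with pip instead of uv, the same entry can point at your Python interpreter directly; the paths here are placeholders you'd adapt to your setup:

```json
{
  "mcpServers": {
    "teslamate": {
      "command": "python",
      "args": ["/path/to/teslamate-mcp/main.py"],
      "env": {
        "DATABASE_URL": "postgresql://username:password@hostname:port/teslamate"
      }
    }
  }
}
```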

7. Start the server manually from your terminal to test it:

uv run python main.py

8. If everything is configured correctly, your MCP client will now be able to connect to it and you can start asking questions about your Tesla.

Basic Information:

  • “What’s my Tesla’s current software version?”
  • “Show me my car’s basic specifications”

Battery Analysis:

  • “How much has my battery degraded over the past year?”
  • “What are my daily charging patterns?”

Driving Metrics:

  • “What’s my average efficiency this month?”
  • “Show me my longest road trips”

Location Data:

  • “Where do I charge most frequently?”
  • “What locations do I visit regularly?”

9. Available Query Categories

  • Location Intelligence – Most visited places, charging location analysis
  • Vehicle Status – Basic information, software updates, current state
  • Battery Health – Degradation tracking, usage patterns, tire pressure monitoring
  • Driving Analytics – Distance summaries, efficiency metrics, driving patterns
  • Charging Data – Session analysis, location preferences, charging habits
  • Efficiency Trends – Temperature impact analysis, power consumption patterns

10. To add custom queries:

  • Create SQL files in the queries/ directory
  • Add corresponding tool functions in main.py
  • Follow the existing error handling patterns
  • Test the queries against your TeslaMate database structure
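How tool functions are registered depends on the code already in main.py, but the file-loading half is simple enough to sketch. A hypothetical helper (load_query is not a name from the repo) that reads a named SQL file from the queries/ directory might look like:

```python
from pathlib import Path

def load_query(queries_dir: Path, name: str) -> str:
    """Return the SQL text for a named query file in the queries/ directory."""
    sql_path = queries_dir / f"{name}.sql"
    if not sql_path.is_file():
        available = sorted(p.stem for p in queries_dir.glob("*.sql"))
        raise FileNotFoundError(f"No query {name!r}; available: {available}")
    return sql_path.read_text(encoding="utf-8")
```

Keeping SQL in standalone files rather than inline strings makes each query easy to test against your database with a regular PostgreSQL client before wiring it into a tool.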

FAQs

Q: Does this server modify my TeslaMate database in any way?
A: No, it’s strictly for reading data. The SQL queries included are all SELECT statements, so it won’t change, add, or delete any of your logged information.
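If you want that guarantee enforced at the database level too, you can run the server as a PostgreSQL role that only has SELECT rights. A sketch, assuming the default teslamate database and public schema (the role name teslamate_ro is an arbitrary choice):

```sql
CREATE ROLE teslamate_ro LOGIN PASSWORD 'choose-a-password';
GRANT CONNECT ON DATABASE teslamate TO teslamate_ro;
GRANT USAGE ON SCHEMA public TO teslamate_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO teslamate_ro;
```

Then point DATABASE_URL at that role instead of the main TeslaMate user.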

Q: My TeslaMate database runs inside a Docker container. How do I connect to it?
A: You need to ensure the PostgreSQL port (usually 5432) from the Docker container is mapped to your host machine. In your DATABASE_URL, you would then use localhost or your machine’s IP address as the hostname.
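Assuming a standard TeslaMate docker-compose setup where the PostgreSQL service is named database, publishing the port to the host looks something like this (verify the service name against your own compose file):

```yaml
services:
  database:
    # ...existing TeslaMate database service definition...
    ports:
      - "127.0.0.1:5432:5432"  # host:container; bound to localhost only
```

Binding to 127.0.0.1 keeps the database reachable from the host without exposing it to the rest of your network.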

Q: How do I handle database connection errors?
A: Check your DATABASE_URL format and ensure your TeslaMate database is accessible. The connection string should include the correct username, password, hostname, port, and database name. Test connectivity with a PostgreSQL client first.

Q: Can I query data from multiple Tesla vehicles?
A: Yes, if your TeslaMate installation tracks multiple vehicles, the server can access data from all vehicles in the database. You can specify which vehicle in your queries or ask for combined analytics.
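TeslaMate keeps one row per vehicle, so you can see which IDs exist with a quick query (table and column names here follow the common TeslaMate schema; double-check against your version):

```sql
SELECT id, name, vin FROM cars ORDER BY id;
```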

Q: What happens if my TeslaMate database is large?
A: The server includes proper error handling and connection management. For large datasets, queries may take longer to execute. You can optimize performance by asking for specific date ranges or filtered results.
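For example, rather than scanning every drive ever logged, a query can be restricted to a recent window. A sketch against the commonly documented TeslaMate drives table (verify the column names for your schema version):

```sql
SELECT count(*) AS drives,
       round(sum(distance)::numeric, 1) AS km
FROM drives
WHERE start_date >= now() - interval '90 days';
```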

Q: How do I troubleshoot query failures?
A: Check the server logs for database connection issues, verify your SQL queries are valid for your TeslaMate schema version, and ensure the database user has proper read permissions on TeslaMate tables.


