Laravel Boost
The Laravel Boost MCP Server comes with over 15 tools designed for Laravel developers who use AI in their local development workflow.
It provides AI coding assistants with the necessary context and structure about your application to generate high-quality, framework-specific code.
Key Features
- 🛠️ 15+ specialized tools for real-time project inspection, including database schema, routes, config, and logs
- 📚 Semantic documentation search across 17,000+ Laravel knowledge elements with embeddings for precise results
- 🎯 Version-aware guidelines for Laravel, Livewire, Filament, and other ecosystem packages
- 🔧 Tinker integration for executing code within your application context
- 📝 Custom guideline support through simple Blade template files
- 🔍 Environment-aware coding that understands your specific PHP version, database engine, and packages
Use Cases
- Rapid Prototyping: You can ask an AI agent to scaffold a new feature. For instance, “Create a new Product model and migration, register a resource controller for it, and add the corresponding routes.” Boost provides the AI with the context of your existing models and routing files to perform these actions correctly.
- Intelligent Debugging: When you encounter a bug, you can instruct your AI assistant to investigate. A prompt like “My last action resulted in an error. Read the last error from the log, inspect the users table schema, and check the route definition for /users/{id}” uses multiple tools to diagnose the problem without you manually checking each of those things.
- Code Refactoring and Updates: Suppose you need to update a component to align with a new version of Livewire. You can tell the AI, “Refactor this Livewire component to use the new syntax from version 3, based on the installed Livewire v3 guidelines.” The AI will use the specific guidelines provided by Boost to generate updated, accurate code.
- Automated Testing: You can accelerate test creation by having the AI write tests for you. For example, “Write a Pest test for the StorePostRequest form request. Ensure it tests the validation rules defined in the class.” The AI can inspect the file and use the Pest guidelines to generate a relevant test case.
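A prompt like the last one might produce a test along these lines. This is a hypothetical sketch, not Boost output: the StorePostRequest class, its title and body fields, and the pipe-string rule format are all assumptions for illustration.

```php
<?php

use App\Http\Requests\StorePostRequest;

// Hypothetical test: assumes StorePostRequest declares `title` and `body`
// rules in pipe-string form, e.g. 'title' => 'required|string|max:255'.
it('requires a title and body', function () {
    $rules = (new StorePostRequest())->rules();

    expect($rules)->toHaveKeys(['title', 'body'])
        ->and($rules['title'])->toContain('required');
});
```

Because Boost lets the AI read the actual form request class first, the generated assertions can target the rules that really exist rather than guessed ones.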
How to Use It
1. Install Laravel Boost through Composer as a dev dependency:

```shell
composer require laravel/boost --dev
```

2. Run the boost:install command. This installs the necessary MCP server configurations and the AI guidelines:

```shell
php artisan boost:install
```

3. You can now connect your AI coding tool, such as Cursor or Claude Code, to the server.
4. Available tools:
- Application Info: Reads your project’s PHP and Laravel versions, database engine, installed ecosystem packages with their versions, and a list of all Eloquent models.
- Browser Logs: Accesses and reads logs and errors directly from the browser’s console.
- Database Connections: Inspects the available database connections configured in your application, including the default one.
- Database Query: Executes a raw SQL query directly against your application’s database.
- Database Schema: Reads the structure of your database, including tables, columns, and their types.
- Get Absolute URL: Converts relative URIs within your application to absolute, fully-qualified URLs.
- Get Config: Retrieves a specific value from your application’s configuration files using “dot” notation (e.g., app.name).
- Last Error: Reads the single most recent error message from the Laravel log files.
- List Artisan Commands: Fetches and displays a list of all available Artisan commands for your application.
- List Available Config Keys: Inspects and lists all the available configuration keys.
- List Available Env Vars: Inspects and lists all available environment variable keys from your .env file.
- List Routes: Displays a list of all the defined routes in your application’s routing files.
- Read Log Entries: Reads a specified number (N) of the most recent entries from the application’s log files.
- Report Feedback: Allows you to send feedback about your experience with Laravel Boost directly to the development team.
- Search Docs: Queries the extensive Laravel documentation API to find relevant information based on your installed packages.
- Tinker: Executes arbitrary PHP code within the context of your Laravel application, similar to running the php artisan tinker command.
5. You can extend Laravel Boost with your own AI guidelines. Add .blade.php files to your application’s .ai/guidelines/ directory. The boost:install command will automatically pick them up.
6. To override a default Boost guideline, create a file in your local .ai/guidelines/ directory that mirrors the path of the guideline you want to replace. For instance, to override the Inertia React v2 form guidance, you would create a file at .ai/guidelines/inertia-react/2/forms.blade.php. Your custom file will be used instead of the default one.
7. In some cases, you might need to register the MCP server manually in your editor. Use the following details for the configuration.
Here is a JSON example of what the configuration looks like:

```json
{
  "mcpServers": {
    "laravel-boost": {
      "command": "php",
      "args": ["artisan", "boost:mcp"]
    }
  }
}
```

FAQs
Q: How is this different from a standard AI chatbot?
A: Context. A standard chatbot has general knowledge of Laravel, but it knows nothing about your specific project. Laravel Boost acts as a bridge, giving the AI direct, real-time access to your application’s code, database schema, routes, and configurations. This leads to far more accurate and relevant code generation.
Q: Is Laravel Boost a paid product?
A: Laravel Boost is an official Laravel package and is open-source. You can install it via Composer.
Q: What happens if AI guidelines for my specific package version don’t exist?
A: Laravel Boost will fall back to the core guidelines for that package. These core guidelines provide generalized advice that is broadly applicable. You can also create your own version-specific guidelines in your project’s .ai/guidelines directory for more precise control.
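Such a project-specific guideline is just a Blade file containing plain-text instructions for the AI. A minimal sketch, where the file name and the guidance itself are illustrative rather than an official example:

```blade
{{-- .ai/guidelines/project-conventions.blade.php (hypothetical file) --}}
Always use constructor property promotion in new PHP classes.
Every new endpoint must ship with a corresponding Pest feature test.
Use enum-backed casts for status columns on Eloquent models.
```

After adding a file like this, re-running boost:install includes it alongside the bundled guidelines.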
Q: What exactly is the Model Context Protocol (MCP)?
A: MCP is an open standard, like a common language, that lets AI applications (clients) and external data sources or tools (servers) talk to each other. It helps AI models get the context (data, instructions, tools) they need from outside systems to give more accurate and relevant responses. Think of it as a universal adapter for AI connections.
Q: How is MCP different from OpenAI's function calling or plugins?
A: While OpenAI's tools allow models to use specific external functions, MCP is a broader, open standard. It covers not just tool use, but also providing structured data (Resources) and instruction templates (Prompts) as context. Being an open standard means it's not tied to one company's models or platform. OpenAI has even started adopting MCP in its Agents SDK.
Q: Can I use MCP with frameworks like LangChain?
A: Yes, MCP is designed to complement frameworks like LangChain or LlamaIndex. Instead of relying solely on custom connectors within these frameworks, you can use MCP as a standardized bridge to connect to various tools and data sources. There's potential for interoperability, like converting MCP tools into LangChain tools.
Q: Why was MCP created? What problem does it solve?
A: It was created because large language models often lack real-time information and connecting them to external data/tools required custom, complex integrations for each pair. MCP solves this by providing a standard way to connect, reducing development time, complexity, and cost, and enabling better interoperability between different AI models and tools.
Q: Is MCP secure? What are the main risks?
A: Security is a major consideration. While MCP includes principles like user consent and control, risks exist. These include potential server compromises leading to token theft, indirect prompt injection attacks, excessive permissions, context data leakage, session hijacking, and vulnerabilities in server implementations. Implementing robust security measures like OAuth 2.1, TLS, strict permissions, and monitoring is crucial.
Q: Who is behind MCP?
A: MCP was initially developed and open-sourced by Anthropic. However, it's an open standard with active contributions from the community, including companies like Microsoft and VMware Tanzu who maintain official SDKs.