What is LMStudio MCP?
LMStudio MCP is a Model Context Protocol (MCP) server implementation designed specifically to integrate Claude with locally running language models managed by LM Studio. It provides a standardized protocol interface that allows Claude to perform various actions—such as querying models, generating completions, and managing model state—directly via your local infrastructure. This setup creates a powerful hybrid environment where you can combine Claude's strengths with custom or private models running on your own hardware.
How to Configure LMStudio MCP
- Prerequisites:
- Install and launch LM Studio, with at least one model loaded and running (typically on port 1234).
- Ensure Python 3.7+ is available, or prepare a Docker environment.
- Claude must have MCP access enabled.
- Installation:
- Recommended (one-line install):
curl -fsSL https://raw.githubusercontent.com/infinitimeless/LMStudio-MCP/main/install.sh | bash
- Manual Install (Python):
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
pip install requests "mcp[cli]" openai
- Docker:
docker run -it --network host ghcr.io/infinitimeless/lmstudio-mcp:latest
- Docker Compose:
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
docker-compose up -d
- MCP Configuration:
- For Claude, add an MCP configuration entry. For example:
{
  "lmstudio-mcp": {
    "command": "uvx",
    "args": ["https://github.com/infinitimeless/LMStudio-MCP"]
  }
}
- Alternatively, use the provided docker or python-based scripts with the appropriate command arguments.
- Reference:
- For advanced deployment and troubleshooting, see the project's MCP_CONFIGURATION.md and DOCKER.md.
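As a sketch of how the configuration step could be automated, the snippet below merges the JSON entry shown above into an existing Claude configuration dictionary. The `mcpServers` key reflects the usual layout of Claude Desktop's config file, but treat it as an assumption and consult the project's MCP_CONFIGURATION.md for authoritative details:

```python
import json

# The MCP server entry from the configuration section above; "uvx" fetches
# and runs the project directly from its GitHub URL.
LMSTUDIO_MCP_ENTRY = {
    "lmstudio-mcp": {
        "command": "uvx",
        "args": ["https://github.com/infinitimeless/LMStudio-MCP"],
    }
}

def merged_config(existing: dict) -> dict:
    """Return a copy of an existing Claude config with the LMStudio MCP
    entry added under the assumed "mcpServers" key, preserving any
    servers already registered there."""
    out = dict(existing)
    servers = dict(out.get("mcpServers", {}))
    servers.update(LMSTUDIO_MCP_ENTRY)
    out["mcpServers"] = servers
    return out

if __name__ == "__main__":
    print(json.dumps(merged_config({}), indent=2))
```

Merging rather than overwriting keeps any MCP servers you have already configured intact.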
How to Use LMStudio MCP
- Start LM Studio and ensure your desired model is loaded on the default port (1234).
- Launch LMStudio MCP using your chosen install method (local, Docker, etc.).
- Configure Claude with MCP using the relevant configuration snippet.
- Connect via Claude:
- When Claude prompts you for MCP connection, select the LMStudio MCP server.
- Interact:
- Use Claude’s interface to list models, query the active model, or generate completions using your own local LLMs.
- Monitor & Maintain:
- Ensure LM Studio remains running and models are accessible to maintain a seamless connection.
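The first step above can be sanity-checked with a short script that asks LM Studio's OpenAI-compatible API which models are currently loaded. This is a minimal sketch; the base URL assumes LM Studio's usual default of port 1234, so adjust it if you changed the port:

```python
import json
import urllib.request

def lmstudio_models(base_url: str = "http://localhost:1234/v1",
                    timeout: float = 3.0):
    """Return the list of loaded model IDs from LM Studio's
    OpenAI-compatible /models endpoint, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
            data = json.load(resp)
        # The OpenAI-style response wraps models in a "data" array.
        return [m["id"] for m in data.get("data", [])]
    except OSError:
        # Covers connection refused, timeouts, and DNS failures.
        return None
```

A `None` result means the server is not reachable at all, which points to a startup or networking problem rather than a model issue.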
Key Features
- Local-Remote Bridging: Connect Claude to your own local language models via the MCP protocol.
- Health Checking: Quickly verify the status and accessibility of your LM Studio API.
- Model Discovery: List and query all models available in LM Studio from within Claude.
- Seamless Text Generation: Generate completions using your private models, leveraging Claude’s interface.
- Flexible Deployment: Multiple install and deployment options (bare-metal Python, Docker, Compose, Kubernetes, or GitHub-hosted).
- Enhanced Privacy: No data is sent to any third-party LLM provider—your completions are fully local.
- Open Source & Extensible: Freely modify and contribute to the project for custom use cases.
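To make the text-generation feature concrete, here is a hedged sketch of the JSON body that LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint accepts. Parameter names follow the OpenAI chat API convention; the exact set a given local model honors may vary:

```python
import json

def build_chat_request(model: str, prompt: str,
                       temperature: float = 0.7,
                       max_tokens: int = 256) -> bytes:
    """Build the JSON body for an OpenAI-style chat completion request,
    ready to POST to LM Studio's /v1/chat/completions endpoint."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
    return json.dumps(body).encode("utf-8")
```

When LMStudio MCP brokers a completion on Claude's behalf, a request of roughly this shape is what ultimately reaches your local model.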
Use Cases
- Hybrid LLM Integration: Use Claude's user-friendly interface to interact with custom or proprietary models you run locally, benefiting from the strengths of both.
- On-Premise Secure Workflows: Generate completions and manage language models within a firewalled or enterprise environment without cloud reliance.
- Testing and Evaluation: Easily test, compare, and switch between different local models with minimal reconfiguration effort.
- Development Prototyping: Enable developers to automate, benchmark, or prototype agent workflows using both Claude and custom models.
FAQ
Q1: Why can't Claude connect to my LM Studio MCP server?
A1: Ensure LM Studio is running and listening on the default port (1234), and that a model is loaded. Check firewall or host networking settings, and try switching API URLs from "localhost" to "127.0.0.1".
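A quick way to narrow down the cause is a raw TCP check, which tells you whether anything is listening on the port independent of HTTP or model state. A small sketch:

```python
import socket

def port_open(host: str = "127.0.0.1", port: int = 1234,
              timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds,
    i.e. something (hopefully LM Studio) is listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If `port_open()` returns False, the problem is LM Studio itself or a firewall; if True, look instead at the model and the API URL Claude is using.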
Q2: Some models do not respond or behave unexpectedly—what should I do?
A2: Certain models may not fully support the OpenAI-compatible API protocol required by LMStudio MCP. Try different models or adjust parameters such as temperature and max_tokens. See the compatibility notes in the documentation.
Q3: Is internet access required for using LMStudio MCP?
A3: Internet access is needed only during installation, when using the GitHub-hosted or Docker image options. Once set up, all model interactions are local, and no internet is required for model execution or completions.
Q4: Can I run LMStudio MCP inside a container for development or production?
A4: Yes, LMStudio MCP provides official Docker images, Docker Compose, and Kubernetes manifests to facilitate isolated and scalable deployment options.