Claude-LMStudio Bridge
An MCP server that bridges Claude with local LLMs running in LM Studio.
Overview
This tool allows Claude to interact with your local LLMs running in LM Studio, providing:
- Access to list all available models in LM Studio
- The ability to generate text using your local LLMs
- Support for chat completions through your local models
- A health check tool to verify connectivity with LM Studio
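Under the hood, these tools map onto LM Studio's OpenAI-compatible REST API. The sketch below is illustrative rather than the bridge's actual code: it assumes the default server address (127.0.0.1:1234), uses a placeholder model name, and relies on httpx, which the bridge already depends on.
import httpx

# Assumed default LM Studio API address; change host/port if yours differ.
BASE_URL = "http://127.0.0.1:1234/v1"

# List the models LM Studio currently has loaded (OpenAI-compatible endpoint).
models = httpx.get(f"{BASE_URL}/models", timeout=10).json()
print([m["id"] for m in models.get("data", [])])

# Send a chat completion to a local model ("local-model" is a placeholder id).
reply = httpx.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "local-model",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=60,
)
print(reply.json()["choices"][0]["message"]["content"])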
Prerequisites
- Claude Desktop with MCP support
- LM Studio installed and running locally with API server enabled
- Python 3.8+ installed
Quick Start (Recommended)
For macOS/Linux:
- Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
- Run the setup script
chmod +x setup.sh
./setup.sh
- Follow the setup script's instructions to configure Claude Desktop
For Windows:
- Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
- Run the setup script
setup.bat
- Follow the setup script's instructions to configure Claude Desktop
Manual Setup
If you prefer to set things up manually:
- Create a virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install the required packages
pip install -r requirements.txt
- Configure Claude Desktop:
- Open Claude Desktop preferences
- Navigate to the 'MCP Servers' section
- Add a new MCP server with the following configuration:
- Name: lmstudio-bridge
- Command: /bin/bash (on macOS/Linux) or cmd.exe (on Windows)
- Arguments:
- macOS/Linux: /path/to/claude-lmstudio-bridge/run_server.sh
- Windows: /c C:\path\to\claude-lmstudio-bridge\run_server.bat
Usage with Claude
After setting up the bridge, you can use the following commands in Claude:
- Check the connection to LM Studio:
Can you check if my LM Studio server is running?
- List available models:
List the available models in my local LM Studio
- Generate text with a local model:
Generate a short poem about spring using my local LLM
- Send a chat completion:
Ask my local LLM: "What are the main features of transformers in machine learning?"
Troubleshooting
Diagnosing LM Studio Connection Issues
Use the included debugging tool to check your LM Studio connection:
python debug_lmstudio.py
For more detailed tests:
python debug_lmstudio.py --test-chat --verbose
Common Issues
"Cannot connect to LM Studio API"
- Make sure LM Studio is running
- Verify the API server is enabled in LM Studio (Settings > API Server)
- Check that the port (default: 1234) matches what's in your .env file
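Beyond the checklist above, a few lines of Python can probe the API directly (this assumes the default 127.0.0.1:1234 address; adjust it to match your .env):
import httpx

try:
    # /v1/models is a cheap endpoint to confirm the API server is up.
    r = httpx.get("http://127.0.0.1:1234/v1/models", timeout=5)
    print("LM Studio responded with HTTP", r.status_code)
except httpx.ConnectError:
    print("Could not reach the LM Studio API - check that the server is enabled and the port is correct")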
"No models are loaded"
- Open LM Studio and load a model
- Verify the model is running successfully
"MCP package not found"
- Try reinstalling:
pip install "mcp[cli]" httpx python-dotenv
- Make sure you're using Python 3.8 or later
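To confirm the packages import in the interpreter the bridge will actually run under, a quick check (the module names below are the standard import names for these packages):
import sys
import mcp      # provided by the "mcp[cli]" package
import httpx
import dotenv   # provided by the python-dotenv package

print("Python", sys.version.split()[0], "- all bridge dependencies import cleanly")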
"Claude can't find the bridge"
- Check Claude Desktop configuration
- Make sure the path to run_server.sh or run_server.bat is correct and absolute
- Verify the server script is executable (on macOS/Linux):
chmod +x run_server.sh
Advanced Configuration
You can customize the bridge behavior by creating a .env file with these settings:
LMSTUDIO_HOST=127.0.0.1
LMSTUDIO_PORT=1234
DEBUG=false
Set DEBUG=true to enable verbose logging for troubleshooting.
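For illustration, here is roughly how such settings are typically read with python-dotenv (a sketch under the assumption that the bridge builds its API URL from these values; it is not the project's actual source):
import os
from dotenv import load_dotenv

load_dotenv()  # loads .env from the current working directory

LMSTUDIO_HOST = os.getenv("LMSTUDIO_HOST", "127.0.0.1")
LMSTUDIO_PORT = int(os.getenv("LMSTUDIO_PORT", "1234"))
DEBUG = os.getenv("DEBUG", "false").lower() == "true"

# Base URL for LM Studio's OpenAI-compatible API
BASE_URL = f"http://{LMSTUDIO_HOST}:{LMSTUDIO_PORT}/v1"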
License
MIT
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.
Related MCP Servers
chrisdoc hevy mcp
sylphlab pdf reader mcp
An MCP server built with Node.js/TypeScript that allows AI agents to securely read PDF files (local or URL) and extract text, metadata, or page counts. Uses pdf-parse.
aashari mcp server atlassian bitbucket
Node.js/TypeScript MCP server for Atlassian Bitbucket. Enables AI systems (LLMs) to interact with workspaces, repositories, and pull requests via tools (list, get, comment, search). Connects AI directly to version control workflows through the standard MCP interface.
aashari mcp server atlassian confluence
Node.js/TypeScript MCP server for Atlassian Confluence. Provides tools enabling AI systems (LLMs) to list/get spaces & pages (content formatted as Markdown) and search via CQL. Connects AI seamlessly to Confluence knowledge bases using the standard MCP interface.
prisma prisma
Next-generation ORM for Node.js & TypeScript | PostgreSQL, MySQL, MariaDB, SQL Server, SQLite, MongoDB and CockroachDB
Zzzccs123 mcp sentry
mcp sentry for typescript sdk
zhuzhoulin dify mcp server
zhongmingyuan mcp my mac
zhixiaoqiang desktop image manager mcp
An MCP server for managing desktop images: viewing details, compressing, moving, and more (implemented entirely with Trae).
zhixiaoqiang antd components mcp
An MCP service for querying Ant Design components, built to reduce hallucinations when generating Ant Design component code; includes system prompts, component documentation, API documentation, code examples, and changelog queries.