blender-open-mcp
by dhakalnirajan
Open Models MCP for Blender Using Ollama

What is blender-open-mcp?
blender-open-mcp is an open source project that integrates Blender with local AI models (via Ollama) using the Model Context Protocol (MCP). This allows you to control Blender using natural language prompts, leveraging the power of AI to assist with 3D modeling tasks.
Features
- Control Blender with Natural Language: Send prompts to a locally running Ollama model to perform actions in Blender.
- MCP Integration: Uses the Model Context Protocol for structured communication between the AI model and Blender.
- Ollama Support: Designed to work with Ollama for easy local model management.
- Blender Add-on: Includes a Blender add-on to provide a user interface and handle communication with the server.
- PolyHaven Integration (Optional): Download and use assets (HDRIs, textures, models) from PolyHaven directly within Blender via AI prompts.
- Basic 3D Operations:
  - Get scene and object info
  - Create primitives
  - Modify and delete objects
  - Apply materials
- Render Support: Render images using the render_image tool and retrieve information about the output.
Installation
Prerequisites
- Blender: Blender 3.0 or later. Download from blender.org.
- Ollama: Install from ollama.com, following OS-specific instructions.
- Python: Python 3.10 or later.
- uv: Install with `pip install uv`.
- Git: Required for cloning the repository.
Installation Steps
1. Clone the Repository:
   `git clone https://github.com/dhakalnirajan/blender-open-mcp.git`
   `cd blender-open-mcp`
2. Create and Activate a Virtual Environment (Recommended):
   `uv venv`
   `source .venv/bin/activate` # On Linux/macOS
   `.venv\Scripts\activate` # On Windows
3. Install Dependencies:
   `uv pip install -e .`
4. Install the Blender Add-on:
   - Open Blender.
   - Go to Edit -> Preferences -> Add-ons.
   - Click Install...
   - Select the `addon.py` file from the `blender-open-mcp` directory.
   - Enable the "Blender MCP" add-on.
5. Download an Ollama Model (if not already installed):
   `ollama run llama3.2`
   (Other models, such as Gemma 3, can also be used.)
Setup
1. Start the Ollama Server: Ensure Ollama is running in the background.
2. Start the MCP Server:
   `blender-mcp`
   Or:
   `python src/blender_open_mcp/server.py`
   By default, it listens on `http://0.0.0.0:8000`, but you can modify the settings:
   `blender-mcp --host 127.0.0.1 --port 8001 --ollama-url http://localhost:11434 --ollama-model llama3.2`
   (A quick reachability check is sketched after these steps.)
3. Start the Blender Add-on Server:
   - Open Blender and the 3D Viewport.
   - Press `N` to open the sidebar.
   - Find the "Blender MCP" panel.
   - Click "Start MCP Server".
Usage
Interact with blender-open-mcp using the `mcp` command-line tool (a scripted equivalent of these commands is sketched after the examples):
Example Commands
- Basic Prompt:
  `mcp prompt "Hello BlenderMCP!" --host http://localhost:8000`
- Get Scene Information:
  `mcp tool get_scene_info --host http://localhost:8000`
- Create a Cube:
  `mcp prompt "Create a cube named 'my_cube'." --host http://localhost:8000`
- Render an Image:
  `mcp prompt "Render the image." --host http://localhost:8000`
- Using PolyHaven (if enabled):
  `mcp prompt "Download a texture from PolyHaven." --host http://localhost:8000`
Available Tools
| Tool Name | Description | Parameters |
|---|---|---|
| `get_scene_info` | Retrieves scene details. | None |
| `get_object_info` | Retrieves information about an object. | `object_name` (str) |
| `create_object` | Creates a 3D object. | `type`, `name`, `location`, `rotation`, `scale` |
| `modify_object` | Modifies an object's properties. | `name`, `location`, `rotation`, `scale`, `visible` |
| `delete_object` | Deletes an object. | `name` (str) |
| `set_material` | Assigns a material to an object. | `object_name`, `material_name`, `color` |
| `render_image` | Renders an image. | `file_path` (str) |
| `execute_blender_code` | Executes Python code in Blender (see the example below the table). | `code` (str) |
| `get_polyhaven_categories` | Lists PolyHaven asset categories. | `asset_type` (str) |
| `search_polyhaven_assets` | Searches PolyHaven assets. | `asset_type`, `categories` |
| `download_polyhaven_asset` | Downloads a PolyHaven asset. | `asset_id`, `asset_type`, `resolution`, `file_format` |
| `set_texture` | Applies a downloaded texture. | `object_name`, `texture_id` |
| `set_ollama_model` | Sets the Ollama model. | `model_name` (str) |
| `set_ollama_url` | Sets the Ollama server URL. | `url` (str) |
| `get_ollama_models` | Lists available Ollama models. | None |
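As an illustration of what the `execute_blender_code` tool can run, the snippet below is a minimal piece of Blender Python (`bpy`) that creates and names a cube and assigns it a simple material. It is an example payload for the tool, not code from the project itself.

```python
# Example snippet of Blender Python that could be passed to execute_blender_code.
# Creates a cube named "my_cube" and assigns it a basic red material.
import bpy

bpy.ops.mesh.primitive_cube_add(location=(0.0, 0.0, 1.0))
cube = bpy.context.active_object
cube.name = "my_cube"

material = bpy.data.materials.new(name="red_material")
material.diffuse_color = (1.0, 0.0, 0.0, 1.0)  # RGBA
cube.data.materials.append(material)
```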
Troubleshooting
If you encounter issues:
- Ensure Ollama and the blender-open-mcp server are running (a quick check script follows this list).
- Check Blender's add-on settings.
- Verify command-line arguments.
- Refer to logs for error details.
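To confirm that Ollama itself is up and that the model you configured is installed, you can query Ollama's local HTTP API. This is a small sketch using Ollama's standard `/api/tags` endpoint on the default port 11434; change the URL if your Ollama server runs elsewhere.

```python
# Quick check that Ollama is running and which models it has installed.
# Uses Ollama's standard /api/tags endpoint on the default port (11434).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of locally installed Ollama models."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as response:
        payload = json.load(response)
    return [model["name"] for model in payload.get("models", [])]

if __name__ == "__main__":
    try:
        print("Installed Ollama models:", list_ollama_models())
    except OSError as error:
        print("Ollama does not appear to be running:", error)
```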
For further assistance, visit the GitHub Issues page.
Happy Blending with AI! 🚀
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.