mcp_code_executor
by bazinga012
What is mcp_code_executor
The MCP Code Executor is an MCP server that allows LLMs to execute Python code within a specified Conda environment. This enables LLMs to run code with access to libraries and dependencies defined in the Conda environment.
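Conceptually, the server automates what you could do by hand with conda run. A minimal sketch of the idea, assuming a Conda environment named your-conda-env with numpy installed (both the environment name and the library are placeholders):
# Run a one-off Python snippet inside the environment; numpy is just an example dependency
conda run -n your-conda-env python -c "import numpy; print(numpy.__version__)"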
Features
- Execute Python code from LLM prompts
- Run code within a specified Conda environment
- Configurable code storage directory
Prerequisites
- Node.js installed
- Conda installed
- Desired Conda environment created
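If you have not created the environment yet, one typical way to do so is shown below (the environment name, Python version, and libraries are examples, not requirements of this project):
# Create the environment the server will run code in
conda create -n your-conda-env python=3.11
# Install whatever libraries the generated code should be able to import, for example:
conda run -n your-conda-env pip install numpy pandas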
Setup
- Clone this repository:
git clone https://github.com/bazinga012/mcp_code_executor.git
- Navigate to the project directory:
cd mcp_code_executor
- Install the Node.js dependencies:
npm install
- Build the project:
npm run build
Configuration
To configure the MCP Code Executor server, add the following to your MCP servers configuration file:
{
  "mcpServers": {
    "mcp-code-executor": {
      "command": "node",
      "args": [
        "/path/to/mcp_code_executor/build/index.js"
      ],
      "env": {
        "CODE_STORAGE_DIR": "/path/to/code/storage",
        "CONDA_ENV_NAME": "your-conda-env"
      }
    }
  }
}
Replace the placeholders:
- /path/to/mcp_code_executor: the absolute path to where you cloned this repository
- /path/to/code/storage: the directory where you want the generated code to be stored
- your-conda-env: the name of the Conda environment you want the code to run in
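Before wiring this into your MCP client, you can optionally start the server by hand with the same command and environment variables to confirm it launches cleanly (paths below are illustrative; the server will simply wait for MCP messages on stdin, so exit with Ctrl+C):
CODE_STORAGE_DIR=/path/to/code/storage \
CONDA_ENV_NAME=your-conda-env \
node /path/to/mcp_code_executor/build/index.js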
Usage
Once configured, the MCP Code Executor allows LLMs to execute Python code by generating a file in the specified CODE_STORAGE_DIR and running it within the Conda environment defined by CONDA_ENV_NAME.
LLMs can generate and execute code by referencing this MCP server in their prompts.
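Conceptually, each execution boils down to writing the generated snippet into CODE_STORAGE_DIR and invoking it through Conda. A rough sketch of that flow, with an illustrative file name and environment name (the server's exact invocation may differ):
# Save the LLM-generated code as a file in the storage directory...
cat > /path/to/code/storage/generated_snippet.py <<'EOF'
print("hello from the Conda environment")
EOF
# ...then run it with the Python interpreter from the configured Conda environment
conda run -n your-conda-env python /path/to/code/storage/generated_snippet.py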
Contributing
Contributions are welcome! Please open an issue or submit a pull request.
License
This project is licensed under the MIT License.
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.
Related MCP Servers
Ableton Live MCP Server
MCP Server implementation for Ableton Live OSC control
Airbnb MCP Server
AI Agent Marketplace Index Search MCP Server
MCP Server for AI Agent Marketplace Index from DeepNLP
Algorand MCP Implementation
Algorand Model Context Protocol (Server & Client)
mcp-server-apache-airflow
pypi.org/project/mcp-server-apache-airflow/
airtable-mcp-server
Airtable Model Context Protocol Server, for allowing AI systems to interact with your Airtable bases
Airtable MCP Server
Search, create and update Airtable bases, tables, fields, and records using Claude Desktop and MCP (Model Context Protocol) clients
Alphavantage MCP Server
An MCP server for the Alphavantage stock market data API.
Amadeus MCP Server
Amadeus MCP (Model Context Protocol) Server
Anki MCP Server
An MCP server for Anki