MCP Crew AI Server
MCP Crew AI Server is a lightweight Python-based server designed to run, manage and create CrewAI workflows. This project leverages the Model Context Protocol (MCP) to communicate with Large Language Models (LLMs) and tools such as Claude Desktop or Cursor IDE, allowing you to orchestrate multi-agent workflows with ease.
Features
- Automatic Configuration: Automatically loads agent and task configurations from two YAML files (agents.yml and tasks.yml), so you don't need to write custom code for basic setups.
- Command Line Flexibility: Pass custom paths to your configuration files via command line arguments (--agents and --tasks).
- Seamless Workflow Execution: Easily run pre-configured workflows through the MCP run_workflow tool (see the client sketch after this list).
- Local Development: Run the server locally in STDIO mode, making it ideal for development and testing.
Installation
There are several ways to install the MCP Crew AI server:
Option 1: Install from PyPI (Recommended)
pip install mcp-crew-ai
Option 2: Install from GitHub
pip install git+https://github.com/adam-paterson/mcp-crew-ai.git
Option 3: Clone and Install
git clone https://github.com/adam-paterson/mcp-crew-ai.git
cd mcp-crew-ai
pip install -e .
Requirements
- Python 3.11+
- MCP SDK
- CrewAI
- PyYAML
Configuration
- agents.yml: Define your agents with roles, goals, and backstories.
- tasks.yml: Define tasks with descriptions, expected outputs, and assign them to agents.
Example agents.yml:

zookeeper:
  role: Zookeeper
  goal: Manage zoo operations
  backstory: >
    You are a seasoned zookeeper with a passion for wildlife conservation...
Example tasks.yml:

write_stories:
  description: >
    Write an engaging zoo update capturing the day's highlights.
  expected_output: 5 engaging stories
  agent: zookeeper
  output_file: zoo_report.md
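For context, the two files above map directly onto CrewAI's own building blocks. The following is a rough sketch of that mapping, assuming CrewAI's standard Agent/Task/Crew API; the server's actual loading code may differ in detail.

import yaml
from crewai import Agent, Task, Crew, Process

# Load the two configuration files (paths are placeholders).
with open("agents.yml") as f:
    agent_cfg = yaml.safe_load(f)
with open("tasks.yml") as f:
    task_cfg = yaml.safe_load(f)

# One Agent per entry in agents.yml.
agents = {
    name: Agent(role=cfg["role"], goal=cfg["goal"], backstory=cfg["backstory"])
    for name, cfg in agent_cfg.items()
}

# One Task per entry in tasks.yml, assigned to the named agent.
tasks = [
    Task(
        description=cfg["description"],
        expected_output=cfg["expected_output"],
        agent=agents[cfg["agent"]],
        output_file=cfg.get("output_file"),
    )
    for cfg in task_cfg.values()
]

crew = Crew(agents=list(agents.values()), tasks=tasks, process=Process.sequential)
print(crew.kickoff())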
Usage
Once installed, you can run the MCP Crew AI server using either of these methods:
Standard Python Command
mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml
Using UV Execution (uvx)
For a more streamlined experience, you can use the UV execution command:
uvx mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml
Or run just the server directly:
uvx mcp-crew-ai-server
This will start the server using default configuration from environment variables.
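Because the server communicates over STDIO, it can also be registered with an MCP client such as Claude Desktop. The entry below is only an example of what a claude_desktop_config.json entry might look like; the server key name is arbitrary and the file paths are placeholders.

{
  "mcpServers": {
    "mcp-crew-ai": {
      "command": "uvx",
      "args": [
        "mcp-crew-ai",
        "--agents", "/absolute/path/to/agents.yml",
        "--tasks", "/absolute/path/to/tasks.yml"
      ]
    }
  }
}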
Command Line Options
- --agents: Path to the agents YAML file (required)
- --tasks: Path to the tasks YAML file (required)
- --topic: The main topic for the crew to work on (default: "Artificial Intelligence")
- --process: Process type to use (choices: "sequential" or "hierarchical", default: "sequential")
- --verbose: Enable verbose output
- --variables: JSON string or path to a JSON file with additional variables to replace in the YAML files
- --version: Show version information and exit
Advanced Usage
You can also provide additional variables to be used in your YAML templates:
mcp-crew-ai --agents examples/agents.yml --tasks examples/tasks.yml --topic "Machine Learning" --variables '{"year": 2025, "focus": "deep learning"}'
These variables will replace placeholders in your YAML files. For example, {topic} will be replaced with "Machine Learning" and {year} with "2025".
Contributing
Contributions are welcome! Please open issues or submit pull requests with improvements, bug fixes, or new features.
Licence
This project is licensed under the MIT Licence. See the LICENSE file for details.
Happy workflow orchestration!
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.