Yellhorn offers MCP tools to publish detailed workplans as GitHub issues with entire-codebase reasoning, and to review diffs against them.
What is msnidal yellhorn mcp?
Yellhorn MCP
A Model Context Protocol (MCP) server that exposes Gemini 2.5 Pro and OpenAI capabilities to Claude Code for software development tasks, using your entire codebase in the prompt. This pattern is highly useful for defining work to be done by coding assistants like Claude Code or other MCP-compatible coding agents, and for reviewing the results to ensure they meet the originally specified requirements.
Features
- Create Workplans: Creates detailed implementation plans based on a prompt, taking your entire codebase into consideration, posts them as GitHub issues, and exposes them as MCP resources for your coding agent
- Judge Code Diffs: Provides a tool to evaluate git diffs against the original workplan with full codebase context, giving detailed feedback that ensures the implementation does not deviate from the original requirements and offering guidance on what to change if it does
- Seamless GitHub Integration: Automatically creates labeled issues and posts judgement sub-issues that reference the original workplan issues
- Context Control: Use .yellhornignore files to exclude specific files and directories from the AI context, similar to .gitignore
- MCP Resources: Exposes workplans as standard MCP resources for easy listing and retrieval
Installation
# Install from PyPI
pip install yellhorn-mcp
# Install from source
git clone https://github.com/msnidal/yellhorn-mcp.git
cd yellhorn-mcp
pip install -e .
Configuration
The server requires the following environment variables:
- GEMINI_API_KEY: Your Gemini API key (required for Gemini models)
- OPENAI_API_KEY: Your OpenAI API key (required for OpenAI models)
- REPO_PATH: Path to your repository (defaults to current directory)
- YELLHORN_MCP_MODEL: Model to use (defaults to "gemini-2.5-pro-preview-03-25"). Available options:
  - Gemini models: "gemini-2.5-pro-preview-03-25", "gemini-2.5-flash-preview-04-17"
  - OpenAI models: "gpt-4o", "gpt-4o-mini", "o4-mini", "o3"
The server also requires the GitHub CLI (gh) to be installed and authenticated.
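For reference, a complete env block for an MCP client configuration (such as the .vscode/mcp.json and .mcp.json files shown below) might look like the following sketch. All values are placeholders: only the API key for your chosen model is needed, and YELLHORN_MCP_MODEL and REPO_PATH can be omitted to use their defaults.
{
  "env": {
    "GEMINI_API_KEY": "<your-gemini-api-key>",
    "OPENAI_API_KEY": "<your-openai-api-key>",
    "REPO_PATH": "/path/to/your/repository",
    "YELLHORN_MCP_MODEL": "gemini-2.5-pro-preview-03-25"
  }
}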
Usage
Getting Started
VSCode/Cursor Setup
To configure Yellhorn MCP in VSCode or Cursor, create a .vscode/mcp.json file at the root of your workspace with the following content:
{
  "inputs": [
    {
      "type": "promptString",
      "id": "gemini-api-key",
      "description": "Gemini API Key"
    }
  ],
  "servers": {
    "yellhorn-mcp": {
      "type": "stdio",
      "command": "/Users/msnidal/.pyenv/shims/yellhorn-mcp",
      "args": [],
      "env": {
        "GEMINI_API_KEY": "${input:gemini-api-key}",
        "REPO_PATH": "${workspaceFolder}"
      }
    }
  }
}
Claude Code Setup
To configure Yellhorn MCP with Claude Code directly, add a root-level .mcp.json file in your project with the following content:
{
  "mcpServers": {
    "yellhorn-mcp": {
      "type": "stdio",
      "command": "yellhorn-mcp",
      "args": ["--model", "o3"],
      "env": {}
    }
  }
}
Tools
create_workplan
Creates a GitHub issue with a detailed workplan based on the title and detailed description.
Input:
- title: Title for the GitHub issue (will be used as issue title and header)
- detailed_description: Detailed description for the workplan
- codebase_reasoning: (optional) Control whether AI enhancement is performed:
  - "full": (default) Use AI to enhance the workplan with full codebase context
  - "lsp": Use AI with lightweight codebase context (function/method signatures, class attributes and struct fields for Python and Go)
  - "none": Skip AI enhancement, use the provided description as-is
- debug: (optional) If set to true, adds a comment to the issue with the full prompt used for generation
Output:
- JSON string containing:
  - issue_url: URL to the created GitHub issue
  - issue_number: The GitHub issue number
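For illustration, a create_workplan call might look like the following sketch; the title, description, and issue values are hypothetical placeholders, and exact field types follow the tool schema.
{
  "title": "Add retry logic to the GitHub API client",
  "detailed_description": "Wrap GitHub CLI calls in retry logic with exponential backoff and add unit tests covering transient failures.",
  "codebase_reasoning": "full"
}
The tool would then return a JSON string along the lines of:
{
  "issue_url": "https://github.com/<owner>/<repo>/issues/123",
  "issue_number": 123
}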
get_workplan
Retrieves the workplan content (the GitHub issue body) for a given workplan issue.
Input:
- issue_number: The GitHub issue number for the workplan.
Output:
- The content of the workplan issue as a string
judge_workplan
Triggers an asynchronous code judgement comparing two git refs (branches or commits) against a workplan described in a GitHub issue. The judgement runs in the background and is posted as a GitHub sub-issue when complete.
Input:
- issue_number: The GitHub issue number for the workplan.
- base_ref: Base Git ref (commit SHA, branch name, tag) for comparison. Defaults to 'main'.
- head_ref: Head Git ref (commit SHA, branch name, tag) for comparison. Defaults to 'HEAD'.
- codebase_reasoning: (optional) Control which codebase context is provided:
  - "full": (default) Use full codebase context
  - "lsp": Use lighter codebase context (only function signatures for Python and Go, plus full diff files)
  - "none": Skip codebase context completely for fastest processing
- debug: (optional) If set to true, adds a comment to the sub-issue with the full prompt used for generation
Output:
- A confirmation message that the judgement task has been initiated
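As a sketch, a judge_workplan call that compares a feature branch against main for workplan issue 123 might look like the following; the issue number and branch name are hypothetical, and exact field types follow the tool schema. The tool replies immediately with a confirmation while the judgement is generated in the background.
{
  "issue_number": "123",
  "base_ref": "main",
  "head_ref": "feature/retry-logic",
  "codebase_reasoning": "full"
}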
Resource Access
Yellhorn MCP also implements the standard MCP resource API to provide access to workplans:
- list-resources: Lists all workplans (GitHub issues with the yellhorn-mcp label)
- get-resource: Retrieves the content of a specific workplan by issue number
These can be accessed via the standard MCP CLI commands:
# List all workplans
mcp list-resources yellhorn-mcp
# Get a specific workplan by issue number
mcp get-resource yellhorn-mcp 123
Development
# Install development dependencies
pip install -e ".[dev]"
# Run tests
pytest
# Run tests with coverage report
pytest --cov=yellhorn_mcp --cov-report term-missing
CI/CD
The project uses GitHub Actions for continuous integration and deployment:
- Testing: Runs automatically on pull requests and pushes to the main branch
  - Linting with flake8
  - Format checking with black
  - Testing with pytest
- Publishing: Automatically publishes to PyPI when a version tag is pushed
  - Tag must match the version in pyproject.toml (e.g., v0.2.2)
  - Requires a PyPI API token stored as a GitHub repository secret (PYPI_API_TOKEN)
To release a new version:
- Update version in pyproject.toml and yellhorn_mcp/__init__.py
- Update CHANGELOG.md with the new changes
- Commit changes: git commit -am "Bump version to X.Y.Z"
- Tag the commit: git tag vX.Y.Z
- Push changes and tag: git push && git push --tags
For a history of changes, see the Changelog.
For more detailed instructions, see the Usage Guide.
License
MIT
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.
Related MCP Servers
chrisdoc hevy mcp
sylphlab pdf reader mcp
An MCP server built with Node.js/TypeScript that allows AI agents to securely read PDF files (local or URL) and extract text, metadata, or page counts. Uses pdf-parse.
aashari mcp server atlassian bitbucket
Node.js/TypeScript MCP server for Atlassian Bitbucket. Enables AI systems (LLMs) to interact with workspaces, repositories, and pull requests via tools (list, get, comment, search). Connects AI directly to version control workflows through the standard MCP interface.
aashari mcp server atlassian confluence
Node.js/TypeScript MCP server for Atlassian Confluence. Provides tools enabling AI systems (LLMs) to list/get spaces & pages (content formatted as Markdown) and search via CQL. Connects AI seamlessly to Confluence knowledge bases using the standard MCP interface.
prisma prisma
Next-generation ORM for Node.js & TypeScript | PostgreSQL, MySQL, MariaDB, SQL Server, SQLite, MongoDB and CockroachDB
Zzzccs123 mcp sentry
mcp sentry for typescript sdk
zhuzhoulin dify mcp server
zhongmingyuan mcp my mac
zhixiaoqiang desktop image manager mcp
An MCP server for managing desktop images: viewing details, compressing, moving, and more (implemented entirely with Trae)
zhixiaoqiang antd components mcp
An MCP service for querying Ant Design components, built to reduce hallucinations when generating component code; includes system prompts, component documentation, API documentation, code examples, and changelog queries