Markdown Sidecar MCP Server

by speakeasy-api

This provides a structured way to serve and access markdown documentation from an MCP server for NPM packages, Go Modules, or PyPI packages. It enables informed code generation by exposing these markdown files as resources or tools.

Note: many PyPI packages do not expose markdown docs, so this library will also mount the Python help root docs by default.

This is designed to be executed from within a project directory where the requested packages are already installed locally. Access always stays within your local environment's working directory.
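
Because docs are read from locally installed packages, the package you want to query must already be present in the project before the server starts. A minimal sketch for an NPM project (the package name here is only an illustration):

# Illustration only: from the project root, install the package whose docs the sidecar should serve
cd my-project
npm install express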

Installation

npx -y markdown-sidecar-mcp

Arguments

  • workingDir: The working directory of your repo.
  • packageName: The name of the package or module to request.
  • registry: The registry the package will be found in (npm, gomodules, or pypi).
  • docsSubDir: [OPTIONAL] The specific subdirectory to look for markdown docs in. Defaults to the package root.
  • mcpPrimitive: [OPTIONAL] The MCP primitive to expose from the server (tool or resource). Defaults to tool, since some clients do not currently support resources.
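
Putting these arguments together, a direct command-line invocation might look like the following sketch, which mirrors the Cursor configuration in the next section (the repo path and package name are placeholders):

# Sketch: start the sidecar against a locally installed NPM package
npx -y --package markdown-sidecar-mcp -- \
  mcp start \
  --workingDir /path/to/your/repo \
  --packageName express \
  --registry npm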

Cursor Installation Steps

Add the following server definition to your .cursor/mcp.json file:

{
  "mcpServers": {
    "sidecar": {
      "command": "npx",
      "args": [
        "-y", "--package", "markdown-sidecar-mcp",
        "--",
        "mcp", "start",
        "--workingDir", "{REPO_WORKING_DIR}",
        "--packageName", "{PACKAGE_NAME}",
        "--registry", "npm"
      ]
    }
  }
}
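
Replace {REPO_WORKING_DIR} and {PACKAGE_NAME} with values for your project. The same shape works for the other registries and the optional arguments; the sketch below targets a PyPI package, assuming the optional arguments map to --docsSubDir and --mcpPrimitive flags in the same way the required ones do (the package name and docs subdirectory are placeholders):

{
  "mcpServers": {
    "sidecar-py": {
      "command": "npx",
      "args": [
        "-y", "--package", "markdown-sidecar-mcp",
        "--",
        "mcp", "start",
        "--workingDir", "{REPO_WORKING_DIR}",
        "--packageName", "requests",
        "--registry", "pypi",
        "--docsSubDir", "docs",
        "--mcpPrimitive", "tool"
      ]
    }
  }
}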

Development

# Install dependencies
npm i

# Build
npm run build

# Run with Bun
npm run build:mcp

Contributing

  1. Fork the repository
  2. Create your feature branch
  3. Commit your changes and push them up
  4. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Frequently Asked Questions

What is MCP?

MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.

What are MCP Servers?

MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.

How do MCP Servers work?

MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.

Are MCP Servers secure?

Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.
