Pprof Analyzer MCP Server
This is a Model Context Protocol (MCP) server implemented in Go, providing a tool to analyze Go pprof performance profiles.
Features
analyze_pprof Tool:
- Analyzes the specified Go pprof file and returns serialized analysis results (e.g., a Top N list or flame graph JSON).
- Supported Profile Types:
  - cpu: Analyzes CPU time consumption during code execution to find hot spots.
  - heap: Analyzes current memory usage (heap allocations) to find objects and functions with high memory consumption. Enhanced with object count, allocation site, and type information.
  - goroutine: Displays stack traces of all current goroutines, used for diagnosing deadlocks, leaks, or excessive goroutine usage.
  - allocs: Analyzes memory allocations (including freed ones) during program execution to locate code with frequent allocations. Provides detailed allocation site and object count information.
  - mutex: Analyzes contention on mutexes to find locks causing blocking. (Not yet implemented)
  - block: Analyzes operations causing goroutine blocking (e.g., channel waits, system calls). (Not yet implemented)
- Supported Output Formats: text, markdown, json (Top N list), flamegraph-json (hierarchical flame graph data, default).
  - text, markdown: Human-readable text or Markdown format.
  - json: Outputs Top N results in structured JSON format (implemented for cpu, heap, goroutine, allocs).
  - flamegraph-json: Outputs hierarchical flame graph data in JSON format, compatible with d3-flame-graph (implemented for cpu, heap, allocs; default format). Output is compact.
- Configurable number of Top N results (top_n, defaults to 5, effective for text, markdown, and json formats).
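For reference, the profiles consumed by analyze_pprof are the standard files produced by Go's runtime/pprof package. The following is a minimal, illustrative sketch (not part of this server; the file name cpu.pprof and the workload are placeholders) of writing a CPU profile that can then be analyzed via a file:// URI:

package main

import (
	"log"
	"os"
	"runtime/pprof"
)

func main() {
	// Create the output file; the name is arbitrary.
	f, err := os.Create("cpu.pprof")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Profile CPU usage for the duration of the workload.
	if err := pprof.StartCPUProfile(f); err != nil {
		log.Fatal(err)
	}
	defer pprof.StopCPUProfile()

	work() // replace with the code you actually want to profile
}

func work() {
	sum := 0
	for i := 0; i < 100_000_000; i++ {
		sum += i
	}
	_ = sum
}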
generate_flamegraph Tool:
- Uses go tool pprof to generate a flame graph (SVG format) for the specified pprof file, saves it to the specified path, and returns the path and SVG content.
- Supported Profile Types: cpu, heap, allocs, goroutine, mutex, block.
- Requires the user to specify the output SVG file path.
- Important: This feature depends on Graphviz being installed.
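Note that mutex and block profiles only contain data if the target program has enabled them. Below is a minimal, illustrative sketch (standard library only; file names and sampling rates are placeholders) of enabling and writing these profiles so they can be fed to generate_flamegraph:

package main

import (
	"log"
	"os"
	"runtime"
	"runtime/pprof"
)

func main() {
	// Record every mutex contention and blocking event.
	// Real programs usually choose coarser sampling rates.
	runtime.SetMutexProfileFraction(1)
	runtime.SetBlockProfileRate(1)

	// ... run the workload you want to observe ...

	writeProfile("mutex", "mutex.pprof")
	writeProfile("block", "block.pprof")
}

// writeProfile dumps the named runtime profile to the given path.
func writeProfile(name, path string) {
	f, err := os.Create(path)
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	if err := pprof.Lookup(name).WriteTo(f, 0); err != nil {
		log.Fatal(err)
	}
}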
open_interactive_pprof Tool (macOS Only):
- Attempts to launch the go tool pprof interactive web UI in the background for the specified pprof file. Uses port :8081 by default if http_address is not provided.
- Returns the Process ID (PID) of the background pprof process upon successful launch.
- macOS Only: This tool will only work on macOS.
- Dependencies: Requires the go command to be available in the system's PATH.
- Limitations: Errors from the background pprof process are not captured by the server. Temporary files downloaded from remote URLs are not automatically cleaned up until the process is terminated (either manually via disconnect_pprof_session or when the MCP server exits).
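The server's exact implementation is not reproduced here, but the behaviour above corresponds roughly to this hedged sketch: start go tool pprof with its -http flag in the background and return the child PID (the helper name launchPprofUI and the hard-coded arguments are purely illustrative):

package main

import (
	"fmt"
	"log"
	"os/exec"
)

// launchPprofUI starts `go tool pprof -http=<addr> <profile>` without
// waiting for it to exit and returns the PID of the background process.
func launchPprofUI(addr, profilePath string) (int, error) {
	cmd := exec.Command("go", "tool", "pprof", "-http="+addr, profilePath)
	if err := cmd.Start(); err != nil {
		return 0, fmt.Errorf("starting pprof: %w", err)
	}
	return cmd.Process.Pid, nil
}

func main() {
	pid, err := launchPprofUI(":8081", "cpu.pprof")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pprof web UI running in background, PID:", pid)
}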
detect_memory_leaks Tool:
- Compares two heap profile snapshots to identify potential memory leaks.
- Analyzes memory growth by object type and allocation site.
- Provides detailed statistics on memory growth, including absolute and percentage changes.
- Configurable growth threshold and result limit.
- Helps identify memory leaks by comparing profiles taken at different points in time.
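To produce two comparable snapshots, the target program can write heap profiles at two points in time using the standard runtime/pprof package. A minimal sketch (the file names simply match the usage example further below) looks like this:

package main

import (
	"log"
	"os"
	"runtime"
	"runtime/pprof"
)

// writeHeapSnapshot forces a GC so the profile reflects live objects,
// then writes the heap profile to the given path.
func writeHeapSnapshot(path string) {
	runtime.GC()
	f, err := os.Create(path)
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	if err := pprof.WriteHeapProfile(f); err != nil {
		log.Fatal(err)
	}
}

func main() {
	writeHeapSnapshot("heap_before.pprof")

	// ... exercise the code path suspected of leaking ...

	writeHeapSnapshot("heap_after.pprof")
}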
disconnect_pprof_session Tool:
- Attempts to terminate a background pprof process previously started by open_interactive_pprof, using its PID.
- Sends an Interrupt signal first, then a Kill signal if Interrupt fails.
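The interrupt-then-kill escalation can be pictured with this small, illustrative sketch using only the os package (it is not the server's actual code; the PID and the grace period are placeholders):

package main

import (
	"log"
	"os"
	"time"
)

// stopProcess sends an Interrupt first and falls back to Kill
// only if delivering the Interrupt signal fails.
func stopProcess(pid int) error {
	proc, err := os.FindProcess(pid)
	if err != nil {
		return err
	}
	if err := proc.Signal(os.Interrupt); err == nil {
		// Give the process a moment to shut down cleanly.
		time.Sleep(500 * time.Millisecond)
		return nil
	}
	return proc.Kill()
}

func main() {
	// In practice the PID comes from open_interactive_pprof.
	if err := stopProcess(12345); err != nil {
		log.Fatal(err)
	}
}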
Installation (As a Library/Tool)
You can install this package directly using go install:
go install github.com/ZephyrDeng/pprof-analyzer-mcp@latest
This will install the pprof-analyzer-mcp executable to your $GOPATH/bin or $HOME/go/bin directory. Ensure this directory is in your system's PATH to run the command directly.
Building from Source
Ensure you have a Go environment installed (Go 1.18 or higher recommended).
In the project root directory (pprof-analyzer-mcp), run:
go build
This will generate an executable file named pprof-analyzer-mcp (or pprof-analyzer-mcp.exe on Windows) in the current directory.
Using go install (Recommended)
You can also use go install to install the executable into your $GOPATH/bin or $HOME/go/bin directory. This allows you to run pprof-analyzer-mcp directly from the command line (if the directory is added to your system's PATH environment variable).
# Installs the executable using the module path defined in go.mod
go install .
# Or directly using the GitHub path (recommended after publishing)
# go install github.com/ZephyrDeng/pprof-analyzer-mcp@latest
Running with Docker
Using Docker is a convenient way to run the server, as it bundles the necessary Graphviz dependency.
- Build the Docker Image: In the project root directory (where the Dockerfile is located), run:
docker build -t pprof-analyzer-mcp .
- Run the Docker Container:
docker run -i --rm pprof-analyzer-mcp
  - The -i flag keeps STDIN open, which is required for the stdio transport used by this MCP server.
  - The --rm flag automatically removes the container when it exits.
- Configure MCP Client for Docker: To connect your MCP client (like Roo Cline) to the server running inside Docker, update your .roo/mcp.json:
{
  "mcpServers": {
    "pprof-analyzer-docker": {
      "command": "docker run -i --rm pprof-analyzer-mcp"
    }
  }
}
Make sure the pprof-analyzer-mcp image has been built locally before the client tries to run this command.
Releasing (Automated via GitHub Actions)
This project uses GoReleaser and GitHub Actions to automate the release process. Releases are triggered automatically when a Git tag matching the pattern v* (e.g., v0.1.0, v1.2.3) is pushed to the repository.
Release Steps:
- Make Changes: Develop new features or fix bugs.
- Commit Changes: Commit your changes using the Conventional Commits format (e.g., feat: ..., fix: ...). This is important for automatic changelog generation.
git add .
git commit -m "feat: Add awesome new feature"
# or
git commit -m "fix: Resolve issue #42"
- Push Changes: Push your commits to the main branch on GitHub.
git push origin main
- Create and Push Tag: When ready to release, create a new Git tag and push it to GitHub.
# Example: Create tag v0.1.0
git tag v0.1.0
# Push the tag to GitHub
git push origin v0.1.0
- Automatic Release: Pushing the tag will trigger the GoReleaser GitHub Action defined in .github/workflows/release.yml. This action will:
  - Build binaries for Linux, macOS, and Windows (amd64 & arm64).
  - Generate a changelog based on Conventional Commits since the last tag.
  - Create a new GitHub Release with the changelog and attach the built binaries and checksums as assets.
You can view the release workflow progress in the "Actions" tab of the GitHub repository.
Configuring the MCP Client
This server uses the stdio transport protocol. You need to configure it in your MCP client (e.g., the Roo Cline extension for VS Code).
Typically, this involves adding the following configuration to the .roo/mcp.json file in your project root:
{
"mcpServers": {
"pprof-analyzer": {
"command": "pprof-analyzer-mcp"
}
}
}
Note: Adjust the command value based on your build method (go build or go install) and the actual location of the executable. Ensure the MCP client can find and execute this command.
After configuration, reload or restart your MCP client, and it should automatically connect to the PprofAnalyzer server.
Dependencies
- Graphviz: The generate_flamegraph tool requires Graphviz to generate SVG flame graphs (the go tool pprof command calls dot when generating SVG). Ensure Graphviz is installed on your system and the dot command is available in your system's PATH environment variable.
Installing Graphviz:
- macOS (using Homebrew):
brew install graphviz
- Debian/Ubuntu:
sudo apt-get update && sudo apt-get install graphviz
- CentOS/Fedora:
sudo yum install graphviz # or sudo dnf install graphviz
- Windows (using Chocolatey):
choco install graphviz
- Other Systems: Refer to the Graphviz official download page.
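If you want to confirm from Go code that dot is actually reachable before generating SVGs, a quick check looks like this illustrative snippet (exec.LookPath searches the directories listed in PATH):

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	path, err := exec.LookPath("dot")
	if err != nil {
		fmt.Println("Graphviz 'dot' not found in PATH:", err)
		return
	}
	fmt.Println("Graphviz 'dot' found at", path)
}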
Usage Examples (via MCP Client)
Once the server is connected, you can call the analyze_pprof and generate_flamegraph tools using file://, http://, or https:// URIs for the profile file.
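For the http:// and https:// cases, a common source of such URIs is a service that exposes the standard net/http/pprof endpoints. A minimal sketch (the address localhost:6060 is just an example) is shown below; with it running, URLs like http://localhost:6060/debug/pprof/heap or http://localhost:6060/debug/pprof/profile?seconds=10 can be used as profile_uri values:

package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on the default mux
)

func main() {
	// Serve the pprof endpoints; analyze_pprof can then fetch profiles
	// directly from these URLs.
	log.Fatal(http.ListenAndServe("localhost:6060", nil))
}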
Example: Analyze CPU Profile (Text format, Top 5)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "file:///path/to/your/cpu.pprof",
"profile_type": "cpu"
}
}
Example: Analyze Heap Profile (Markdown format, Top 10)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "file:///path/to/your/heap.pprof",
"profile_type": "heap",
"top_n": 10,
"output_format": "markdown"
}
}
Example: Analyze Goroutine Profile (Text format, Top 5)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "file:///path/to/your/goroutine.pprof",
"profile_type": "goroutine"
}
}
Example: Generate Flame Graph for CPU Profile
{
"tool_name": "generate_flamegraph",
"arguments": {
"profile_uri": "file:///path/to/your/cpu.pprof",
"profile_type": "cpu",
"output_svg_path": "/path/to/save/cpu_flamegraph.svg"
}
}
Example: Generate Flame Graph for Heap Profile (inuse_space)
{
"tool_name": "generate_flamegraph",
"arguments": {
"profile_uri": "file:///path/to/your/heap.pprof",
"profile_type": "heap",
"output_svg_path": "/path/to/save/heap_flamegraph.svg"
}
}
Example: Analyze CPU Profile (JSON format, Top 3)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "file:///path/to/your/cpu.pprof",
"profile_type": "cpu",
"top_n": 3,
"output_format": "json"
}
}
Example: Analyze CPU Profile (Default Flame Graph JSON format)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "file:///path/to/your/cpu.pprof",
"profile_type": "cpu"
// output_format defaults to "flamegraph-json"
}
}
Example: Analyze Heap Profile (Explicitly Flame Graph JSON format)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "file:///path/to/your/heap.pprof",
"profile_type": "heap",
"output_format": "flamegraph-json"
}
}
Example: Analyze Remote CPU Profile (from HTTP URL)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "https://example.com/profiles/cpu.pprof",
"profile_type": "cpu"
}
}
Example: Analyze Online CPU Profile (from GitHub Raw URL)
{
"tool_name": "analyze_pprof",
"arguments": {
"profile_uri": "https://raw.githubusercontent.com/google/pprof/refs/heads/main/profile/testdata/gobench.cpu",
"profile_type": "cpu",
"top_n": 5
}
}
Example: Generate Flame Graph for Online Heap Profile (from GitHub Raw URL)
{
"tool_name": "generate_flamegraph",
"arguments": {
"profile_uri": "https://raw.githubusercontent.com/google/pprof/refs/heads/main/profile/testdata/gobench.heap",
"profile_type": "heap",
"output_svg_path": "./online_heap_flamegraph.svg"
}
}
Example: Open Interactive Pprof UI for Online CPU Profile (macOS Only)
{
"tool_name": "open_interactive_pprof",
"arguments": {
"profile_uri": "https://raw.githubusercontent.com/google/pprof/refs/heads/main/profile/testdata/gobench.cpu"
// Optional: "http_address": ":8082" // Example of overriding the default port
}
}
Example: Detect Memory Leaks Between Two Heap Profiles
{
"tool_name": "detect_memory_leaks",
"arguments": {
"old_profile_uri": "file:///path/to/your/heap_before.pprof",
"new_profile_uri": "file:///path/to/your/heap_after.pprof",
"threshold": 0.05, // 5% growth threshold
"limit": 15 // Show top 15 potential leaks
}
}
Example: Disconnect a Pprof Session
{
"tool_name": "disconnect_pprof_session",
"arguments": {
"pid": 12345 // Replace 12345 with the actual PID returned by open_interactive_pprof
}
}
Future Improvements (TODO)
- Implement full analysis logic for mutex, block profiles.
- Implement json output format for mutex, block profile types.
- Set appropriate MIME types in MCP results based on output_format.
- Add more robust error handling and logging level control.
- (Done) Consider supporting remote pprof file URIs (e.g., http://, https://).
- (Done) Implement full analysis logic for allocs profiles.
- (Done) Implement json output format for allocs profile type.
- (Done) Add memory leak detection capabilities.
- Add time-series analysis for memory profiles to track growth over multiple snapshots.
- Implement differential flame graphs to visualize changes between profiles.
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.