Logfire MCP Server
by pydantic
The Logfire MCP Server is here!
What is the Logfire MCP Server?
This repository contains a Model Context Protocol (MCP) server with tools that can access the OpenTelemetry traces and metrics you've sent to Logfire.
This MCP server enables LLMs to retrieve your application's telemetry data, analyze distributed traces, and work with the results of arbitrary SQL queries executed via the Logfire APIs.
Available Tools
- find_exceptions - Get exception counts from traces grouped by file
  - Required arguments:
    - age (int): Number of minutes to look back (e.g., 30 for last 30 minutes, max 7 days)
- find_exceptions_in_file - Get detailed trace information about exceptions in a specific file
  - Required arguments:
    - filepath (string): Path to the file to analyze
    - age (int): Number of minutes to look back (max 7 days)
- arbitrary_query - Run custom SQL queries on your OpenTelemetry traces and metrics
  - Required arguments:
    - query (string): SQL query to execute
    - age (int): Number of minutes to look back (max 7 days)
- get_logfire_records_schema - Get the OpenTelemetry schema to help with custom queries
  - No required arguments
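Tool calls use the standard MCP request shape shown in the Example Interactions section below. As an illustrative sketch, a call to get_logfire_records_schema takes no arguments (the response contents depend on your project's schema):

{
  "name": "get_logfire_records_schema",
  "arguments": {}
}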
Setup
Install uv
The first thing to do is make sure uv is installed, as uv is used to run the MCP server.
For installation instructions, see the uv installation docs.
If you already have an older version of uv installed, you might need to update it with uv self update.
Obtain a Logfire read token
In order to make requests to the Logfire APIs, the Logfire MCP server requires a "read token".
You can create one under the "Read Tokens" section of your project settings in Logfire: https://logfire.pydantic.dev/-/redirect/latest-project/settings/read-tokens
[!IMPORTANT] Logfire read tokens are project-specific, so you need to create one for the specific project you want to expose to the Logfire MCP server.
Manually run the server
Once you have uv installed and have a Logfire read token, you can manually run the MCP server using uvx (which is provided by uv).
You can specify your read token using the LOGFIRE_READ_TOKEN environment variable:
LOGFIRE_READ_TOKEN=YOUR_READ_TOKEN uvx logfire-mcp
or using the --read-token flag:
uvx logfire-mcp --read-token=YOUR_READ_TOKEN
[!NOTE]
If you are using Cursor, Claude Desktop, Cline, or other MCP clients that manage your MCP servers for you, you do NOT need to manually run the server yourself. The next section will show you how to configure these clients to make use of the Logfire MCP server.
Configuration with well-known MCP clients
Configure for Cursor
Create a .cursor/mcp.json file in your project root:
{
"mcpServers": {
"logfire": {
"command": "uvx",
"args": ["logfire-mcp", "--read-token=YOUR-TOKEN"]
}
}
}
Cursor doesn't accept the env field, so you need to use the --read-token flag instead.
Configure for Claude Desktop
Add to your Claude settings:
{
"command": ["uvx"],
"args": ["logfire-mcp"],
"type": "stdio",
"env": {
"LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
}
}
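If your Claude Desktop setup manages servers through a claude_desktop_config.json file with an mcpServers key, the equivalent entry mirrors the Cursor and Cline examples. A sketch assuming that layout (the "logfire" server name is just a label):

{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp"],
      "env": {
        "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}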
Configure for Cline
Add to your Cline settings in cline_mcp_settings.json:
{
"mcpServers": {
"logfire": {
"command": "uvx",
"args": ["logfire-mcp"],
"env": {
"LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
},
"disabled": false,
"autoApprove": []
}
}
}
Customization - Base URL
By default, the server connects to the Logfire API at https://logfire-api.pydantic.dev. You can override this by:
- Using the --base-url argument:
uvx logfire-mcp --base-url=https://your-logfire-instance.com
- Setting the environment variable:
LOGFIRE_BASE_URL=https://your-logfire-instance.com uvx logfire-mcp
Example Interactions
- Find all exceptions in traces from the last hour:
{
"name": "find_exceptions",
"arguments": {
"age": 60
}
}
Response:
[
{
"filepath": "app/api.py",
"count": 12
},
{
"filepath": "app/models.py",
"count": 5
}
]
- Get details about exceptions from traces in a specific file:
{
"name": "find_exceptions_in_file",
"arguments": {
"filepath": "app/api.py",
"age": 1440
}
}
Response:
[
{
"created_at": "2024-03-20T10:30:00Z",
"message": "Failed to process request",
"exception_type": "ValueError",
"exception_message": "Invalid input format",
"function_name": "process_request",
"line_number": "42",
"attributes": {
"service.name": "api-service",
"code.filepath": "app/api.py"
},
"trace_id": "1234567890abcdef"
}
]
- Run a custom query on traces:
{
"name": "arbitrary_query",
"arguments": {
"query": "SELECT trace_id, message, created_at, attributes->>'service.name' as service FROM records WHERE severity_text = 'ERROR' ORDER BY created_at DESC LIMIT 10",
"age": 1440
}
}
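The arbitrary_query tool also works for aggregate questions, such as counting errors per service. A sketch assuming the same records columns used in the query above (severity_text, created_at, and the attributes field); verify the actual column names with get_logfire_records_schema first:

{
  "name": "arbitrary_query",
  "arguments": {
    "query": "SELECT attributes->>'service.name' AS service, COUNT(*) AS error_count FROM records WHERE severity_text = 'ERROR' GROUP BY attributes->>'service.name' ORDER BY error_count DESC",
    "age": 1440
  }
}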
Examples of Questions for Claude
- "What exceptions occurred in traces from the last hour across all services?"
- "Show me the recent errors in the file 'app/api.py' with their trace context"
- "How many errors were there in the last 24 hours per service?"
- "What are the most common exception types in my traces, grouped by service name?"
- "Get me the OpenTelemetry schema for traces and metrics"
- "Find all errors from yesterday and show their trace contexts"
Getting Started
- First, obtain a Logfire read token from: https://logfire.pydantic.dev/-/redirect/latest-project/settings/read-tokens
- Run the MCP server:
  uvx logfire-mcp --read-token=YOUR_TOKEN
- Configure your preferred client (Cursor, Claude Desktop, or Cline) using the configuration examples above
- Start using the MCP server to analyze your OpenTelemetry traces and metrics!
Contributing
We welcome contributions to help improve the Logfire MCP server. Whether you want to add new trace analysis tools, enhance metrics querying functionality, or improve documentation, your input is valuable.
For examples of other MCP servers and implementation patterns, see the Model Context Protocol servers repository.
License
Logfire MCP is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License.
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.