Inbox Zero MCP Server
by elie222
What is Inbox Zero MCP Server
About
There are two parts to Inbox Zero:
- An AI email assistant that helps you spend less time on email.
- An open source AI email client.
If you're looking to contribute to the project, the email client is the best place to do this.
Thanks to Vercel for sponsoring Inbox Zero in support of open-source software.
Features
- AI Personal Assistant: Manages your email for you based on a plain text prompt file. It can take any action a human assistant can take on your behalf (Draft reply, Label, Archive, Reply, Forward, Mark Spam, and even call a webhook).
- Reply Zero: Track emails that need your reply and those awaiting responses.
- Smart Categories: Categorize everyone who's ever emailed you.
- Bulk Unsubscriber: Quickly unsubscribe from emails you never read in one click.
- Cold Email Blocker: Automatically block cold emails.
- Email Analytics: Track your email activity with daily, weekly, and monthly stats.
Learn more in our docs.
Feature Screenshots
Screenshots (images omitted): AI Assistant, Reply Zero, Gmail client, Bulk Unsubscriber.
Feature Requests
To request a feature, open a GitHub issue. If you don't have a GitHub account, you can request features here, or join our Discord.
Getting Started for Developers
We offer a hosted version of Inbox Zero at https://getinboxzero.com. To self-host follow the steps below.
Contributing to the project
You can view open tasks in our GitHub Issues. Join our Discord to discuss tasks and check what's being worked on.
ARCHITECTURE.md explains the architecture of the project (LLM generated).
Requirements
- Node.js >= 18.0.0
- pnpm >= 8.6.12
- Docker Desktop (optional)
Setup
Here's a video on how to set up the project. It covers the same steps mentioned in this document but goes into greater detail on setting up the external services.
The external services that are required are:
- Google OAuth
- Google PubSub - see setup instructions below
You also need an LLM, but you can use a local one too (see Supported LLMs below).
We use Postgres for the database. For Redis, you can use Upstash Redis or set up your own Redis instance.
You can run Postgres & Redis locally using docker-compose
docker-compose up -d # -d will run the services in the background
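The repo ships its own docker-compose.yml, so use that. As a rough sketch of what such a file provides (the image tags, credentials, and ports below are placeholder assumptions, not the project's actual values):

```yaml
# Hypothetical sketch -- see the repo's docker-compose.yml for the real definitions.
services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
      POSTGRES_DB: inboxzero
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```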
Create your own .env file:
cp apps/web/.env.example apps/web/.env
cd apps/web
pnpm install
Set the environment variables in the newly created .env. You can see a list of required variables in apps/web/env.ts.
The required environment variables:
- NEXTAUTH_SECRET -- can be any random string (try using openssl rand -hex 32 for a quick secure random string)
- GOOGLE_CLIENT_ID -- Google OAuth client ID. More info here
- GOOGLE_CLIENT_SECRET -- Google OAuth client secret. More info here
- GOOGLE_ENCRYPT_SECRET -- secret key for encrypting OAuth tokens (try using openssl rand -hex 32 for a secure key)
- GOOGLE_ENCRYPT_SALT -- salt for encrypting OAuth tokens (try using openssl rand -hex 16 for a secure salt)
- UPSTASH_REDIS_URL -- Redis URL from Upstash (can be empty if you are using Docker Compose)
- UPSTASH_REDIS_TOKEN -- Redis token from Upstash (or specify your own random string if you are using Docker Compose)
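Putting it together, a filled-in .env might look like this (every value below is a placeholder):

```sh
# All values here are illustrative placeholders -- generate your own.
NEXTAUTH_SECRET=3f9c61aa0b...
GOOGLE_CLIENT_ID=1234567890-abc123.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=GOCSPX-your-client-secret
GOOGLE_ENCRYPT_SECRET=8a1b42cd9e...
GOOGLE_ENCRYPT_SALT=5c2d7e...
UPSTASH_REDIS_URL=https://your-instance.upstash.io
UPSTASH_REDIS_TOKEN=your-upstash-token
```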
When using Vercel with Fluid Compute turned off, you should set MAX_DURATION=300 or lower. See Vercel limits for different plans here.
To run the migrations:
pnpm prisma migrate dev
To run the app locally for development (slower):
pnpm run dev
Or from the project root:
turbo dev
To build and run the app locally in production mode (faster):
pnpm run build
pnpm start
Open http://localhost:3000 to view the app in your browser.
To upgrade yourself, add your email to the admin list in the .env file.
Then upgrade yourself at: http://localhost:3000/admin.
Supported LLMs
For the LLM, you can use Anthropic, OpenAI, or Anthropic on AWS Bedrock. You can also use Ollama by setting the following environment variables:
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
Note: If you need to access Ollama hosted locally and the application is running in a Docker setup, you can use http://host.docker.internal:11434/api as the base URL. You might also need to set OLLAMA_HOST to 0.0.0.0 in the Ollama configuration file.
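To sanity-check that the app can reach Ollama, you can query its model list endpoint:

```sh
curl http://localhost:11434/api/tags
# From inside a Docker container, use the host gateway instead:
curl http://host.docker.internal:11434/api/tags
```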
You can select the model you wish to use on the /settings page of the app.
Setting up Google OAuth and Gmail API
You need to enable these scopes in the Google Cloud Console:
https://www.googleapis.com/auth/userinfo.profile
https://www.googleapis.com/auth/userinfo.email
https://www.googleapis.com/auth/gmail.modify
https://www.googleapis.com/auth/gmail.settings.basic
https://www.googleapis.com/auth/contacts
Set up push notifications via Google PubSub to handle emails in real time
Follow instructions here.
Set the env var GOOGLE_PUBSUB_TOPIC_NAME.
When creating the subscription, select Push. The URL should look something like https://www.getinboxzero.com/api/google/webhook?token=TOKEN or https://abc.ngrok-free.app/api/google/webhook?token=TOKEN, where the domain is your domain. Set GOOGLE_PUBSUB_VERIFICATION_TOKEN in your .env file to the value of TOKEN.
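If you prefer the gcloud CLI to the console, the topic and push subscription can be created roughly like this (the topic and subscription names and the webhook URL are placeholders; the linked instructions are authoritative):

```sh
gcloud pubsub topics create gmail-updates
gcloud pubsub subscriptions create gmail-updates-sub \
  --topic=gmail-updates \
  --push-endpoint="https://www.getinboxzero.com/api/google/webhook?token=TOKEN"
# Gmail's push service must be allowed to publish to the topic:
gcloud pubsub topics add-iam-policy-binding gmail-updates \
  --member="serviceAccount:gmail-api-push@system.gserviceaccount.com" \
  --role="roles/pubsub.publisher"
```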
To run in development, ngrok can be helpful:
ngrok http 3000
# or with an ngrok domain to keep your endpoint stable (set `XYZ`):
ngrok http --domain=XYZ.ngrok-free.app 3000
And then update the webhook endpoint in the Google PubSub subscriptions dashboard.
To start watching emails visit: /api/google/watch/all
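With the app running locally, that's just:

```sh
curl http://localhost:3000/api/google/watch/all
```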
Watching for email updates
Set a cron job to run these endpoints. The Google watch one is necessary; the Resend one is optional.
"crons": [
{
"path": "/api/google/watch/all",
"schedule": "0 1 * * *"
},
{
"path": "/api/resend/summary/all",
"schedule": "0 16 * * 1"
}
]
Here are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel vercel.json crons to work, but I'm open to PRs if you find a fix for that.
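If you'd rather drive the same schedule from a plain crontab, here's a sketch (assuming your deployment's endpoints are reachable at yourdomain.com and don't require extra auth headers):

```sh
# Renew Gmail watches daily at 01:00
0 1 * * * curl -s https://yourdomain.com/api/google/watch/all
# Send weekly summary emails on Mondays at 16:00 (optional)
0 16 * * 1 curl -s https://yourdomain.com/api/resend/summary/all
```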
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.
Related MCP Servers
21st.dev Magic AI Agent
It's like v0 but in your Cursor/WindSurf/Cline. 21st dev Magic MCP server for working with your frontend like Magic
Adfin MCP Server
A Model Context Protocol Server for connecting with Adfin APIs
AgentQL MCP Server
Model Context Protocol server that integrates AgentQL's data extraction capabilities.
AgentRPC
A universal RPC layer for AI agents. Connect to any function, any language, any framework, in minutes.
Aiven MCP Server
Model Context Protocol server for Aiven
IoTDB MCP Server
Apache IoTDB MCP Server
Apify Model Context Protocol (MCP) Server
Model Context Protocol (MCP) Server for Apify's Actors
APIMatic Validator MCP Server
APIMatic Validator MCP Server for validating OpenAPI specs via APIMatic's API with MCP
Audiense Insights MCP Server
Audiense Insights MCP Server is a server based on the Model Context Protocol (MCP) that allows Claude and other MCP-compatible clients to interact with your Audiense Insights account
Bankless Onchain MCP Server
Bringing the bankless onchain API to MCP