Files-DB-MCP: Vector Search for Code Projects
A local vector database system that provides LLM coding agents with fast, efficient search capabilities for software projects via the Model Context Protocol (MCP).
Features
- Zero Configuration - Auto-detects project structure with sensible defaults
- Real-Time Monitoring - Continuously watches for file changes
- Vector Search - Semantic search for finding relevant code
- MCP Interface - Compatible with Claude Code and other LLM tools
- Open Source Models - Uses Hugging Face models for code embeddings
Installation
Option 1: Clone and Setup (Recommended)
```bash
# Using SSH (recommended if you have SSH keys set up with GitHub)
git clone git@github.com:randomm/files-db-mcp.git ~/.files-db-mcp && bash ~/.files-db-mcp/install/setup.sh

# Using HTTPS (if you don't have SSH keys set up)
git clone https://github.com/randomm/files-db-mcp.git ~/.files-db-mcp && bash ~/.files-db-mcp/install/setup.sh
```
Option 2: Automated Installation Script
```bash
curl -fsSL https://raw.githubusercontent.com/randomm/files-db-mcp/main/install/install.sh | bash
```
Usage
After installation, run in any project directory:
```bash
files-db-mcp
```
The service will:
- Detect your project files
- Start indexing in the background
- Begin responding to MCP search queries immediately
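You can also watch indexing progress by querying the underlying vector database directly. A minimal sketch, assuming the bundled Qdrant instance is exposed on `localhost:6333` (the port used in the Claude Code config below); the collection name here is an assumption, not documented:

```bash
# List collections to confirm the index has been created
curl http://localhost:6333/collections

# Inspect a collection's point count to gauge indexing progress
# ("files" is a hypothetical collection name)
curl http://localhost:6333/collections/files
```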
Requirements
- Docker
- Docker Compose
Configuration
Files-DB-MCP works without configuration, but you can customize it with environment variables:
- `EMBEDDING_MODEL` - Change the embedding model (default: `jinaai/jina-embeddings-v2-base-code`, or a project-specific model)
- `FAST_STARTUP` - Set to `true` to use a smaller model for faster startup (default: `false`)
- `QUANTIZATION` - Enable/disable quantization (default: `true`)
- `BINARY_EMBEDDINGS` - Enable/disable binary embeddings (default: `false`)
- `IGNORE_PATTERNS` - Comma-separated list of files/directories to ignore
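Multiple variables can be combined on a single invocation. For example (the `IGNORE_PATTERNS` value below is illustrative, not a default):

```bash
# Disable quantization and skip generated directories;
# adjust the ignore list to match your project.
QUANTIZATION=false \
BINARY_EMBEDDINGS=false \
IGNORE_PATTERNS="node_modules,dist,.git,build" \
files-db-mcp
```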
First-Time Startup
On first run, Files-DB-MCP will download embedding models, which may take several minutes depending on:
- The size of the selected model (300-500MB for high-quality models)
- Your internet connection speed
Subsequent startups will be much faster as models are cached in a persistent Docker volume. For faster initial startup, you can:
```bash
# Use a smaller, faster model (90MB)
EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 files-db-mcp

# Or enable fast startup mode
FAST_STARTUP=true files-db-mcp
```
Model Caching
Files-DB-MCP automatically persists downloaded embedding models, so you only need to download them once:
- Models are stored in a Docker volume called `model_cache`
- This volume persists between container restarts and across different projects
- The cache is shared by all projects using Files-DB-MCP on your machine
- You don't need to re-download the model for each project
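The cache can be inspected or reset with standard Docker commands. Note that Docker Compose may prefix the volume name with a project name on your machine:

```bash
# Confirm the cache volume exists and see where Docker stores it
docker volume inspect model_cache

# Force a fresh model download on the next startup
# (stop any running Files-DB-MCP containers first)
docker volume rm model_cache
```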
Claude Code Integration
Add to your Claude Code configuration:
```json
{
  "mcpServers": {
    "files-db-mcp": {
      "command": "python",
      "args": ["/path/to/src/claude_mcp_server.py", "--host", "localhost", "--port", "6333"]
    }
  }
}
```
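To confirm the server starts before wiring it into Claude Code, you can launch it by hand with the same arguments (the script path is a placeholder, exactly as in the config above):

```bash
# Run the MCP server directly; it connects to the vector database
# on localhost:6333 and exposes MCP to the host application.
python /path/to/src/claude_mcp_server.py --host localhost --port 6333
```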
For details, see Claude MCP Integration.
Documentation
- Installation Guide - Detailed setup instructions
- API Reference - Complete API documentation
- Configuration Guide - Configuration options
Repository Structure
- `/src` - Source code
- `/tests` - Unit and integration tests
- `/docs` - Documentation
- `/scripts` - Utility scripts
- `/install` - Installation scripts
- `/.docker` - Docker configuration
- `/config` - Configuration files
- `/ai-assist` - AI assistance files
License
MIT License
Contributing
Contributions welcome! Please feel free to submit a pull request.
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
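Concretely, MCP messages are JSON-RPC 2.0 exchanged over the server's stdin/stdout. A rough sketch of the `initialize` request a host sends to open a session (field values simplified; the script path is the same placeholder as in the Claude Code config above, and whether this server accepts a bare piped request is an assumption):

```bash
# Illustrative only: pipe a JSON-RPC 2.0 initialize request into the
# server's stdin, the same way an MCP host opens the session.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"example-client","version":"0.1.0"}}}' \
  | python /path/to/src/claude_mcp_server.py --host localhost --port 6333
```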
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.