AI Customer Support Bot - MCP Server
A Model Context Protocol (MCP) server that provides AI-powered customer support using Cursor AI and Glama.ai integration.
Features
- Real-time context fetching from Glama.ai
- AI-powered response generation with Cursor AI
- Batch processing support
- Priority queuing
- Rate limiting
- User interaction tracking
- Health monitoring
- MCP protocol compliance
Prerequisites
- Python 3.8+
- PostgreSQL database
- Glama.ai API key
- Cursor AI API key
Installation
- Clone the repository:
git clone <repository-url>
cd <repository-name>
- Create and activate a virtual environment:
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install dependencies:
pip install -r requirements.txt
- Create a .env file based on .env.example:
cp .env.example .env
- Configure your .env file with your credentials:
# API Keys
GLAMA_API_KEY=your_glama_api_key_here
CURSOR_API_KEY=your_cursor_api_key_here
# Database
DATABASE_URL=postgresql://user:password@localhost/customer_support_bot
# API URLs
GLAMA_API_URL=https://api.glama.ai/v1
# Security
SECRET_KEY=your_secret_key_here
# MCP Server Configuration
SERVER_NAME="AI Customer Support Bot"
SERVER_VERSION="1.0.0"
API_PREFIX="/mcp"
MAX_CONTEXT_RESULTS=5
# Rate Limiting
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_PERIOD=60
# Logging
LOG_LEVEL=INFO
- Set up the database:
# Create the database
createdb customer_support_bot
# Run migrations (if using Alembic)
alembic upgrade head
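For reference, here is a minimal sketch of reading these environment variables at startup, assuming the python-dotenv package is available (this README does not confirm how app.py actually loads its configuration):

import os
from dotenv import load_dotenv  # assumption: python-dotenv is installed

load_dotenv()  # read key/value pairs from the local .env file

DATABASE_URL = os.getenv("DATABASE_URL")
GLAMA_API_KEY = os.getenv("GLAMA_API_KEY")
CURSOR_API_KEY = os.getenv("CURSOR_API_KEY")
RATE_LIMIT_REQUESTS = int(os.getenv("RATE_LIMIT_REQUESTS", "100"))
RATE_LIMIT_PERIOD = int(os.getenv("RATE_LIMIT_PERIOD", "60"))

# Fail fast if a required credential is missing
for name, value in [("DATABASE_URL", DATABASE_URL),
                    ("GLAMA_API_KEY", GLAMA_API_KEY),
                    ("CURSOR_API_KEY", CURSOR_API_KEY)]:
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")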
Running the Server
Start the server:
python app.py
The server will be available at http://localhost:8000
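To confirm the server is reachable, a quick smoke test against the root endpoint might look like this (assuming the requests package is installed; the exact response fields depend on the implementation):

import requests  # assumption: the requests package is installed

# The root endpoint should answer once the server is running
resp = requests.get("http://localhost:8000/")
print(resp.status_code)  # expect 200
print(resp.json())       # basic server information (exact fields depend on the implementation)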
API Endpoints
1. Root Endpoint
GET /
Returns basic server information.
2. MCP Version
GET /mcp/version
Returns supported MCP protocol versions.
3. Capabilities
GET /mcp/capabilities
Returns server capabilities and supported features.
4. Process Request
POST /mcp/process
Process a single query with context.
Example request:
curl -X POST http://localhost:8000/mcp/process \
-H "Content-Type: application/json" \
-H "X-MCP-Auth: your-auth-token" \
-H "X-MCP-Version: 1.0" \
-d '{
"query": "How do I reset my password?",
"priority": "high",
"mcp_version": "1.0"
}'
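The same request expressed as a small Python client, assuming the requests package is installed (the shape of the response body is not documented here, so it is simply printed):

import requests  # assumption: the requests package is installed

headers = {
    "Content-Type": "application/json",
    "X-MCP-Auth": "your-auth-token",  # replace with a real token
    "X-MCP-Version": "1.0",
}
payload = {
    "query": "How do I reset my password?",
    "priority": "high",
    "mcp_version": "1.0",
}

resp = requests.post("http://localhost:8000/mcp/process", json=payload, headers=headers)
resp.raise_for_status()
print(resp.json())  # exact response fields depend on the server implementation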
5. Batch Processing
POST /mcp/batch
Process multiple queries in a single request.
Example request:
curl -X POST http://localhost:8000/mcp/batch \
-H "Content-Type: application/json" \
-H "X-MCP-Auth: your-auth-token" \
-H "X-MCP-Version: 1.0" \
-d '{
"queries": [
"How do I reset my password?",
"What are your business hours?",
"How do I contact support?"
],
"mcp_version": "1.0"
}'
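A corresponding Python sketch for the batch endpoint, again assuming the requests package is installed and printing the implementation-defined response as-is:

import requests  # assumption: the requests package is installed

headers = {
    "X-MCP-Auth": "your-auth-token",  # replace with a real token
    "X-MCP-Version": "1.0",
}
payload = {
    "queries": [
        "How do I reset my password?",
        "What are your business hours?",
        "How do I contact support?",
    ],
    "mcp_version": "1.0",
}

resp = requests.post("http://localhost:8000/mcp/batch", json=payload, headers=headers)
resp.raise_for_status()
print(resp.json())  # structure of the batch response is implementation-defined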
6. Health Check
GET /mcp/health
Check server health and service status.
Rate Limiting
The server implements rate limiting with the following defaults:
- 100 requests per 60 seconds
- Rate limit information is included in the health check response
- Rate limit exceeded responses include reset time
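As a rough illustration of the kind of logic involved (not the actual middleware.py implementation, which may differ), a sliding-window limiter over these defaults could look like this:

import time
from collections import defaultdict, deque

RATE_LIMIT_REQUESTS = 100  # requests allowed per window (matches the default above)
RATE_LIMIT_PERIOD = 60     # window length in seconds

_request_log = defaultdict(deque)  # client id -> timestamps of recent requests

def allow_request(client_id: str) -> bool:
    """Return True if the client is under the limit, False otherwise."""
    now = time.time()
    window = _request_log[client_id]
    # Drop timestamps that fall outside the current window
    while window and now - window[0] > RATE_LIMIT_PERIOD:
        window.popleft()
    if len(window) >= RATE_LIMIT_REQUESTS:
        return False
    window.append(now)
    return True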
Error Handling
The server returns structured error responses in the following format:
{
  "code": "ERROR_CODE",
  "message": "Error description",
  "details": {
    "timestamp": "2024-02-14T12:00:00Z",
    "additional_info": "value"
  }
}
Common error codes:
- RATE_LIMIT_EXCEEDED: Rate limit exceeded
- UNSUPPORTED_MCP_VERSION: Unsupported MCP version
- PROCESSING_ERROR: Error processing request
- CONTEXT_FETCH_ERROR: Error fetching context from Glama.ai
- BATCH_PROCESSING_ERROR: Error processing batch request
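A client might handle these structured errors along the following lines; this is a sketch assuming the requests package is installed and that the reset time for rate-limit errors appears under details, as described above:

import requests  # assumption: the requests package is installed

resp = requests.post(
    "http://localhost:8000/mcp/process",
    json={"query": "How do I reset my password?", "priority": "high", "mcp_version": "1.0"},
    headers={"X-MCP-Auth": "your-auth-token", "X-MCP-Version": "1.0"},
)

if resp.status_code != 200:
    error = resp.json()
    code = error.get("code")
    if code == "RATE_LIMIT_EXCEEDED":
        # The error details are expected to include a reset time (see Rate Limiting above)
        print("Rate limited:", error.get("details"))
    else:
        print(f"{code}: {error.get('message')}")
else:
    print(resp.json())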
Development
Project Structure
.
├── app.py # Main application file
├── database.py # Database configuration
├── middleware.py # Middleware (rate limiting, validation)
├── models.py # Database models
├── mcp_config.py # MCP-specific configuration
├── requirements.txt # Python dependencies
└── .env # Environment variables
Adding New Features
- Update mcp_config.py with new configuration options
- Add new models in models.py if needed
- Create new endpoints in app.py (a minimal sketch follows below)
- Update the capabilities endpoint to reflect new features
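As a rough orientation for the endpoint step, if app.py is built on FastAPI (an assumption; this README does not name the web framework), a new endpoint plus its request model could look roughly like the following. The route name, model fields, and feedback concept are hypothetical examples, not part of the existing API:

# Illustrative only: assumes FastAPI, which this README does not confirm.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()  # in the real project, reuse the existing app instance from app.py

class FeedbackRequest(BaseModel):  # hypothetical request model
    interaction_id: str
    rating: int

@app.post("/mcp/feedback")  # hypothetical new endpoint
def submit_feedback(request: FeedbackRequest):
    # A real implementation would persist this via a model defined in models.py
    return {"status": "received", "interaction_id": request.interaction_id}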
Security
- All MCP endpoints require authentication via the X-MCP-Auth header (illustrated below)
- Rate limiting is implemented to prevent abuse
- Database credentials should be kept secure
- API keys should never be committed to version control
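For illustration only, a header check along these lines could back the X-MCP-Auth requirement; the real logic lives in middleware.py and may work differently, and the MCP_AUTH_TOKEN variable name below is hypothetical (it is not part of the .env example above):

import hmac
import os

def is_authorized(headers: dict) -> bool:
    """Return True if the X-MCP-Auth header matches the expected token."""
    expected = os.getenv("MCP_AUTH_TOKEN", "")  # hypothetical setting name
    provided = headers.get("X-MCP-Auth", "")
    # Constant-time comparison to avoid timing side channels
    return bool(expected) and hmac.compare_digest(provided, expected)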
Monitoring
The server provides health check endpoints for monitoring:
- Service status
- Rate limit usage
- Connected services
- Processing times
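A simple external monitor could poll the health endpoint on a schedule, for example (assuming the requests package is installed and that /mcp/health accepts the same X-MCP-Auth header as the other MCP endpoints):

import time
import requests  # assumption: the requests package is installed

HEALTH_URL = "http://localhost:8000/mcp/health"

while True:
    try:
        resp = requests.get(HEALTH_URL, headers={"X-MCP-Auth": "your-auth-token"}, timeout=5)
        print(time.strftime("%H:%M:%S"), resp.status_code, resp.json())
    except requests.RequestException as exc:
        print("Health check failed:", exc)
    time.sleep(30)  # poll every 30 seconds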
Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For support, please create an issue in the repository or contact the development team.
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.