# PromptLab: AI Query Enhancement with MLflow Integration

PromptLab transforms basic user queries into optimized prompts for AI systems, built using MCP (Model Context Protocol).
PromptLab is an intelligent system that transforms basic user queries into optimized prompts for AI systems using MLflow Prompt Registry. It dynamically matches user requests to the most appropriate prompt template and applies it with extracted parameters.
## 🔍 Overview
PromptLab combines MLflow Prompt Registry with dynamic prompt matching to create a powerful, flexible system for prompt engineering:
- **Centralized Prompt Management**: Store, version, and manage prompts in MLflow
- **Dynamic Matching**: Intelligently match user queries to the best prompt template
- **Version Control**: Track prompt history with production and archive aliases
- **Extensible**: Easily add new prompt types without code changes
## 🏗️ Architecture
The system consists of three main components:
- **Prompt Registry** (`register_prompts.py`): Tool for registering and managing prompts in MLflow
- **Server** (`promptlab_server.py`): Server with dynamic prompt matching and a LangGraph workflow
- **Client** (`promptlab_client.py`): Lightweight client for processing user queries
### Workflow Process

1. **Prompt Registration**: Register prompt templates in MLflow with versioning and aliasing
2. **Prompt Loading**: The server loads all available prompts from MLflow at startup
3. **Query Submission**: The user submits a natural language query via the client
4. **Intelligent Matching**: An LLM analyzes the query and selects the most appropriate prompt template
5. **Parameter Extraction**: The system extracts required parameters from the query
6. **Template Application**: The selected template is applied with the extracted parameters
7. **Validation & Adjustment**: The enhanced prompt is validated and adjusted if needed
8. **Response Generation**: The optimized prompt produces a high-quality response
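The matching and enhancement steps can be sketched in plain Python. This is a hypothetical simplification for illustration only: the hard-coded registry, keyword matching, and canned defaults stand in for MLflow, the LLM matcher, and real parameter extraction.

```python
import re

# Hypothetical in-memory registry standing in for the MLflow Prompt Registry.
PROMPTS = {
    "email_prompt": "Write a {{ formality }} email to my {{ recipient_type }} about {{ topic }}.",
    "essay_prompt": "Write a well-structured essay on {{ topic }}.",
}

def match_prompt(query: str) -> str:
    """Step 4, simplified: keyword lookup instead of an LLM call."""
    return "email_prompt" if "email" in query.lower() else "essay_prompt"

def extract_parameters(query: str, template: str) -> dict:
    """Step 5, simplified: fill each {{ variable }} with a canned default."""
    variables = re.findall(r"\{\{\s*(\w+)\s*\}\}", template)
    defaults = {"formality": "professional", "recipient_type": "manager"}
    return {v: defaults.get(v, query) for v in variables}

def enhance_query(query: str) -> str:
    """Steps 4-6: match a template, extract parameters, and apply them."""
    template = PROMPTS[match_prompt(query)]
    for name, value in extract_parameters(query, template).items():
        template = template.replace("{{ " + name + " }}", value)
    return template
```

In the real server these steps run as nodes in a LangGraph workflow, with validation and adjustment before the final response is generated.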
## 📂 Code Structure
```
promptlab/
├── promptlab_server.py     # Main server with LangGraph workflow
├── promptlab_client.py     # Client for processing queries
├── register_prompts.py     # MLflow prompt management tool
├── requirements.txt        # Project dependencies
├── advanced_prompts.json   # Additional prompt templates
└── README.md               # Project documentation
```
### Core Components

#### `register_prompts.py`

- **Purpose**: Manages prompts in the MLflow Registry
- **Key Functions**:
  - `register_prompt()`: Register a new prompt or version
  - `update_prompt()`: Update an existing prompt (archives the previous production version)
  - `list_prompts()`: List all registered prompts
  - `register_from_file()`: Register multiple prompts from a JSON file
  - `register_sample_prompts()`: Initialize with standard prompts
#### `promptlab_server.py`

- **Purpose**: Processes queries using a LangGraph workflow
- **Key Components**:
  - `load_all_prompts()`: Loads prompts from MLflow
  - `match_prompt()`: Matches queries to appropriate templates
  - `enhance_query()`: Applies the selected template
  - `validate_query()`: Validates enhanced queries
  - LangGraph workflow: Orchestrates the query enhancement process
#### `promptlab_client.py`

- **Purpose**: Provides a user interface to the service
- **Key Features**:
  - Process queries with enhanced prompts
  - List available prompts
  - Display detailed prompt matching information
## 🚀 Getting Started
### Prerequisites

- Python 3.12
- Dependencies in `requirements.txt`
- OpenAI API key for LLM capabilities
### Installation

```bash
# Clone the repository
git clone https://github.com/iRahulPandey/PromptLab.git
cd PromptLab

# Install dependencies
pip install -r requirements.txt

# Set environment variables
export OPENAI_API_KEY="your-openai-api-key"
```
### Registering Prompts
Before using PromptLab, you need to register prompts in MLflow:
```bash
# Register sample prompts (essay, email, technical, creative)
python register_prompts.py register-samples

# Register additional prompt types (recommended)
python register_prompts.py register-file --file advanced_prompts.json

# Verify registered prompts
python register_prompts.py list
```
### Running the Server

```bash
# Start the server
python promptlab_server.py
```
### Using the Client

```bash
# Process a query
python promptlab_client.py "Write a blog post about machine learning"

# List available prompts
python promptlab_client.py --list

# Enable verbose output
python promptlab_client.py --verbose "Create a presentation on climate change"
```
## 📋 Prompt Management

### Available Prompt Types
PromptLab supports a wide range of prompt types:
| Prompt Type | Description | Example Use Case |
|---|---|---|
| `essay_prompt` | Academic writing | Research papers, analyses |
| `email_prompt` | Email composition | Professional communications |
| `technical_prompt` | Technical explanations | Concepts, technologies |
| `creative_prompt` | Creative writing | Stories, poems, fiction |
| `code_prompt` | Code generation | Functions, algorithms |
| `summary_prompt` | Content summarization | Articles, documents |
| `analysis_prompt` | Critical analysis | Data, texts, concepts |
| `qa_prompt` | Question answering | Context-based answers |
| `social_media_prompt` | Social media content | Platform-specific posts |
| `blog_prompt` | Blog article writing | Online articles |
| `report_prompt` | Formal reports | Business, technical reports |
| `letter_prompt` | Formal letters | Cover, recommendation letters |
| `presentation_prompt` | Presentation outlines | Slides, talks |
| `review_prompt` | Reviews | Products, media, services |
| `comparison_prompt` | Comparisons | Products, concepts, options |
| `instruction_prompt` | How-to guides | Step-by-step instructions |
| `custom_prompt` | Customizable template | Specialized use cases |
### Registering New Prompts
You can register new prompts in several ways:
#### 1. From the Command Line

```bash
python register_prompts.py register \
  --name "new_prompt" \
  --template "Your template with {{ variables }}" \
  --message "Initial version" \
  --tags '{"type": "custom", "task": "specialized"}'
```
#### 2. From a Template File

```bash
# Create a text file with your template
echo "Template content with {{ variables }}" > template.txt

# Register using the file
python register_prompts.py register \
  --name "long_prompt" \
  --template template.txt \
  --message "Complex template"
```
#### 3. From a JSON File

Create a JSON file with multiple prompts:

```json
{
  "prompts": [
    {
      "name": "prompt_name",
      "template": "Template with {{ variables }}",
      "commit_message": "Description",
      "tags": {"type": "category", "task": "purpose"}
    }
  ]
}
```
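Before handing a file to `register-file`, it can help to sanity-check it against this shape. Below is a minimal loader sketch; the required field names follow the example JSON above, and the actual validation in `register_from_file()` may differ.

```python
import json

# Fields every prompt entry must carry, per the example JSON schema above.
REQUIRED_FIELDS = {"name", "template", "commit_message"}

def load_prompt_file(path: str) -> list:
    """Load a prompt JSON file and verify each entry has the required fields."""
    with open(path) as f:
        data = json.load(f)
    prompts = data.get("prompts", [])
    for i, prompt in enumerate(prompts):
        missing = REQUIRED_FIELDS - prompt.keys()
        if missing:
            raise ValueError(f"prompt #{i} is missing fields: {sorted(missing)}")
    return prompts
```

Catching a malformed file here gives a clearer error than a failed registration halfway through a batch.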
Then register them:
```bash
python register_prompts.py register-file --file your_prompts.json
```
### Updating Existing Prompts
When you update an existing prompt, the system automatically:
- Archives the previous production version
- Sets the new version as production
```bash
python register_prompts.py update \
  --name "essay_prompt" \
  --template "New improved template with {{ variables }}" \
  --message "Enhanced clarity and structure"
```
### Viewing Prompt Details

```bash
# List all prompts
python register_prompts.py list

# View detailed information about a specific prompt
python register_prompts.py details --name "essay_prompt"
```
## 🛠️ Advanced Usage

### Template Variables

Templates use variables in `{{ variable }}` format:
```
Write a {{ formality }} email to my {{ recipient_type }} about {{ topic }} that includes:
- A clear subject line
- Appropriate greeting
...
```
When matching a query, the system automatically extracts values for these variables.
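A sketch of how variable discovery and substitution might work (illustrative only; the server's actual implementation uses an LLM to extract the values themselves):

```python
import re

# Matches {{ variable }} placeholders, tolerating optional inner whitespace.
VARIABLE_RE = re.compile(r"\{\{\s*(\w+)\s*\}\}")

def template_variables(template: str) -> list:
    """Return the {{ variable }} names in a template, in order of appearance."""
    return VARIABLE_RE.findall(template)

def render(template: str, values: dict) -> str:
    """Substitute known values; leave unrecognized variables untouched."""
    def replace(match):
        return str(values.get(match.group(1), match.group(0)))
    return VARIABLE_RE.sub(replace, template)
```

Leaving unknown variables in place (rather than raising) lets a validation step detect and fix unfilled slots afterward.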
### Production and Archive Aliases
Each prompt can have different versions with aliases:
- `production`: The current active version (used by default)
- `archived`: Previous production versions
This allows for:
- Rolling back to previous versions if needed
- Tracking the history of prompt changes
### Custom Prompt Registration
For specialized use cases, you can create highly customized prompts:
```bash
python register_prompts.py register \
  --name "specialized_prompt" \
  --template "You are a {{ role }} with expertise in {{ domain }}. Create a {{ document_type }} about {{ topic }} that demonstrates {{ quality }}." \
  --message "Specialized template" \
  --tags '{"type": "custom", "task": "specialized", "domain": "finance"}'
```
## 🔧 Troubleshooting

### No Matching Prompt Found
If the system can't match a query to any prompt template, it will:
- Log a message that no match was found
- Use the original query without enhancement
- Still generate a response
You can add more diverse prompt templates to improve matching.
### LLM Connection Issues
If the LLM service is unavailable, the system falls back to:
- Keyword-based matching for prompt selection
- Simple parameter extraction
- Basic prompt enhancement
This ensures the system remains functional even without LLM access.
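A keyword-based fallback matcher might look like the sketch below. The keyword sets and function name are hypothetical; the server's actual fallback logic may differ.

```python
# Hypothetical keyword sets per prompt type, used only when the LLM is unreachable.
FALLBACK_KEYWORDS = {
    "email_prompt": {"email", "message", "reply"},
    "code_prompt": {"code", "function", "script", "algorithm"},
    "summary_prompt": {"summarize", "summary", "recap"},
}

def fallback_match(query: str):
    """Pick the prompt type whose keywords best overlap the query, or None."""
    words = set(query.lower().split())
    scores = {name: len(words & keywords) for name, keywords in FALLBACK_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None
```

Returning `None` when nothing overlaps lets the server fall through to the "no matching prompt" path above and answer with the original query.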