DeepResearch MCP

TypeScript · OpenAI · Node.js

📚 Overview

DeepResearch MCP is a powerful research assistant built on the Model Context Protocol (MCP). It conducts intelligent, iterative research on any topic through web searches, analysis, and comprehensive report generation.

🌟 Key Features

  • Intelligent Topic Exploration - Automatically identifies knowledge gaps and generates focused search queries
  • Comprehensive Content Extraction - Enhanced web scraping with improved content organization
  • Structured Knowledge Processing - Preserves important information while managing token usage
  • Scholarly Report Generation - Creates detailed, well-structured reports with executive summaries, analyses, and visualizations
  • Complete Bibliography - Properly cites all sources with numbered references
  • Adaptive Content Management - Automatically manages content to stay within token limits
  • Error Resilience - Recovers from errors and generates partial reports when full processing isn't possible

🛠️ Architecture

┌────────────────────┐     ┌─────────────────┐     ┌────────────────┐
│                    │     │                 │     │                │
│  MCP Server Layer  ├────►│ Research Service├────►│ Search Service │
│  (Tools & Prompts) │     │ (Session Mgmt)  │     │  (Firecrawl)   │
│                    │     │                 │     │                │
└────────────────────┘     └────────┬────────┘     └────────────────┘
                                    │
                                    ▼
                           ┌─────────────────┐
                           │                 │
                           │  OpenAI Service │
                           │ (Analysis/Rpt)  │
                           │                 │
                           └─────────────────┘

💻 Installation

Prerequisites

  • Node.js 18 or higher
  • OpenAI API key
  • Firecrawl API key

Setup Steps

  1. Clone the repository

    git clone <repository-url>
    cd deep-research-mcp
    
  2. Install dependencies

    npm install
    
  3. Configure environment variables

    cp .env.example .env
    

    Edit the .env file and add your API keys:

    OPENAI_API_KEY=sk-your-openai-api-key
    FIRECRAWL_API_KEY=your-firecrawl-api-key
    
  4. Build the project

    npm run build
    

🚀 Usage

Running the MCP Server

Start the server on stdio for MCP client connections:

npm start

Using the Example Client

Run research on a specific topic with a specified depth:

npm run client "Your research topic" 3

Parameters:

  • First argument: Research topic or query
  • Second argument: Research depth (number of iterations, default: 2)
  • Third argument (optional): "complete" to use the complete-research tool (one-step process)

Example:

npm run client "the impact of climate change on coral reefs" 3 complete

Example Output

The DeepResearch MCP will produce a comprehensive report that includes:

  • Executive Summary - Concise overview of the research findings
  • Introduction - Context and importance of the research topic
  • Methodology - Description of the research approach
  • Comprehensive Analysis - Detailed examination of the topic
  • Comparative Analysis - Visual comparison of key aspects
  • Discussion - Interpretation of findings and implications
  • Limitations - Constraints and gaps in the research
  • Conclusion - Final insights and recommendations
  • Bibliography - Complete list of sources with URLs

🔧 MCP Integration

Available MCP Resources

| Resource Path | Description |
| --- | --- |
| research://state/{sessionId} | Access the current state of a research session |
| research://findings/{sessionId} | Access the collected findings for a session |
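
Once a session has been initialized, these resources can be read from a connected MCP client. The snippet below is a minimal sketch using the readResource call from the MCP TypeScript SDK; it assumes client is already connected (see the Sample Client Code section below) and sessionId was returned by initialize-research. The exact JSON shape of the returned text is defined by the server.

// Minimal sketch: read session state and findings via MCP resources.
// Assumes `client` is a connected Client and `sessionId` came from initialize-research.
const state = await client.readResource({
  uri: `research://state/${sessionId}`
});
console.log(state.contents[0].text);

const findings = await client.readResource({
  uri: `research://findings/${sessionId}`
});
console.log(findings.contents[0].text);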

Available MCP Tools

| Tool Name | Description | Parameters |
| --- | --- | --- |
| initialize-research | Start a new research session | query: string, depth: number |
| execute-research-step | Execute the next research step | sessionId: string |
| generate-report | Create a final report | sessionId: string, timeout: number (optional) |
| complete-research | Execute the entire research process | query: string, depth: number, timeout: number (optional) |
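
For a single-call run that skips the step-by-step flow, complete-research can be invoked through the same callTool interface shown in the Sample Client Code section below. A minimal sketch, assuming a connected client; the depth and timeout values are illustrative:

// Minimal sketch: run the entire research process in one tool call.
const result = await client.callTool({
  name: "complete-research",
  arguments: {
    query: "The impact of climate change on coral reefs",
    depth: 2,
    timeout: 300000 // optional: allow up to 5 minutes for the full run
  }
});

// The finished report is returned as text content, as with generate-report
console.log(result.content[0].text);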

🖥️ Claude Desktop Integration

DeepResearch MCP can be integrated with Claude Desktop to provide direct research capabilities to Claude.

Configuration Steps

  1. Copy the sample configuration

    cp claude_desktop_config_sample.json ~/path/to/claude/desktop/config/directory/claude_desktop_config.json
    
  2. Edit the configuration file

    Update the path to point to your installation of deep-research-mcp and add your API keys:

    {
      "mcpServers": {
        "deep-research": {
          "command": "node",
          "args": [
            "/absolute/path/to/your/deep-research-mcp/dist/index.js"
          ],
          "env": {
            "FIRECRAWL_API_KEY": "your-firecrawler-api-key",
            "OPENAI_API_KEY": "your-openai-api-key"
          }
        }
      }
    }
    
  3. Restart Claude Desktop

    After saving the configuration, restart Claude Desktop for the changes to take effect.

  4. Using with Claude Desktop

    Now you can ask Claude to perform research using commands like:

    Can you research the impact of climate change on coral reefs and provide a detailed report?
    

📋 Sample Client Code

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Connect to the server
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"]
  });

  const client = new Client({ name: "deep-research-client", version: "1.0.0" });
  await client.connect(transport);

  // Initialize research
  const initResult = await client.callTool({
    name: "initialize-research",
    arguments: {
      query: "The impact of artificial intelligence on healthcare",
      depth: 3
    }
  });
  
  // Parse the response to get sessionId
  const { sessionId } = JSON.parse(initResult.content[0].text);
  
  // Execute steps until complete
  let currentDepth = 0;
  while (currentDepth < 3) {
    const stepResult = await client.callTool({
      name: "execute-research-step",
      arguments: { sessionId }
    });
    
    const stepInfo = JSON.parse(stepResult.content[0].text);
    currentDepth = stepInfo.currentDepth;
    
    console.log(`Completed step ${stepInfo.currentDepth}/${stepInfo.maxDepth}`);
  }
  
  // Generate final report with timeout
  const report = await client.callTool({
    name: "generate-report",
    arguments: { 
      sessionId,
      timeout: 180000 // 3 minutes timeout
    }
  });
  
  console.log("Final Report:");
  console.log(report.content[0].text);
}

main().catch(console.error);

🔍 Troubleshooting

Common Issues

  • Token Limit Exceeded: For very large research topics, you may encounter OpenAI token limit errors. Try:

    • Reducing the research depth
    • Using more specific queries
    • Breaking complex topics into smaller sub-topics
  • Timeout Errors: For complex research, the process may time out. Solutions:

    • Increase the timeout parameters in tool calls
    • Use the complete-research tool with a longer timeout
    • Process research in smaller chunks
  • API Rate Limits: If you encounter rate limit errors from OpenAI or Firecrawl:

    • Implement a delay between research steps
    • Use an API key with higher rate limits
    • Retry with exponential backoff (a minimal sketch follows below)
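
A minimal sketch of the delay-and-backoff approach, reusing the connected client and sessionId from the Sample Client Code section above; the retry count and delays are illustrative, not part of the server:

// Wait helper
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Retry a research step with a pause between attempts and exponential backoff on failure
async function executeStepWithBackoff(sessionId: string, maxRetries = 3) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await client.callTool({
        name: "execute-research-step",
        arguments: { sessionId }
      });
    } catch (err) {
      if (attempt === maxRetries) throw err; // give up after the final retry
      await sleep(1000 * 2 ** attempt);      // wait 1s, 2s, 4s, ...
    }
  }
}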

📝 License

ISC

🙏 Acknowledgements

  • Built with Model Context Protocol
  • Powered by OpenAI and Firecrawl
