MCP Deep Research
Overview
MCP Deep Research is a tool that allows you to search the web for information. It is built with the Model Context Protocol and the Tavily API.
Configuration
{
  "mcpServers": {
    "deep-research": {
      "command": "npx",
      "args": ["-y", "mcp-deep-research@latest"],
      "env": {
        "TAVILY_API_KEY": "your_tavily_api_key", // Required
        "MAX_SEARCH_KEYWORDS": "5", // Optional, default 5
        "MAX_PLANNING_ROUNDS": "5" // Optional, default 5
      }
    }
  }
}
The tool can be configured using the following environment variables:
- TAVILY_API_KEY: The API key for the Tavily API (required).
- MAX_SEARCH_KEYWORDS: The maximum number of search keywords to use (optional, default 5).
- MAX_PLANNING_ROUNDS: The maximum number of planning rounds to use (optional, default 5).
- TAVILY_HTTP_PROXY / TAVILY_HTTPS_PROXY: Proxy addresses to use for Tavily API requests (optional).
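Assuming the package reads these variables from the process environment, the server can also be launched directly from a shell for a quick smoke test; the API key value below is a placeholder you must replace with your own:

```shell
# Launch the MCP server directly via npx.
# TAVILY_API_KEY is a placeholder; the other two variables are optional
# and shown here with their defaults.
TAVILY_API_KEY="your_tavily_api_key" \
MAX_SEARCH_KEYWORDS="5" \
MAX_PLANNING_ROUNDS="5" \
npx -y mcp-deep-research@latest
```

The server communicates over stdio, so it is normally started by the MCP client using the JSON configuration above rather than run by hand.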
Use with Smithery
Install via Smithery, compatible with Claude Desktop client:
npx -y @smithery/cli install @baranwang/mcp-deep-research --client claude
Compatibility Notice
This MCP server is optimized for prompt-based MCP clients. For MCP clients implemented using function calling mechanisms, the performance and results may not be optimal.
Verified prompt-based MCP clients:
- Claude Desktop
- Cursor
- Cline
- ChatWise
Verified function calling-based MCP clients:
- Cherry Studio
Project Details
- baranwang/mcp-deep-research
- Last Updated: 4/13/2025