# mcp-server-chatsum
This MCP server is used to summarize your chat messages.

A Chinese version of this documentation is available (中文说明).

## Before you start

1. Move to the `chatbot` directory and follow its README to set up the chat database.
2. Start the chatbot to save your chat messages.
## Features

### Resources

### Tools

- `query_chat_messages`: query chat messages with the given parameters
  - Summarize chat messages based on the query prompt
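The README does not document the tool's parameter schema, so the argument names below (`talker`, `limit`) are purely illustrative. A client would invoke the tool with a standard MCP `tools/call` JSON-RPC request such as:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_chat_messages",
    "arguments": {
      "talker": "my-group-chat",
      "limit": 100
    }
  }
}
```

In practice an MCP client such as Claude Desktop constructs this request for you; the model decides when to call the tool and fills in the arguments.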
### Prompts
## Development

- Set up environment variables:

  Create a `.env` file in the root directory and set your chat database path:

  ```
  CHAT_DB_PATH=path-to/chatbot/data/chat.db
  ```

- Install dependencies:

  ```shell
  pnpm install
  ```

- Build the server:

  ```shell
  pnpm build
  ```

- For development with auto-rebuild:

  ```shell
  pnpm watch
  ```
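If the server fails to start, a misconfigured `CHAT_DB_PATH` is the usual culprit. A minimal sketch of the kind of pre-flight check you can run (assuming Node.js; the helper name is hypothetical and not part of this repo):

```typescript
import { existsSync } from "node:fs";

// Hypothetical helper: validate the CHAT_DB_PATH setting before
// launching the server, so a bad path fails fast with a clear message.
export function resolveChatDbPath(
  env: Record<string, string | undefined>
): string {
  const dbPath = env.CHAT_DB_PATH;
  if (!dbPath) {
    throw new Error("CHAT_DB_PATH is not set; add it to your .env file");
  }
  if (!existsSync(dbPath)) {
    throw new Error(`chat database not found at ${dbPath}`);
  }
  return dbPath;
}
```

Calling `resolveChatDbPath(process.env)` at startup surfaces configuration mistakes before the MCP handshake begins.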
## Installation

To use with Claude Desktop, add the server config:

- On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`

```json
{
  "mcpServers": {
    "mcp-server-chatsum": {
      "command": "path-to/bin/node",
      "args": ["path-to/mcp-server-chatsum/build/index.js"],
      "env": {
        "CHAT_DB_PATH": "path-to/mcp-server-chatsum/chatbot/data/chat.db"
      }
    }
  }
}
```
## Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

```shell
pnpm inspector
```

The Inspector will provide a URL to access debugging tools in your browser.
## Community

- MCP Server Telegram
- MCP Server Discord

## About the author

- idoubi
## Project Details

- Repository: chatmcp/mcp-server-chatsum
- Last Updated: 4/21/2025





