MCP Client Chatbot
English | 한국어
MCP Client Chatbot is a versatile chat interface that supports various AI model providers like OpenAI, Anthropic, Gemini, and Ollama.
It is also the first known speech-based chatbot with integrated MCP Server support, enabling real-time multimodal interactions.
Our mission is to build the most powerful tool-using chatbot, combining the best of language models and tool orchestration.
We aim to create diverse UX and features that let LLMs actively use tools: `@tool` mentions for direct invocation, speech-based chat that can access and use MCP server tools, quick tool presets for fast selection, and an upcoming workflow-with-tools feature for multi-step automation.
Built with Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces. Leverage the power of Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience.
Open Source Project
MCP Client Chatbot is a 100% community-driven open source project.
Table of Contents
- MCP Client Chatbot
  - Table of Contents
  - Demo
    - Browser Automation with Playwright MCP
    - Quick Tool Mentions (`@`)
    - Adding MCP Servers Easily
    - Standalone Tool Testing
    - Built-in Chart Tools
  - Key Features
  - Quick Start (Local Version)
  - Quick Start (Docker Compose Version)
  - Environment Variables
  - MCP Server Setup
  - Tips & Guides
    - Docker Hosting Guide
    - Vercel Hosting Guide
    - OAuth Setup Guide (Google & GitHub)
    - Project Feature with MCP Server
  - Roadmap: Next Features
    - Deployment & Hosting
    - File & Image
    - MCP Workflow
    - Built-in Tools & UX
    - LLM Code Write (with Daytona)
  - Contributing
  - Join Our Discord
Demo
Here are some quick examples of how you can use MCP Client Chatbot:
Browser Automation with Playwright MCP
Example: Control a web browser using Microsoft’s playwright-mcp tool.
Sample prompt:
Please go to GitHub and visit the cgoinglove profile.
Open the mcp-client-chatbot project.
Then, click on the README.md file.
After that, close the browser.
Finally, tell me how to install the package.
Quick Tool Mentions (`@`)
Quickly call any registered MCP tool during chat by typing `@toolname`.
No need to memorize — just type `@` and pick from the list!
You can also control how tools are used with the new Tool Choice Mode:
- Auto: Tools are automatically called by the model when needed.
- Manual: The model will ask for your permission before calling any tool.
- None: Disables all tool usage.
Toggle modes anytime with the shortcut ⌘P.
Adding MCP Servers Easily
Add new MCP servers easily through the UI, and start using new tools without restarting the app.
Standalone Tool Testing
Test MCP tools independently of chat sessions to simplify development and debugging.
Built-in Chart Tools
Visualize chatbot responses as pie, bar, or line charts using the built-in tool — perfect for quick data insight during conversations.
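For example, a prompt along these lines (the wording is illustrative; any natural phrasing works) asks the chatbot to chart its answer:

```text
Compare the populations of the five largest countries,
then show the result as a bar chart.
```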
Key Features
- 100% Local Execution: Run directly on your PC or server without complex deployment, fully utilizing and controlling your computing resources.
- Multiple AI Model Support: Flexibly switch between providers like OpenAI, Anthropic, Google AI, and Ollama.
- Real-time Voice Chat via MCP Server: Currently supports the OpenAI provider (Gemini support coming soon).
- Powerful MCP Integration: Seamlessly connect external tools (browser automation, database operations, etc.) into chat via the Model Context Protocol.
- Standalone Tool Tester: Test and debug MCP tools separately from the main chat interface.
- Intuitive Mentions + Tool Control: Trigger tools with `@`, and control when they are used via Auto / Manual / None modes.
- Easy Server Setup: Configure MCP connections via the UI or a `.mcp-config.json` file.
- Markdown UI: Communicate in a clean, readable markdown-based interface.
- Custom MCP Server Support: Modify the built-in MCP server logic or create your own.
- Built-in Chart Tools: Generate pie, bar, and line charts directly in chat with natural prompts.
- Easy Deployment: Built-in Vercel support makes the chatbot easy to deploy and share.
- Run Anywhere: Launch easily with Docker Compose; just build the image and run.
This project uses pnpm as the recommended package manager.
# If you don't have pnpm:
npm install -g pnpm
Quick Start (Local Version) 
# 1. Install dependencies
pnpm i
# 2. Create the environment variable file and fill in your .env values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.
# 3. (Optional) If you already have PostgreSQL running and .env is configured, skip this step
pnpm docker:pg
# 4. Run database migrations
pnpm db:migrate
# 5. Start the development server
pnpm dev
# 6. (Optional) Build & start for local production-like testing
pnpm build:local && pnpm start
# Use build:local for local start to ensure correct cookie settings
Quick Start (Docker Compose Version) 
# 1. Install dependencies
pnpm i
# 2. Create environment variable files and fill in the required values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.
# 3. Build and start all services (including PostgreSQL) with Docker Compose
pnpm docker-compose:up
Open http://localhost:3000 in your browser to get started.
Environment Variables
The pnpm i
command generates a .env
file. Add your API keys there.
# === LLM Provider API Keys ===
# You only need to enter the keys for the providers you plan to use
GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
XAI_API_KEY=****
ANTHROPIC_API_KEY=****
OPENROUTER_API_KEY=****
OLLAMA_BASE_URL=http://localhost:11434/api
# Secret for Better Auth (generate with: npx @better-auth/cli@latest secret)
BETTER_AUTH_SECRET=****
# === Database ===
# If you don't have PostgreSQL running locally, start it with: pnpm docker:pg
POSTGRES_URL=postgres://your_username:your_password@localhost:5432/your_database_name
# Whether to use file-based MCP config (default: false)
FILE_BASED_MCP_CONFIG=false
# === OAuth Settings (Optional) ===
# Fill in these values only if you want to enable Google/GitHub login
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=
MCP Server Setup
You can connect MCP tools via:
- UI Setup: Go to http://localhost:3000/mcp and configure through the interface.
- Custom Logic: Edit `./custom-mcp-server/index.ts` to implement your own logic. Note that this option does not run on Vercel or Docker.
- File-Based Config (local dev only): Create a `.mcp-config.json` file and list your servers there. This works only in local development; no Docker or Vercel environment variable is required. For example:
// .mcp-config.json
{
  "playwright": {
    "command": "npx",
    "args": ["@playwright/mcp@latest"]
  }
}
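Since the config file is hand-written, a quick shape check can catch typos before a client tries to spawn a server. Below is an illustrative TypeScript sketch; the `McpStdioEntry` type and `parseMcpConfig` helper are hypothetical and not part of this project's API:

```typescript
// Illustrative helper (not part of the app): validates the shape of
// .mcp-config.json entries. The command/args shape mirrors the example
// above; names here are assumptions for demonstration only.
type McpStdioEntry = {
  command: string;
  args?: string[];
};

function parseMcpConfig(raw: string): Record<string, McpStdioEntry> {
  const data = JSON.parse(raw) as Record<string, unknown>;
  const result: Record<string, McpStdioEntry> = {};
  for (const [name, entry] of Object.entries(data)) {
    if (typeof entry !== "object" || entry === null) {
      throw new Error(`Server "${name}" must be an object`);
    }
    const { command, args } = entry as { command?: unknown; args?: unknown };
    if (typeof command !== "string" || command.length === 0) {
      throw new Error(`Server "${name}" needs a non-empty "command" string`);
    }
    if (args !== undefined && !Array.isArray(args)) {
      throw new Error(`Server "${name}": "args" must be an array`);
    }
    result[name] = { command, args: args as string[] | undefined };
  }
  return result;
}

// The playwright entry from the example config parses cleanly:
const config = parseMcpConfig(
  '{"playwright":{"command":"npx","args":["@playwright/mcp@latest"]}}',
);
console.log(config.playwright.command); // "npx"
```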
Tips & Guides
Here are some practical tips and guides for using MCP Client Chatbot:
Docker Hosting Guide:
Learn how to set up hosting with Docker.
Vercel Hosting Guide:
Learn how to set up hosting on Vercel.
OAuth Setup Guide (Google & GitHub):
Learn how to configure Google and GitHub OAuth for login functionality.
Project Feature with MCP Server:
Learn how to integrate system instructions and structures with MCP servers to build an agent that assists with GitHub-based project management.
Roadmap: Next Features
MCP Client Chatbot is evolving with these upcoming features:
Deployment & Hosting 
- Self Hosting:
  - Easy deployment with Docker Compose
  - Vercel deployment support (MCP Server: SSE only)
File & Image
- File Attach & Image Generation:
- File upload and image generation
- Multimodal conversation support
MCP Workflow
- MCP Flow:
- Workflow automation with MCP Server integration
Built-in Tools & UX
- Default Tools for Chatbot:
- Collaborative document editing (like OpenAI Canvas: user & assistant co-editing)
- RAG (Retrieval-Augmented Generation)
- Useful built-in tools for chatbot UX (usable without MCP)
LLM Code Write (with Daytona)
- Seamless LLM-powered code writing, editing, and execution in a cloud development environment via Daytona integration. Instantly generate, modify, and run code with AI assistance — no local setup required.
If you have suggestions or need specific features, please create an issue!
Contributing
We welcome all contributions! Bug reports, feature ideas, code improvements — everything helps us build the best local AI assistant.
Let’s build it together
Join Our Discord
Connect with the community, ask questions, and get support on our official Discord server!
Project Details
- AI24K/slyyyde-a1
- MIT License
- Last Updated: 6/13/2025