# MCP Server for Agent8
A server implementing the Model Context Protocol (MCP) to support Agent8 SDK development. Developed with TypeScript and pnpm, it supports both stdio and SSE transports.
## Features

This Agent8 MCP Server implements the following MCP specification capabilities:

### Prompts

- **System Prompt for Agent8 SDK**: Provides optimized guidelines for Agent8 SDK development through the `system-prompt-for-agent8-sdk` prompt template.

### Tools

- **Code Examples Search**: Retrieves relevant Agent8 game development code examples from a vector database using the `search_code_examples` tool.
- **Game Resource Search**: Searches for game development assets (sprites, animations, sounds, etc.) using semantic similarity matching via the `search_game_resources` tool.
- **Asset Generation**: Generates game assets, including static images and cinematics, using the `static_asset_generate` and `cinematic_asset_generate` tools.
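These tools can be called from any MCP client. As a rough illustration, the sketch below uses the official MCP TypeScript SDK to launch the server over stdio and invoke `search_code_examples`; the `query` argument name is an assumption, so check the input schema returned by `tools/list` for the exact parameters.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio. The required environment variables
// (SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY, OPENAI_API_KEY) must be set.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["--yes", "agent8-mcp-server"],
});

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Discover the available tools and their input schemas.
console.log(await client.listTools());

// Call the code example search tool. The "query" argument name is an
// assumption; consult the tool's listed input schema for the real parameters.
const result = await client.callTool({
  name: "search_code_examples",
  arguments: { query: "2D platformer character movement" },
});
console.log(result);

await client.close();
```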
## Installation

```bash
# Install dependencies
pnpm install

# Build
pnpm build
```
### Using Docker

You can run this application using Docker in several ways:

#### Option 1: Pull from GitHub Container Registry (Recommended)

```bash
# Pull the latest image
docker pull ghcr.io/planetarium/mcp-agent8:latest

# Run the container
docker run -p 3333:3333 --env-file .env ghcr.io/planetarium/mcp-agent8:latest
```

#### Option 2: Build Locally

```bash
# Build the Docker image
docker build -t agent8-mcp-server .

# Run the container with environment variables
docker run -p 3333:3333 --env-file .env agent8-mcp-server
```
### Docker Environment Configuration

There are three ways to configure environment variables when running with Docker:

1. Using `--env-file` (Recommended):

   ```bash
   # Create and configure your .env file first
   cp .env.example .env
   nano .env

   # Run with .env file
   docker run -p 3000:3000 --env-file .env agent8-mcp-server
   ```

2. Using individual `-e` flags:

   ```bash
   docker run -p 3000:3000 \
     -e SUPABASE_URL=your_supabase_url \
     -e SUPABASE_SERVICE_ROLE_KEY=your_service_role_key \
     -e OPENAI_API_KEY=your_openai_api_key \
     -e MCP_TRANSPORT=sse \
     -e PORT=3000 \
     -e LOG_LEVEL=info \
     agent8-mcp-server
   ```

3. Using Docker Compose (for development/production setup):

   The project includes a pre-configured `docker-compose.yml` file with:

   - Automatic port mapping from .env configuration
   - Environment variables loading
   - Volume mounting for data persistence
   - Container auto-restart policy
   - Health check configuration

   (A sketch of such a file follows this list.)

   To run the server:

   ```bash
   docker compose up
   ```

   To run in detached mode:

   ```bash
   docker compose up -d
   ```
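For orientation, here is a minimal sketch of what a compose file with those pieces can look like. It is not the repository's actual `docker-compose.yml`; the image name, volume path, and health-check command are assumptions, so refer to the file shipped with the project.

```yaml
services:
  mcp-agent8:
    # Assumed image name; the repository's compose file may build locally instead.
    image: ghcr.io/planetarium/mcp-agent8:latest
    env_file: .env
    ports:
      - "${PORT:-3000}:${PORT:-3000}"   # port mapping driven by .env
    volumes:
      - ./data:/app/data                # data persistence (path is an assumption)
    restart: unless-stopped             # auto-restart policy
    healthcheck:
      # Placeholder check; the actual health-check command may differ.
      test: ["CMD-SHELL", "wget -qO- http://localhost:${PORT:-3000}/ || exit 1"]
      interval: 30s
      timeout: 5s
      retries: 3
```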
Required Environment Variables:

- `SUPABASE_URL`: Supabase URL for database connection
- `SUPABASE_SERVICE_ROLE_KEY`: Supabase service role key for authentication
- `OPENAI_API_KEY`: OpenAI API key for AI functionality
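For reference, a minimal `.env` covering these required variables looks like the following; the values are placeholders:

```bash
SUPABASE_URL=your_supabase_url
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
OPENAI_API_KEY=your_openai_api_key
```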
The Dockerfile uses a multi-stage build process to create a minimal production image:
- Uses Node.js 20 Alpine as the base image for a smaller image size
- Separates build and runtime dependencies
- Only includes necessary files in the final image
- Exposes port 3000 by default
## Usage

### Command Line Options

```bash
# View help
pnpm start --help

# View version information
pnpm start --version
```

Supported options:

- `--debug`: Enable debug mode
- `--transport <type>`: Transport type (stdio or sse), default: stdio
- `--port <number>`: Port to use for SSE transport, default: 3000
- `--log-destination <dest>`: Log destination (stdout, stderr, file, none)
- `--log-file <path>`: Path to log file (when log-destination is file)
- `--log-level <level>`: Log level (debug, info, warn, error), default: info
- `--env-file <path>`: Path to .env file
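These options can be combined. For example, to serve over SSE on port 3333 with debug-level logs written to stdout:

```bash
pnpm start --transport=sse --port=3333 --log-level=debug --log-destination=stdout
```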
### Using Environment Variables

The server supports configuration via environment variables, which can be set directly or via a `.env` file.

1. Create a `.env` file in the project root (see `.env.example` for reference):

   ```bash
   # Copy the example file
   cp .env.example .env

   # Edit the .env file with your settings
   nano .env
   ```

2. Run the server (it will automatically load the `.env` file):

   ```bash
   pnpm start
   ```

3. Or specify a custom path to the `.env` file:

   ```bash
   pnpm start --env-file=/path/to/custom/.env
   ```
### Configuration Priority

The server uses the following priority order when determining configuration values:

1. Command line arguments (highest priority)
2. Environment variables (from the `.env` file or system environment)
3. Default values (lowest priority)

This allows you to set baseline configuration in your `.env` file while overriding specific settings via command line arguments when needed.
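For example, if your `.env` file sets `PORT=3000`, passing `--port=4000` on the command line takes precedence for that run:

```bash
# .env contains PORT=3000; the command line flag wins
pnpm start --transport=sse --port=4000
```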
### Supported Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `MCP_TRANSPORT` | Transport type (stdio or sse) | stdio |
| `PORT` | Port to use for SSE transport | 3000 |
| `LOG_LEVEL` | Log level (debug, info, warn, error) | info |
| `LOG_DESTINATION` | Log destination (stdout, stderr, file, none) | stderr (for stdio transport), stdout (for sse transport) |
| `LOG_FILE` | Path to log file (when LOG_DESTINATION is file) | (none) |
| `DEBUG` | Enable debug mode (true/false) | false |
| `AUTH_API_ENDPOINT` | Authentication API endpoint URL | (none) |
| `REQUIRE_AUTH` | Require authentication for API endpoints | false |
| `SUPABASE_URL` | Supabase URL for database connection | (required) |
| `SUPABASE_SERVICE_ROLE_KEY` | Supabase service role key for authentication | (required) |
| `OPENAI_API_KEY` | OpenAI API key for AI functionality | (required) |
| `ENABLE_ALL_TOOLS` | Enable or disable all tools globally | true |
| `ENABLE_VECTOR_SEARCH_TOOLS` | Enable or disable all vector search tools | true |
| `ENABLE_ASSET_GENERATE_TOOLS` | Enable or disable all asset generation tools | true |
| `ENABLE_CODE_EXAMPLE_SEARCH_TOOL` | Enable or disable the code example search tool | true |
| `ENABLE_GAME_RESOURCE_SEARCH_TOOL` | Enable or disable the game resource search tool | true |
**Tool Activation Priority**: The tool activation settings follow this priority order:

1. Individual tool settings (e.g., `ENABLE_CODE_EXAMPLE_SEARCH_TOOL`)
2. Tool group settings (e.g., `ENABLE_VECTOR_SEARCH_TOOLS`, `ENABLE_ASSET_GENERATE_TOOLS`)
3. Global tool setting (`ENABLE_ALL_TOOLS`)

For example, if you set `ENABLE_ALL_TOOLS=false` but `ENABLE_VECTOR_SEARCH_TOOLS=true`, only vector search tools will be enabled while other tools remain disabled. Similarly, individual tool settings override their respective group settings.

Examples:

```bash
# Enable only vector search tools
ENABLE_ALL_TOOLS=false
ENABLE_VECTOR_SEARCH_TOOLS=true

# Enable only asset generation tools
ENABLE_ALL_TOOLS=false
ENABLE_ASSET_GENERATE_TOOLS=true

# Disable a specific tool while keeping others enabled
ENABLE_ALL_TOOLS=true
ENABLE_CODE_EXAMPLE_SEARCH_TOOL=false
```
### Using Stdio Transport

```bash
# Build and run
pnpm build
pnpm start --transport=stdio
```

### Using SSE Transport

```bash
# Build and run (default port: 3000)
pnpm build
pnpm start --transport=sse --port=3000
```
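With the SSE transport, MCP clients connect over HTTP rather than spawning a process. Below is a minimal connection sketch using the MCP TypeScript SDK; the `/sse` endpoint path follows the SDK's common convention and is an assumption here, so verify the path this server actually exposes.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// The /sse path is an assumption; adjust it to the endpoint the server exposes.
const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));

const client = new Client({ name: "sse-example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// List the tools exposed over the SSE connection.
console.log(await client.listTools());

await client.close();
```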
### Debug Mode

```bash
# Run in debug mode
pnpm start --debug
```
## Available Prompts

- `systemprompt-agent8-sdk`
## Client Integration

### Using with Claude Desktop

1. Add the following to the Claude Desktop configuration file (`claude_desktop_config.json`):

   ```json
   {
     "mcpServers": {
       "Agent8": {
         "command": "npx",
         "args": ["--yes", "agent8-mcp-server"]
       }
     }
   }
   ```

2. Restart Claude Desktop
## Adding New Prompts

Add new prompts to the `registerSamplePrompts` method in the `src/prompts/provider.ts` file.
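The provider's exact API is not documented here, so the following is only a hypothetical sketch of what registering an additional prompt might look like if the provider wraps the MCP TypeScript SDK's prompt registration; the prompt name, argument, and helper function are made up for illustration.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Hypothetical helper: how a prompt is typically registered with the MCP
// TypeScript SDK. The actual provider in src/prompts/provider.ts may differ.
export function registerExamplePrompt(server: McpServer): void {
  server.prompt(
    "my-custom-prompt",                      // hypothetical prompt name
    "Guidance prompt for a specific topic",  // description
    { topic: z.string().describe("Topic to generate guidance about") },
    ({ topic }) => ({
      messages: [
        {
          role: "user",
          content: {
            type: "text",
            text: `Provide Agent8 SDK guidance about: ${topic}`,
          },
        },
      ],
    })
  );
}
```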
## License
MIT