MCP Command Server
A secure Model Context Protocol (MCP) server for executing system commands through LLM applications like Claude.
Quick Start
- Install the package:
uv pip install mcp-command-server
- Configure allowed commands:
export ALLOWED_COMMANDS="ls,pwd,echo"
- Add to Claude Desktop configuration:
{
  "mcpServers": {
    "command-server": {
      "command": "uv",
      "args": ["run", "python", "-m", "mcp_command_server"],
      "env": {
        "ALLOWED_COMMANDS": "ls,pwd,echo"
      }
    }
  }
}
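The `ALLOWED_COMMANDS` variable is a comma-separated whitelist. A minimal sketch of how a server might parse it (the helper name is hypothetical, not part of this package's documented API):

```python
import os

def load_allowed_commands(env_var: str = "ALLOWED_COMMANDS") -> frozenset[str]:
    """Parse a comma-separated command whitelist from the environment.

    Hypothetical helper for illustration; the package may parse differently.
    """
    raw = os.environ.get(env_var, "")
    # Strip whitespace and drop empty entries so "ls, pwd,echo," is tolerated.
    return frozenset(cmd.strip() for cmd in raw.split(",") if cmd.strip())
```

With `ALLOWED_COMMANDS="ls,pwd,echo"` set, `load_allowed_commands()` returns `frozenset({"ls", "pwd", "echo"})`.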
Features
- 🔒 Secure command execution restricted to a configurable whitelist
- ✅ User confirmation for all commands
- 📝 Comprehensive audit logging
- 🔍 Input validation and sanitization
- 🤖 Claude Desktop integration
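Validation in this spirit can be sketched with the standard library: reject shell metacharacters outright, split the raw command line with `shlex`, and check the program name against the whitelist. The function and policy below are illustrative assumptions, not this project's actual implementation:

```python
import shlex

# Shell metacharacters refused outright (illustrative policy).
FORBIDDEN = set(";|&$`<>")

def validate_command(raw: str, allowed: set[str]) -> list[str]:
    """Return the argv list if `raw` passes the whitelist, else raise ValueError.

    Sketch only; the real validator may be stricter.
    """
    if any(ch in FORBIDDEN for ch in raw):
        raise ValueError("shell metacharacters are not allowed")
    argv = shlex.split(raw)
    if not argv:
        raise ValueError("empty command")
    if argv[0] not in allowed:
        raise ValueError(f"command not whitelisted: {argv[0]}")
    return argv
```

Checking metacharacters before splitting means chained commands like `ls; rm -rf /` are rejected even though `ls` itself is whitelisted.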
Documentation
For complete documentation, see the docs/ directory:
- Installation Guide
- Security Guidelines
- API Reference
- Usage Examples
- Troubleshooting
Development
Setup
# Clone repository
git clone https://github.com/Andrew-Beniash/mcp-command-server.git
cd mcp-command-server
# Create virtual environment
uv venv
source .venv/bin/activate # On Unix/macOS
.venv\Scripts\activate  # On Windows
# Install development dependencies
uv pip install -e ".[dev]"
Testing
# Run all tests
pytest
# Run specific test file
pytest tests/unit/security/test_validator.py
# Run with coverage
pytest --cov=mcp_command_server
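pytest discovers plain `test_*` functions, so a validator test is just a few bare asserts. The sketch below inlines a toy `is_allowed` stand-in to stay self-contained; in the real suite you would import the project's validator instead (the exact module path is an assumption):

```python
# tests/unit/security/test_validator.py -- self-contained sketch.
# In the real suite, import the project's validator instead of this stand-in.

def is_allowed(command: str, whitelist: set[str]) -> bool:
    """Toy stand-in: a command passes if its program name is whitelisted."""
    parts = command.split()
    return bool(parts) and parts[0] in whitelist

def test_whitelisted_command_passes():
    assert is_allowed("ls -la", {"ls", "pwd"})

def test_unlisted_command_is_rejected():
    assert not is_allowed("rm -rf /", {"ls", "pwd"})

def test_empty_command_is_rejected():
    assert not is_allowed("", {"ls", "pwd"})
```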
Contributing
- Fork the repository
- Create your feature branch
- Run tests and linting
- Submit a pull request
License
MIT License - see LICENSE for details.
Project Details
- Andrew-Beniash/mcp-command-server
- Last Updated: 1/30/2025