Frequently Asked Questions (FAQ) - MCP Server
Q: What is an MCP Server?
A: An MCP (Model Context Protocol) Server acts as a bridge, allowing AI models to access and interact with external data sources and tools. It standardizes how applications provide context to Large Language Models (LLMs), enabling more informed decision-making and automation.
Q: What are the key components of the MCP Server?
A: The MCP Server includes n8n (automation platform), Ollama (local LLM platform), Qdrant (vector store), Prometheus (monitoring), Grafana (visualization), Whisper (speech-to-text), Caddy (HTTPS/TLS), Supabase (database & auth), Flowise (AI agent builder), Open WebUI (ChatGPT-like interface), and SearXNG (privacy-focused search engine).
Q: What are the minimum system requirements for running the MCP Server?
A: You need an Ubuntu VPS (tested on 22.04 LTS), a domain name with DNS access, a minimum of 16GB RAM, 100GB+ storage, and Docker/Docker Compose installed.
Q: How do I install the MCP Server?
A: Connect to your VPS via SSH, install the required packages, configure the firewall, clone the repository, run the configuration script, and then use the start_services.py script to start the AI stack.
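The steps above can be sketched as a shell session. The repository path comes from the Project Details section below; the package names, firewall rules, and the exact order of the configuration step are assumptions, so check the repository README before running anything.

```shell
# Illustrative install flow -- adapt hostnames, packages, and flags to your setup.
ssh user@your-vps.example                                         # 1. connect to the VPS
sudo apt update && sudo apt install -y git docker.io docker-compose-plugin  # 2. required packages (assumed set)
sudo ufw allow OpenSSH && sudo ufw allow 80,443/tcp && sudo ufw enable      # 3. firewall for SSH + HTTPS
git clone https://github.com/ThijsdeZeeuw/avg-kwintes.git && cd avg-kwintes # 4. clone the repository
python3 start_services.py                                         # 5. start the AI stack (after the config script)
```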
Q: How do I access the different services after installation?
A: You can access the services via the provided URLs, such as https://n8n.kwintes.cloud for n8n, https://openwebui.kwintes.cloud for the Web UI, and so on.
Q: How do I update the MCP Server to the latest version?
A: Use the provided update_stack.sh script to pull the latest Docker images, apply configuration fixes, and restart all services.
Q: How do I back up the MCP Server data?
A: Use the backup_stack.sh script to back up all Docker volumes, configuration files, and secrets to a timestamped archive.
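The core of a timestamped backup can be sketched with plain tar. This is a minimal illustration, not the actual contents of backup_stack.sh, which also covers Docker volumes and secrets; the ./data directory here is a stand-in.

```shell
# Minimal sketch of a timestamped backup of a data directory.
mkdir -p data && echo "example" > data/example.txt   # stand-in for real stack data
STAMP=$(date +%Y%m%d_%H%M%S)                         # timestamp for the archive name
tar -czf "backup_${STAMP}.tar.gz" data               # compress the directory
tar -tzf "backup_${STAMP}.tar.gz"                    # verify the archive by listing it
```

Restoring is the reverse: `tar -xzf backup_<stamp>.tar.gz` in the target directory.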
Q: What if I encounter Docker Compose issues during installation?
A: The script automatically detects and uses the correct Docker Compose command format for your system. If you're manually running commands, use docker compose or docker-compose depending on your setup. If issues persist, install the standalone Docker Compose binary.
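The detection the script performs likely resembles the following sketch: prefer the Compose v2 plugin (`docker compose`), fall back to the standalone v1 binary (`docker-compose`), and flag the case where neither is present.

```shell
# Pick whichever Docker Compose flavor this system has (v2 plugin vs. v1 binary).
if docker compose version >/dev/null 2>&1; then
  COMPOSE="docker compose"
elif command -v docker-compose >/dev/null 2>&1; then
  COMPOSE="docker-compose"
else
  COMPOSE=""   # neither found: install the standalone binary
fi
echo "using: ${COMPOSE:-none}"
```

You can then run stack commands as `$COMPOSE up -d` regardless of which variant is installed.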
Q: How does the MCP Server ensure data privacy and security?
A: All AI models run locally on your VPS, preventing data from being sent to external services. Automatic HTTPS/TLS encryption, firewall rules, and secure secret management provide additional layers of security.
Q: How can I integrate the MCP Server with the UBOS Platform?
A: Use the UBOS Platform to orchestrate AI Agents running on your MCP Server, connect them to enterprise data, build custom AI Agents, and create sophisticated Multi-Agent Systems.
Q: What models are installed automatically?
A: Several models are installed automatically, including the language models gemma3:12b, granite3-guardian:8b, and llama3.2-vision, and the embedding models granite-embedding:278m and nomic-embed-text:latest. These models are downloaded during the initial setup.
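Pulling the same set manually can be sketched as a loop over `ollama pull`. The model names come from the answer above; the loop itself is an illustration, not the setup script's actual code, and it only echoes the commands when Ollama is not on the PATH.

```shell
# Model names from the FAQ above; the pull loop is a sketch, not setup-script code.
MODELS="gemma3:12b granite3-guardian:8b llama3.2-vision granite-embedding:278m nomic-embed-text:latest"
for m in $MODELS; do
  if command -v ollama >/dev/null 2>&1; then
    ollama pull "$m"        # real download when Ollama is installed
  else
    echo "would pull: $m"   # dry run when it is not
  fi
done
```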
Q: How do I monitor the health and performance of my MCP Server?
A: Access Grafana to view dashboards, or query Prometheus directly to check scrape targets and define alerts. These tools provide detailed insight into service health and performance metrics.
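As an illustration, a Prometheus alert rule for a down service could look like the fragment below. The group name, labels, and threshold are hypothetical; `up` is the standard metric Prometheus exposes per scrape target.

```yaml
groups:
  - name: mcp-stack            # hypothetical group name
    rules:
      - alert: ServiceDown
        expr: up == 0          # fires when a scrape target stops responding
        for: 5m                # require 5 minutes of downtime before alerting
        labels:
          severity: critical
        annotations:
          summary: "{{ $labels.job }} has been down for 5 minutes"
```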
Local AI Stack for VPS Deployment
Project Details
- ThijsdeZeeuw/avg-kwintes
- Apache License 2.0
- Last Updated: 4/1/2025