What is the Ollama MCP Server?
The Ollama MCP Server bridges Ollama's locally hosted LLM models and MCP-compatible applications, making local models accessible through the Model Context Protocol (MCP).
How do I install the Ollama MCP Server?
You can install the Ollama MCP Server globally via npm with `npm install -g @rawveg/ollama-mcp`. Ensure you have Node.js (v16 or higher) and npm installed.
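A typical setup sequence might look like the following; note that the `ollama-mcp` binary name is an assumption inferred from the package name, so check the package's README for the actual command:

```shell
# Verify prerequisites: Node.js v16+ and npm must be available
node --version
npm --version

# Install the server globally from the npm registry
npm install -g @rawveg/ollama-mcp

# Start the server (binary name assumed from the package name)
ollama-mcp
```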
What are the prerequisites for running the Ollama MCP Server?
The prerequisites include having Node.js (v16 or higher), npm, and Ollama installed and running locally.
Can the Ollama MCP Server be integrated with other applications?
Yes, it can be integrated with other MCP-compatible applications by configuring the application’s MCP settings file.
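As a sketch, many MCP-compatible clients register servers in a JSON settings file along these lines; the exact file location and schema vary by application, and the `ollama-mcp` command name and `ollama` key are illustrative assumptions:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "ollama-mcp",
      "env": {
        "PORT": "3456"
      }
    }
  }
}
```

Consult your client application's documentation for where this file lives and which fields it supports.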
What is the default port for the Ollama MCP Server?
The server starts on port 3456 by default, but you can specify a different port via the `PORT` environment variable.
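The fallback behaviour can be sketched with shell parameter expansion: the server presumably reads `PORT` internally, but the snippet below illustrates the same default-or-override logic using the documented default of 3456:

```shell
# Fall back to the default port 3456 when PORT is unset or empty
PORT="${PORT:-3456}"
echo "Ollama MCP Server would listen on port ${PORT}"
```

Running it with `PORT=4000` prepended would report port 4000 instead.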
What license is the Ollama MCP Server under?
The Ollama MCP Server is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0).
Ollama MCP Server
Project Details
- rawveg/ollama-mcp
- @rawveg/ollama-mcp
- GNU Affero General Public License v3.0
- Last Updated: 4/21/2025
Recommended MCP Servers
The Okta MCP Server is a groundbreaking tool built by the team at Fctr that enables AI models...
An integration that allows LLMs to interact with Raindrop.io bookmarks using the Model Context Protocol (MCP).
A YARA-based MCP Server
MCP server for gitingest
An MCP server that bridges Dune Analytics data to AI agents.
MCP Server for Apache Airflow
A Model Context Protocol Server for Home Assistant
Integration of Needle in modelcontextprotocol
Collection of Canvas LMS and Gradescope tools for the ultimate EdTech model context protocol. Allows you to query...