What is Context Portal (ConPort)?
Context Portal (ConPort) is a memory bank for your software project: a tool that helps AI assistants understand your specific project by storing important information, such as decisions, tasks, and architectural patterns, in a structured, queryable way.
How does ConPort improve AI assistant performance?
ConPort improves AI assistant performance by providing a structured, easily queried knowledge base of project-specific information. This allows the assistant to ground its responses in your project's actual context and deliver more accurate, relevant answers.
What are the key features of ConPort?
Key features include:
- Structured context storage
- MCP server implementation
- Multi-workspace support
- STDIO deployment mode
- Dynamic project knowledge graph
- Vector data storage and semantic search
- RAG backend support
- Prompt caching guidance
- Database schema evolution
What is Model Context Protocol (MCP)?
MCP is an open protocol that standardizes how applications provide context to LLMs.
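MCP messages are exchanged as JSON-RPC 2.0. As a rough illustration (field names follow the MCP specification; the version string and client info here are placeholders), the handshake a client opens with looks like this:

```python
import json

# Minimal MCP "initialize" request, framed as JSON-RPC 2.0.
# The protocolVersion and clientInfo values are illustrative.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

print(json.dumps(initialize_request, indent=2))
```

After the server replies with its own capabilities, the client can list and call the tools the server exposes, which is how an AI assistant reaches ConPort's context store.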
What is Retrieval Augmented Generation (RAG) and how does ConPort support it?
RAG is a framework that improves the quality of AI-generated text by retrieving relevant information from an external knowledge source and incorporating it into the generation process. ConPort acts as a backend for RAG by storing project-specific knowledge and enabling AI to retrieve this information for contextualized responses.
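The retrieval-then-generation flow can be sketched in a few lines. This toy example uses a hard-coded list and naive keyword overlap where ConPort would provide a real knowledge base with semantic (vector) search; all names here are illustrative, not ConPort's API:

```python
# Toy RAG sketch: a small in-memory list stands in for ConPort's
# knowledge base, and keyword overlap stands in for semantic search.
from typing import List

KNOWLEDGE = [
    "Decision: use PostgreSQL for persistence.",
    "Pattern: services communicate via a message queue.",
    "Task: migrate the auth module to OAuth 2.0.",
]

def retrieve(query: str, k: int = 2) -> List[str]:
    # Rank entries by how many query words they share, keep the top k.
    words = set(query.lower().split())
    scored = sorted(KNOWLEDGE, key=lambda doc: -len(words & set(doc.lower().split())))
    return scored[:k]

def build_prompt(question: str) -> str:
    # Prepend the retrieved project knowledge so the LLM answers in context.
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}"

print(build_prompt("Which database did we decide to use?"))
```

The assembled prompt carries the relevant project facts alongside the question, which is exactly the role ConPort plays for an AI assistant, only with structured storage and real semantic retrieval.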
How do I install and configure ConPort?
The recommended way to install and run ConPort is by using uvx to execute the package directly from PyPI. Alternatively, you can install it from the Git repository.
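As a sketch, a uvx invocation might look like the following. The PyPI package name `context-portal-mcp` and the `conport-mcp` entry point are assumed from the project's distribution; consult the repository README for the exact command and flags:

```shell
# Run ConPort directly from PyPI via uvx (names assumed; see the README).
uvx --from context-portal-mcp conport-mcp --mode stdio --workspace_id "/absolute/path/to/your/project"
```

MCP clients typically wrap an equivalent command in their server configuration, substituting their own workspace variable for the absolute path.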
What is the purpose of the --workspace_id command-line argument?
The --workspace_id argument provides the server process with the absolute path to the project workspace. It is used to locate the project's per-workspace database, to perform a safety check that prevents the database from being created inside ConPort's installation directory, and to identify to the server which project the client is launching.
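The kind of safety check described above can be sketched as follows. This is illustrative logic, not ConPort's actual implementation, and the database file name is an assumption:

```python
# Sketch of a --workspace_id safety check: map the workspace to a
# per-project database path, refusing to place the database inside
# the server's own installation directory. (Illustrative only.)
from pathlib import Path

def resolve_db_path(workspace_id: str, install_dir: str) -> Path:
    workspace = Path(workspace_id)
    if not workspace.is_absolute():
        raise ValueError("--workspace_id must be an absolute path")
    install = Path(install_dir).resolve()
    resolved = workspace.resolve()
    if resolved == install or install in resolved.parents:
        raise ValueError("refusing to create a database in the installation directory")
    # Hypothetical per-workspace database location.
    return resolved / "context_portal" / "context.db"
```

Passing a workspace inside the installation directory raises an error, while a normal project path yields a database location scoped to that project.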
How can I optimize LLM agents to work with ConPort?
Optimize LLM agents by providing custom instructions or system prompts that guide them on how to use ConPort’s tools for context management. This repository includes tailored strategy files for different environments.
How does prompt caching work with ConPort?
ConPort helps identify cacheable content and provides guidance on structuring prompts for different LLM providers, reducing token costs and latency by reusing the stable, frequently repeated parts of prompts.
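The core idea is to place the large, stable project context (the material ConPort identifies as cacheable) at the front of the prompt and the small, changing user query at the end. A minimal sketch, using Anthropic's `cache_control` convention as one concrete example (other providers use different mechanisms, and this payload shape is illustrative):

```python
# Sketch: stable cacheable context first, dynamic query last.
# The cache_control field follows Anthropic's prompt-caching convention;
# other LLM providers expose caching differently.
def build_messages(project_context: str, user_query: str) -> dict:
    return {
        "system": [
            {
                "type": "text",
                "text": project_context,                 # stable, reusable across turns
                "cache_control": {"type": "ephemeral"},  # mark the prefix for caching
            }
        ],
        "messages": [
            {"role": "user", "content": user_query}      # dynamic, changes every turn
        ],
    }

payload = build_messages("Project decisions:\n- Use PostgreSQL.", "What DB do we use?")
```

Because the cached prefix is byte-identical across requests, the provider can skip reprocessing it, so only the short query incurs full cost on each turn.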
Where can I find the strategy files for LLM agents?
You can find the strategy files in the conport-custom-instructions directory of this repository. There are specific strategies for Roo Code, CLine, Windsurf Cascade, and general use.
Context Portal
Project Details
- GreatScottyMac/context-portal
- Apache License 2.0
- Last Updated: 6/16/2025