What is Context Portal (ConPort)?

Context Portal (ConPort) is a memory bank for your software project: a tool that helps AI assistants understand your specific project by storing important information, such as decisions, tasks, and architectural patterns, in a structured way.
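As a toy illustration of the "structured memory bank" idea, project knowledge can be modeled as typed collections of entries. This is a minimal sketch with hypothetical names, not ConPort's actual schema or API:

```python
# Toy model of a project memory bank: typed, structured storage for
# decisions, tasks, and architectural patterns. Illustrative only;
# ConPort's real schema and MCP tool interface differ.
from dataclasses import dataclass, field

@dataclass
class MemoryBank:
    decisions: list[dict] = field(default_factory=list)
    tasks: list[dict] = field(default_factory=list)
    patterns: list[dict] = field(default_factory=list)

    def log_decision(self, summary: str, rationale: str) -> None:
        # Structured entry, so an assistant can query fields, not raw text.
        self.decisions.append({"summary": summary, "rationale": rationale})

    def recent_decisions(self, limit: int = 5) -> list[dict]:
        return self.decisions[-limit:]

bank = MemoryBank()
bank.log_decision("Use SQLite per workspace", "simple, file-based, no server")
print(bank.recent_decisions()[0]["summary"])
```

Because each entry is structured rather than free text, an assistant can ask targeted questions ("what were the last five decisions?") instead of re-reading the whole history.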

How does ConPort improve AI assistant performance?

ConPort improves AI assistant performance by providing a structured, easily accessible knowledge base of project-specific information, which allows the assistant to deliver more accurate and helpful responses.

What are the key features of ConPort?

Key features include structured context storage, MCP server implementation, multi-workspace support, STDIO deployment mode, dynamic project knowledge graph, vector data storage, semantic search, RAG backend, prompt caching, and database schema evolution.

What is Model Context Protocol (MCP)?

MCP is an open protocol that standardizes how applications provide context to LLMs.

What is Retrieval Augmented Generation (RAG) and how does ConPort support it?

RAG is a framework that improves the quality of AI-generated text by retrieving relevant information from an external knowledge source and incorporating it into the generation process. ConPort acts as a backend for RAG by storing project-specific knowledge and enabling AI to retrieve this information for contextualized responses.
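The retrieve-then-generate loop can be sketched in a few lines. This is a deliberately naive keyword-overlap retriever with hypothetical names; ConPort's actual retrieval goes through its MCP tools and semantic/vector search:

```python
# Minimal RAG sketch: retrieve the most relevant project notes by naive
# keyword overlap, then splice them into the prompt sent to the LLM.
# Hypothetical code; real ConPort retrieval uses semantic search.
def retrieve(query: str, notes: list[str], k: int = 2) -> list[str]:
    q = set(query.lower().split())
    # Rank notes by how many query words they share (crude relevance score).
    scored = sorted(notes, key=lambda n: len(q & set(n.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query: str, notes: list[str]) -> str:
    context = "\n".join(retrieve(query, notes))
    return f"Project context:\n{context}\n\nQuestion: {query}"

notes = [
    "Decision: auth uses JWT tokens with 15 minute expiry",
    "Pattern: all services expose /healthz endpoints",
    "Task: migrate database to Postgres 16",
]
print(build_prompt("How does auth expiry work?", notes))
```

The key point is the division of labor: the store (ConPort) supplies relevant facts at request time, and the model generates a response grounded in them.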

How do I install and configure ConPort?

The recommended way to install and run ConPort is by using uvx to execute the package directly from PyPI. Alternatively, you can install it from the Git repository.

What is the purpose of the --workspace_id command-line argument?

The --workspace_id argument supplies the server process with the absolute path to the project workspace. It also enables a safety check that prevents the database from being created inside the installation directory, and it identifies which project the launched server instance serves.
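One way such a safety check might look, as an illustrative sketch (function name, database layout, and paths are assumptions, not ConPort's actual code):

```python
# Illustrative sketch of a --workspace_id safety check: require an
# absolute path and refuse to place the project database inside the
# tool's own installation directory. Not ConPort's actual implementation.
from pathlib import Path

def resolve_db_path(workspace_id: str, install_dir: Path) -> Path:
    workspace = Path(workspace_id)
    if not workspace.is_absolute():
        raise ValueError("--workspace_id must be an absolute path")
    workspace = workspace.resolve()
    install = install_dir.resolve()
    # Reject a workspace that is the install dir or nested inside it,
    # since the database would then land in the installation directory.
    if workspace == install or install in workspace.parents:
        raise ValueError("refusing to create the database inside the installation directory")
    # Assumed per-workspace layout for the database file.
    return workspace / "context_portal" / "context.db"

print(resolve_db_path("/home/user/myproject", Path("/opt/conport")))
```

Keeping the database inside the workspace (rather than the install location) is what makes multi-workspace support possible: each project gets its own isolated store.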

How can I optimize LLM agents to work with ConPort?

Optimize LLM agents by providing custom instructions or system prompts that guide them on how to use ConPort’s tools for context management. This repository includes tailored strategy files for different environments.

How does prompt caching work with ConPort?

ConPort helps identify cacheable content and offers guidance on structuring prompts for different LLM providers, reducing token costs and latency by reusing the stable, frequently repeated parts of prompts.
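The structuring idea can be sketched as splitting each prompt into a stable, cacheable prefix (the large project context) and a small per-request suffix. This is a hypothetical illustration; the actual cache-control syntax varies by LLM provider:

```python
# Sketch of prompt structuring for provider-side prompt caching: keep
# the large, stable project context as a byte-identical prefix across
# requests so the provider can reuse its processed (cached) form.
# Hypothetical structure; provider cache-control mechanisms differ.
import hashlib

def build_prompt(project_context: str, question: str) -> tuple[str, str]:
    prefix = f"Project context:\n{project_context}\n"  # stable, cacheable
    suffix = f"Question: {question}\n"                 # changes per request
    return prefix, suffix

ctx = "Decisions: use SQLite. Patterns: REST only. Glossary: domain terms."
p1, _ = build_prompt(ctx, "How is auth handled?")
p2, _ = build_prompt(ctx, "Which database do we use?")
# Identical prefixes hash identically, so a provider cache can reuse them.
assert hashlib.sha256(p1.encode()).hexdigest() == hashlib.sha256(p2.encode()).hexdigest()
print("prefix reused across requests")
```

The design point is ordering: anything that changes per request must come after the stable block, because a cache hit typically requires an exact match on the prompt prefix.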

Where can I find the strategy files for LLM agents?

You can find the strategy files in the conport-custom-instructions directory of this repository. Specific strategies are provided for Roo Code, Cline, Windsurf Cascade, and general use.
