Picard MCP Server: Supercharging LLMs with Secure Memory Management
In the rapidly evolving landscape of AI and Large Language Models (LLMs), the ability to provide context and memory to these models is paramount. The Picard MCP Server, available on the UBOS Asset Marketplace, offers a complete memory management solution built on the Model Context Protocol (MCP) standard. This innovative system empowers LLMs with persistent, secure, and semantically searchable memory, enabling more sophisticated and context-aware AI applications.
What is the Model Context Protocol (MCP)?
Before diving into the details of the Picard MCP Server, it’s essential to understand the foundation upon which it’s built: the Model Context Protocol (MCP). MCP is an open standard that defines how applications can provide context to LLMs. It acts as a bridge, allowing AI models to access and interact with external data sources and tools. Think of it as a universal language that allows LLMs to converse with the world around them.
MCP compliance ensures that the Picard MCP Server adheres to this standard, exposing:
- Resources: Read-only endpoints providing data (memory content) to LLMs.
- Tools: Functional endpoints for performing actions (memory creation, updates, queries).
- Authentication: OAuth 2.0 implementation for secure access to protected resources.
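In code, the resource/tool split amounts to a registry of read-only readers and action handlers. The sketch below is an illustrative Python reduction of the pattern, not the actual MCP SDK API:

```python
# Illustrative sketch of the MCP resource/tool split (not the real SDK API).
RESOURCES = {}  # read-only endpoints that expose data to the LLM
TOOLS = {}      # functional endpoints that perform actions

def resource(uri_template):
    """Register a read-only data endpoint under a URI template."""
    def register(fn):
        RESOURCES[uri_template] = fn
        return fn
    return register

def tool(fn):
    """Register an action endpoint under the function's name."""
    TOOLS[fn.__name__] = fn
    return fn

@resource("memories://{memory_id}")
def get_memory(memory_id: str) -> str:
    return f"content of memory {memory_id}"  # a real server queries the database

@tool
def submit_memory(text: str, permission: str = "private") -> dict:
    return {"text": text, "permission": permission}
```

The real server layers OAuth 2.0 checks on top of both registries, so a token's scopes decide which resources and tools a client may reach.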
Picard MCP Server: A Deep Dive
The Picard MCP Server is a FastAPI-based implementation of the Model Context Protocol. It’s designed to provide secure memory storage and retrieval services for LLMs. The server is complemented by a Django client application, offering a practical demonstration of how to integrate with the MCP server. Together, these components enable users to:
- Store, retrieve, and manage their memories.
- Control access permissions (private/public).
- Perform semantic search and AI-powered queries based on stored memories.
Key Features and Components
MCP Server (FastMCP):
- OAuth 2.0 Authentication: Securely manages access with PKCE support.
- Memory Storage: Utilizes PostgreSQL with the pgvector extension for efficient vector storage.
- Permission-Based Access Control: Ensures data privacy with private and public memory options.
- Vector Embeddings: Leverages OpenAI’s `text-embedding-3-small` model for semantic search.
- LLM Integration: Framework-ready for memory-based queries, with advanced features planned.
Django Client: Provides a user-friendly web interface for interacting with the MCP server.
- User registration and authentication
- OAuth 2.0 client implementation
- Memory creation, retrieval, and management UI
- Persona-based querying interface
System Architecture
The Picard MCP system adopts a client-server architecture, comprising the following core components:
MCP Server (Backend):
- Built with FastAPI for high performance and asynchronous support.
- Employs PostgreSQL with pgvector for vector storage and semantic search.
- Manages data models for Users, Memories, OAuth Clients, and Tokens.
- Utilizes SQLAlchemy ORM with Alembic migrations for database management.
- Implements OAuth 2.0 for secure authentication and authorization.
- Integrates with the OpenAI API for memory embeddings (`text-embedding-3-small`).
- Uses LangChain for LLM operations (when available).
- Offers both stateful and stateless operation modes.
- Supports streamable HTTP transport for scalability.
Django Client (Frontend):
- Provides user registration, authentication, and profile management.
- Implements an OAuth 2.0 client for secure communication with the MCP server.
- Offers a user-friendly interface for memory management and querying.
- Uses its own PostgreSQL database, separate from the MCP server.
Docker Infrastructure: Facilitates easy setup and scaling through containerization.
- Separate containers for the MCP server (port 8001), Django client (port 8000), and PostgreSQL databases.
- Configured networking for secure inter-container communication.
- Volume mounting for persistent data storage.
- Compatible with local Docker deployment and Render cloud deployment.
Authentication Approaches
The system supports two primary authentication methods:
1. Direct Connect with User Context Token Flow (Recommended)
This streamlined approach requires users to authenticate only once, with the Django client, eliminating the need for separate MCP server authentication.
- Client Registration: The Django client registers with the MCP server using the `/api/admin/clients/register` endpoint.
- User Authentication Flow: The user authenticates solely with the Django client. When initiating a connection to the MCP server, the Django client makes a server-side request to the MCP’s `/api/user-tokens/user-token` endpoint.
- API Access: The client incorporates the access token in the Authorization header (`Authorization: Bearer {token}`) for all API requests.
- Security Features: Only confidential clients can use this method, ensuring server-to-server security.
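A server-side sketch of this flow, assuming the token endpoint accepts client credentials plus a user identifier in a JSON body (the field names are an assumption; the authoritative schema is in the server’s Swagger docs):

```python
import json
import urllib.request

MCP_BASE = "http://localhost:8001"  # MCP server port from the Docker setup

def fetch_user_token(client_id: str, client_secret: str, user_id: str) -> str:
    """Server-to-server request for a user context token.
    The request body shape is illustrative; see the server's /docs."""
    req = urllib.request.Request(
        f"{MCP_BASE}/api/user-tokens/user-token",
        data=json.dumps({
            "client_id": client_id,        # confidential client credentials
            "client_secret": client_secret,
            "user_id": user_id,            # the Django-authenticated user
        }).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def auth_header(token: str) -> dict:
    """Every subsequent API request carries the bearer token."""
    return {"Authorization": f"Bearer {token}"}
```

Because the token request runs server-side with the client secret, it never transits the user’s browser, which is why this flow is restricted to confidential clients.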
2. Standard OAuth 2.0 Authorization Code Flow with PKCE (Legacy)
The system also supports the standard OAuth 2.0 Authorization Code flow with PKCE for enhanced security, adhering to RFC 6749 and RFC 7636. This approach requires users to authenticate with both the client and the MCP server.
- Authorization Flow: The user initiates login through the Django client. The client generates a cryptographically secure random `state` parameter for CSRF protection.
- Token Exchange: The client verifies that the returned `state` parameter matches the one sent in the authorization request, then exchanges the authorization code for access and refresh tokens via the `/token` endpoint.
- API Access: Same as in the Direct Connect approach.
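The `state` and PKCE values the client must generate need only the standard library; per RFC 7636, the code challenge for the S256 method is the base64url-encoded SHA-256 hash of the verifier:

```python
import base64
import hashlib
import secrets

def b64url(data: bytes) -> str:
    """Base64url encoding without padding, as RFC 7636 requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# CSRF protection: sent with the authorization request, verified on return
state = secrets.token_urlsafe(32)

# PKCE pair (S256 method)
code_verifier = b64url(secrets.token_bytes(32))   # 43 chars of high entropy
code_challenge = b64url(hashlib.sha256(code_verifier.encode()).digest())
```

The client sends `code_challenge` in the authorization request and reveals `code_verifier` only at token exchange, so an intercepted authorization code is useless on its own.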
Database Models
The MCP Server employs SQLAlchemy ORM with these key models:
- User Model: Stores user information, including email, username, and hashed password. It is linked to memories through a one-to-many relationship.
- Memory Model with Vector Storage: Utilizes the pgvector extension to store and query vector embeddings (1536 dimensions). It supports text content with optional encryption and includes permission controls (private/public).
- OAuth Models: Includes `OAuthClient` (stores client application details) and `Token` (stores access and refresh tokens with expiration tracking).
Memory Management System
The core functionality of Picard MCP revolves around memory management with the following components:
- Memory Storage: Memories are stored as text with associated metadata, including vector embeddings (generated with the `text-embedding-3-small` model) to enable semantic search.
- Permission Management: Each memory has a permission level (private or public), controlling who can access it.
- Memory Retrieval: Users can retrieve their own memories with filtering and sorting options. Semantic search allows for finding memories based on meaning using vector embeddings.
- LLM Integration: Memories can be used as context for LLM queries, enabling users to create personas based on their public memories.
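Under the hood, semantic search ranks memories by cosine similarity between the query embedding and each stored embedding. pgvector performs this ranking in SQL over 1536-dimensional vectors, but the logic can be sketched in plain Python:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, memories, top_k=3):
    """memories: list of (text, embedding) pairs. In production the embeddings
    come from text-embedding-3-small and pgvector does the ranking in SQL."""
    ranked = sorted(memories,
                    key=lambda m: cosine_similarity(query_vec, m[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]
```

This is why searches find memories by meaning: two texts about the same topic produce nearby vectors even when they share no keywords.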
Key Features in Detail
MCP Server Features
- OAuth 2.0 Authentication: Implements the Authorization Code flow with PKCE, offering a scope-based permission system and token management.
- Memory Management: Enables the creation, reading, updating, and deletion of memories, with vector embeddings for semantic search and permission-based access control.
- User Management: Provides user registration and authentication, profile management, and admin controls.
- AI Integration: Integrates with the OpenAI API for embeddings, generates automatic vector embeddings for all memories, and supports semantic search using pgvector cosine similarity.
Django Client Features
- User Interface: Offers a clean, responsive design with an intuitive memory management interface.
- OAuth Client Implementation: Securely stores and manages tokens, with automatic token refresh and scope-based feature availability.
- Memory Tools: Supports memory creation with rich text, batch import and export, and permission management.
MCP Interface: Resources and Tools
The MCP interface defines resources and tools for interacting with memories:
MCP Resources
- Memory Resource: `memories://{memory_id}` - Returns the content of a specific memory with permission checks.
- User Memories Resource: `users://{user_id}/memories` - Returns a list of memories for a specific user with permission checks.
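A resource handler resolves the URI and enforces the permission check before returning content. A minimal sketch of the `memories://{memory_id}` resource, assuming an in-memory store in place of the real database:

```python
def read_memory_resource(uri: str, requester_id: str, store: dict) -> str:
    """Resolve memories://{memory_id} with a permission check (illustrative)."""
    if not uri.startswith("memories://"):
        raise ValueError(f"unsupported resource URI: {uri}")
    memory_id = uri[len("memories://"):]
    memory = store[memory_id]
    # Private memories are visible only to their owner; public ones to anyone.
    if memory["permission"] == "private" and memory["owner"] != requester_id:
        raise PermissionError("memory is private")
    return memory["text"]

store = {
    "m1": {"owner": "alice", "permission": "private", "text": "secret note"},
    "m2": {"owner": "alice", "permission": "public", "text": "shared note"},
}
```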
MCP Tools
- Submit Memory Tool: Creates a new memory.
- Update Memory Tool: Updates an existing memory.
- Delete Memory Tool: Deletes a memory.
- Query Memory Tool: Performs semantic search on memories.
- Query User Tool: Queries a user’s persona based on memories.
API Endpoints: Detailed Functionality
The API endpoints provide access to the core functionalities of the MCP server:
OAuth Endpoints
- /register: Registers a new OAuth client.
- /authorize: Initiates the OAuth authorization flow.
- /token: Exchanges the authorization code for tokens.
Memory Endpoints
- /api/tools (tool: `get_memories`): Retrieves memories with optional filtering.
- /api/tools (tool: `submit_memory`): Creates a new memory.
- /api/tools (tool: `retrieve_memories`): Retrieves all memories for the authenticated user.
- /api/tools (tool: `update_memory`): Updates an existing memory.
- /api/tools (tool: `modify_permissions`): Updates a memory’s permission level.
- /api/tools (tool: `query_memory`): Performs semantic search on the user’s memories.
- /api/tools (tool: `query_user`): Queries a user’s persona based on memories.
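All tool endpoints share the single /api/tools route and are selected by a tool name in the request body. The payload shape below is an assumption (the authoritative schema lives in the server’s Swagger docs at /docs):

```python
import json
import urllib.request

def build_tool_request(tool: str, arguments: dict) -> dict:
    """Assumed body for /api/tools: a tool name plus its arguments."""
    return {"tool": tool, "arguments": arguments}

def call_tool(base_url: str, token: str, tool: str, arguments: dict) -> dict:
    req = urllib.request.Request(
        f"{base_url}/api/tools",
        data=json.dumps(build_tool_request(tool, arguments)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # OAuth access token
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# e.g. call_tool("http://localhost:8001", token, "submit_memory",
#                {"text": "Met the team today", "permission": "private"})
```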
Setting Up and Deploying Picard MCP
The setup process involves several key steps:
- Prerequisites: Docker, Docker Compose, Python 3.10+, and an OpenAI API key.
- Cloning the Repository: Clone the repository from GitHub.
- Environment Configuration: Create and edit environment files for both the MCP server and the Django client.
- Starting Services: Start the services using Docker Compose.
- Admin User Creation: Create an admin user for the MCP server.
- Client Registration: Register the Django client with the MCP server.
- Accessing Applications: Access the MCP server and Django client via their respective URLs.
Testing the Setup
Verify the setup is working correctly by running MCP server and Django client tests. Manual testing involves creating a user account, logging in, connecting to the MCP server, creating and managing memories, and testing semantic search functionality.
Security Considerations
Data Protection
Memory text content is encrypted at rest, and personal identifiable information (PII) is protected through text field encryption. Access tokens have a 1-hour expiration time, and refresh tokens are long-lived but use rotation.
UUID Usage
All identifiers use UUID v4 format instead of sequential integers for security, scalability, non-guessability, and consistency reasons.
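Generating a UUID v4 identifier in Python takes one call; unlike a sequential integer primary key, it reveals nothing about record counts and cannot be guessed:

```python
import uuid

# Random 128-bit identifier, e.g. for a new memory row's primary key.
memory_id = uuid.uuid4()
assert memory_id.version == 4
# str(memory_id) is the familiar 36-character hyphenated form.
```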
OAuth Best Practices
All OAuth communication must use HTTPS, and PKCE is required for all clients, even confidential ones.
Documentation and Deployment
The MCP server includes Swagger/OpenAPI documentation for all endpoints, accessible at /docs. Additional documentation files cover testing, debugging, and planning. The project includes a docker-compose.yml for local development and a render.yaml blueprint for deploying to Render.
License
The Picard MCP Server is licensed under the MIT License.
Use Cases
- Contextual AI Agents: Equip AI Agents with the ability to remember past interactions and learn from experience.
- Personalized AI Assistants: Create AI assistants that adapt to individual user preferences and needs.
- Secure Knowledge Management: Build secure and private knowledge bases for sensitive information.
- Enhanced Semantic Search: Improve the accuracy and relevance of search results by leveraging semantic understanding.
Integrate Picard MCP Server with UBOS Platform
The UBOS platform is a full-stack AI Agent Development Platform that focuses on bringing AI Agents to every business department. Integrating Picard MCP Server into the UBOS platform empowers your AI Agents with long-term memory and contextual awareness, enabling them to perform more sophisticated tasks and provide more personalized experiences.
Here’s how the integration benefits you:
- Orchestrate AI Agents: UBOS helps you orchestrate multiple AI Agents, allowing them to collaborate and share information seamlessly. By integrating Picard MCP Server, you can ensure that each agent has access to a shared memory pool, enabling them to learn from each other’s experiences and improve their overall performance.
- Connect with Enterprise Data: UBOS allows you to connect your AI Agents with your enterprise data sources, such as databases, CRMs, and file systems. By integrating Picard MCP Server, you can enrich your enterprise data with semantic information and make it more accessible to your AI Agents.
- Build Custom AI Agents: UBOS provides a flexible framework for building custom AI Agents tailored to your specific business needs. By integrating Picard MCP Server, you can easily add memory and contextual awareness to your custom agents, making them more intelligent and capable.
- Multi-Agent Systems: UBOS supports the creation of Multi-Agent Systems, where multiple AI Agents work together to solve complex problems. By integrating Picard MCP Server, you can enable your Multi-Agent Systems to share knowledge and coordinate their actions more effectively.
By leveraging the Picard MCP Server within the UBOS platform, you can unlock the full potential of AI Agents and create innovative solutions that drive business value. The result is more secure, more reliable, and more capable AI Agents for your business.
In conclusion, the Picard MCP Server offers a robust and secure memory management solution for LLMs, enabling a new generation of context-aware AI applications. Its compliance with the Model Context Protocol, coupled with its advanced features and ease of deployment, makes it a valuable asset for any organization looking to enhance the capabilities of their AI models.
Project Details
- hburgoyne/picard_mcp
- Last Updated: 5/29/2025