What is the MCP Server for Qdrant?
The MCP Server for Qdrant is an official implementation of the Model Context Protocol (MCP) that provides a semantic memory layer on top of the Qdrant vector search engine, enabling seamless integration between LLM applications and external data sources.
How does the MCP Server enhance data retrieval?
The server exposes two tools, qdrant-store and qdrant-find, for storing information in and retrieving it from a Qdrant collection, enabling efficient semantic queries and data management.
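As an illustration, an MCP client invokes these tools via JSON-RPC tools/call requests. The sketch below follows the MCP wire format; the argument names (information, metadata, query) match the tool descriptions above but the exact schema may vary by server version:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "qdrant-store",
            "arguments": {"information": "Our API rate limit is 100 req/s",
                          "metadata": {"topic": "api"}}}}
```

A later qdrant-find call with arguments such as {"query": "what is the rate limit?"} would then return the stored entry by semantic similarity rather than keyword match.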
Can the MCP Server be integrated with other tools?
Yes, the MCP Server for Qdrant is compatible with various MCP-compatible clients, including Cursor, VS Code, and Claude Code, making it easy to integrate into existing workflows.
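For example, a client such as Claude Code or Cursor is typically pointed at the server through an mcpServers configuration entry. A hedged sketch, assuming the server is launched with uvx and configured through environment variables (the variable names below follow the project's README conventions and should be checked against the current release):

```json
{
  "mcpServers": {
    "qdrant": {
      "command": "uvx",
      "args": ["mcp-server-qdrant"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "COLLECTION_NAME": "my-collection"
      }
    }
  }
}
```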
What models does the MCP Server support?
The server supports FastEmbed models and uses the sentence-transformers/all-MiniLM-L6-v2 embedding model by default for efficient semantic data encoding and retrieval.
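The retrieval step behind qdrant-find boils down to ranking stored vectors by cosine similarity to the query embedding. A minimal stdlib-only sketch of that idea, using toy 4-dimensional vectors in place of the 384-dimensional embeddings that sentence-transformers/all-MiniLM-L6-v2 produces (the values and the find helper are illustrative, not the server's actual code):

```python
import math

# Toy "embeddings" standing in for real model output.
memory = {
    "Qdrant stores vectors":      [0.9, 0.1, 0.0, 0.2],
    "The weather is sunny today": [0.0, 0.8, 0.5, 0.1],
}

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def find(query_vec, top_k=1):
    # Rank stored entries by similarity to the query vector.
    ranked = sorted(memory.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]

# A query vector close to the first entry retrieves it first.
print(find([0.85, 0.15, 0.05, 0.1]))  # → ['Qdrant stores vectors']
```

In the real server, FastEmbed turns the query text into the vector and Qdrant performs this ranking at scale with an approximate nearest-neighbor index instead of a full scan.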
How does the MCP Server integrate with the UBOS platform?
The MCP Server for Qdrant integrates seamlessly with the UBOS platform, enhancing its capabilities for orchestrating AI agents and connecting them with enterprise data.
Qdrant Server
Project Details
- qdrant/mcp-server-qdrant
- Apache License 2.0
- Last Updated: 4/22/2025
Recommended MCP Servers
An MCP server for querying the technical documentation of mainstream agent frameworks (supports both stdio and SSE transport protocols), covering langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai
An MCP server that reads MCP logs so you can debug directly inside the client
A Model Context Protocol (MCP) server for Kubernetes that enables AI assistants like Claude, Cursor, and others to...
A Model Context Protocol server for interacting with Babashka, a native Clojure interpreter for scripting
MCP server for enabling LLM applications to perform deep research via the MCP protocol
MCP server that creates its own tools as needed
An open-source library enabling AI models to control hardware devices via serial communication using the MCP protocol. Initial...