Databricks MCP Server
This is a Model Context Protocol (MCP) server for executing SQL queries against Databricks using the Statement Execution API. It retrieves data by issuing SQL requests through the Databricks API. When used in agent mode, it can iterate over a series of requests to perform complex tasks, and it works even better when coupled with Unity Catalog metadata.
Features
- Execute SQL queries on Databricks
- List available schemas in a catalog
- List tables in a schema
- Describe table schemas
Setup
System Requirements
- Python 3.10+
- If you plan to install via `uv`, ensure it's installed
Installation
- Install the required dependencies:

```bash
pip install -r requirements.txt
```

Or if using `uv`:

```bash
uv pip install -r requirements.txt
```
Set up your environment variables:
Option 1: Using a .env file (recommended)
Create a .env file with your Databricks credentials:
```
DATABRICKS_HOST=your-databricks-instance.cloud.databricks.com
DATABRICKS_TOKEN=your-databricks-access-token
DATABRICKS_SQL_WAREHOUSE_ID=your-sql-warehouse-id
```

Option 2: Setting environment variables directly
```bash
export DATABRICKS_HOST="your-databricks-instance.cloud.databricks.com"
export DATABRICKS_TOKEN="your-databricks-access-token"
export DATABRICKS_SQL_WAREHOUSE_ID="your-sql-warehouse-id"
```
You can find your SQL warehouse ID in the Databricks UI under SQL Warehouses.
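Once the variables are set, the server can read them at startup. Here is a minimal sketch of how the credentials might be loaded and turned into request headers; the function name `load_databricks_config` and the returned dictionary keys are illustrative, not necessarily what this project uses internally:

```python
import os

# python-dotenv (a listed dependency) loads a .env file into os.environ;
# the import is guarded so this sketch also runs where it is not installed.
try:
    from dotenv import load_dotenv
    load_dotenv()  # no-op when there is no .env file
except ImportError:
    pass


def load_databricks_config() -> dict:
    """Read the three required Databricks settings from the environment."""
    return {
        "base_url": f"https://{os.environ['DATABRICKS_HOST']}",
        "warehouse_id": os.environ["DATABRICKS_SQL_WAREHOUSE_ID"],
        "headers": {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    }
```

Raising a `KeyError` on a missing variable fails fast at startup, which is usually preferable to a confusing authentication error on the first query.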
Permissions Requirements
Before using this MCP server, ensure that:
SQL Warehouse Permissions: The user associated with the provided token must have appropriate permissions to access the specified SQL warehouse. You can configure warehouse permissions in the Databricks UI under SQL Warehouses > [Your Warehouse] > Permissions.
Token Permissions: The personal access token used should have the minimum necessary permissions to perform the required operations. It is strongly recommended to:
- Create a dedicated token specifically for this application
- Grant read-only permissions where possible to limit security risks
- Avoid using tokens with workspace-wide admin privileges
Data Access Permissions: The user associated with the token must have appropriate permissions to access the catalogs, schemas, and tables that will be queried.
To set SQL warehouse permissions via the Databricks REST API, you can use:
- `GET /api/2.0/sql/permissions/warehouses/{warehouse_id}` to check current permissions
- `PATCH /api/2.0/sql/permissions/warehouses/{warehouse_id}` to update permissions
For security best practices, consider regularly rotating your access tokens and auditing query history to monitor usage.
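The GET call above can be made with httpx, which is already a dependency of this project. This is a hedged sketch: the endpoint path is taken verbatim from the bullet above, the function names are illustrative, and the import is guarded so the snippet also loads where httpx is not installed:

```python
try:
    import httpx  # project dependency; guarded so the sketch loads without it
except ImportError:
    httpx = None


def permissions_url(host: str, warehouse_id: str) -> str:
    """Build the warehouse-permissions endpoint URL from the bullet above."""
    return f"https://{host}/api/2.0/sql/permissions/warehouses/{warehouse_id}"


def get_warehouse_permissions(host: str, token: str, warehouse_id: str) -> dict:
    """GET the current permission assignments for one SQL warehouse."""
    if httpx is None:
        raise RuntimeError("httpx is required for the actual HTTP call")
    resp = httpx.get(
        permissions_url(host, warehouse_id),
        headers={"Authorization": f"Bearer {token}"},
        timeout=30.0,
    )
    resp.raise_for_status()
    return resp.json()
```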
Running the Server
Standalone Mode
To run the server in standalone mode:
```bash
python main.py
```
This will start the MCP server using stdio transport, which can be used with Agent Composer or other MCP clients.
Using with Cursor
To use this MCP server with Cursor, you need to configure it in your Cursor settings:
- Create a `.cursor` directory in your home directory if it doesn't already exist
- Create or edit the `mcp.json` file in that directory:
```bash
mkdir -p ~/.cursor
touch ~/.cursor/mcp.json
```
- Add the following configuration to the `mcp.json` file, replacing the directory path with the actual path to where you've installed this server:
```json
{
  "mcpServers": {
    "databricks": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/mcp-databricks-server",
        "run",
        "main.py"
      ]
    }
  }
}
```
If you're not using `uv`, you can use `python` instead:
```json
{
  "mcpServers": {
    "databricks": {
      "command": "python",
      "args": [
        "/path/to/your/mcp-databricks-server/main.py"
      ]
    }
  }
}
```
- Restart Cursor to apply the changes
Now you can use the Databricks MCP server directly within Cursor’s AI assistant.
Available Tools
The server provides the following tools:
- `execute_sql_query(sql: str) -> str`: Execute a SQL query and return the results
- `list_schemas(catalog: str) -> str`: List all available schemas in a specific catalog
- `list_tables(schema: str) -> str`: List all tables in a specific schema
- `describe_table(table_name: str) -> str`: Describe a table's schema
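The three metadata tools map naturally onto plain SQL statements that can run through the same execution path as `execute_sql_query`. The sketch below shows that mapping; it is an assumption about how such tools could be implemented, and the helper names are illustrative:

```python
def list_schemas_sql(catalog: str) -> str:
    """SQL that could back list_schemas: every schema in a catalog."""
    return f"SHOW SCHEMAS IN {catalog}"


def list_tables_sql(schema: str) -> str:
    """SQL that could back list_tables: schema given as catalog.schema."""
    return f"SHOW TABLES IN {schema}"


def describe_table_sql(table_name: str) -> str:
    """SQL that could back describe_table: name given as catalog.schema.table."""
    return f"DESCRIBE TABLE {table_name}"
```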
Example Usage
In Agent Composer or other MCP clients, you can use these tools like:
```python
execute_sql_query("SELECT * FROM my_schema.my_table LIMIT 10")
list_schemas("my_catalog")
list_tables("my_catalog.my_schema")
describe_table("my_catalog.my_schema.my_table")
```
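Each of these calls ultimately becomes a request to the Statement Execution API (`POST /api/2.0/sql/statements`). A sketch of the request body that endpoint accepts, assuming its documented `statement`, `warehouse_id`, and `wait_timeout` fields; the helper name is illustrative:

```python
def build_statement_request(sql: str, warehouse_id: str) -> dict:
    """Request body for POST /api/2.0/sql/statements."""
    return {
        "statement": sql,
        "warehouse_id": warehouse_id,
        # How long Databricks waits server-side before returning a statement
        # ID that the client must then poll.
        "wait_timeout": "30s",
    }
```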
Handling Long-Running Queries
The server is designed to handle long-running queries by polling the Databricks API until the query completes or times out. The default timeout is 10 minutes (60 retries with 10-second intervals), which can be adjusted in the dbapi.py file if needed.
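The polling described above can be factored as a generic retry loop with the status fetcher injected, which makes the numbers from the paragraph (60 retries at 10-second intervals) easy to adjust and to test. This is a sketch, not the code in dbapi.py; the function and state names are illustrative:

```python
import time
from typing import Callable

# Databricks statement states that mean polling should stop.
TERMINAL_STATES = {"SUCCEEDED", "FAILED", "CANCELED"}


def poll_until_done(
    get_state: Callable[[], str],
    max_retries: int = 60,
    interval_s: float = 10.0,
) -> str:
    """Call get_state() until it reports a terminal state or retries run out."""
    for attempt in range(max_retries):
        state = get_state()
        if state in TERMINAL_STATES:
            return state
        time.sleep(interval_s)
    raise TimeoutError(f"query still running after {max_retries * interval_s:.0f}s")
```

With the defaults this gives the 10-minute ceiling described above; passing a smaller `max_retries` or `interval_s` tightens it without touching the loop logic.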
Dependencies
- httpx: For making HTTP requests to the Databricks API
- python-dotenv: For loading environment variables from .env file
- mcp: The Model Context Protocol library
- asyncio: For asynchronous operations (part of the Python standard library, no separate install needed)
Project Details
- RafaelCartenet/mcp-databricks-server
- Last Updated: 4/14/2025