BloodHound MCP

BloodHound MCP (Model Context Protocol) is an innovative extension of the BloodHound tool, designed to enable Large Language Models (LLMs) to interact with and analyze Active Directory (AD) and Azure Active Directory (AAD) environments through natural language queries. By leveraging the power of LLMs, BloodHound MCP allows users to perform complex queries and retrieve insights from their AD/AAD environments using simple, conversational commands.

Features

  • Natural Language Queries: Use conversational language to query your AD/AAD environment without needing to write Cypher queries manually.
  • LLM-Powered Analysis: Harness the capabilities of Large Language Models to interpret and execute queries on your behalf.
  • Seamless Integration: Works with existing BloodHound data stored in Neo4j, providing a user-friendly interface for complex analysis.
  • Customizable: Easily configure the system to work with your specific environment and tools.
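Under the hood, BloodHound MCP turns a natural-language request into a Cypher query against the Neo4j database that holds BloodHound data. A minimal sketch of what such a translation might produce (the helper name is an assumption for illustration, not part of BloodHound MCP; the Cypher follows BloodHound's `(:User)-[:MemberOf]->(:Group)` schema):

```python
def build_group_membership_query(group_name: str) -> str:
    """Return a Cypher query listing members of the given AD group.

    Assumed helper for illustration only. Uses BloodHound's schema:
    (:User)-[:MemberOf]->(:Group), with group names stored
    upper-cased as NAME@DOMAIN.
    """
    return (
        "MATCH (u:User)-[:MemberOf*1..]->(g:Group) "
        f"WHERE g.name STARTS WITH '{group_name.upper()}@' "
        "RETURN u.name AS member"
    )

# Executing the result against the Neo4j instance from the configuration
# below would look roughly like this (requires the `neo4j` package and a
# running database):
#
#   from neo4j import GraphDatabase
#   driver = GraphDatabase.driver("bolt://localhost:7687",
#                                 auth=("neo4j", "bloodhound"))
#   records, _, _ = driver.execute_query(
#       build_group_membership_query("Domain Admins"))
```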

Configure the MCP Server

{
  "mcpServers": {
    "BloodHound": {
      "name": "BloodHound",
      "isActive": true,
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp[cli],neo4j",
        "mcp",
        "run",
        "<PATH_TO_THE_PROJECT>server.py"
      ],
      "env": {
        "BLOODHOUND_URI": "bolt://localhost:7687",
        "BLOODHOUND_USERNAME": "neo4j",
        "BLOODHOUND_PASSWORD": "bloodhound"
      }
    }
  }
}

Usage

Configuration

To customize BloodHound MCP, update the configuration file in your MCP-supported tool. Key settings include:

  • Neo4j Database Connection:
    • BLOODHOUND_URI: The URI of your Neo4j database (e.g., bolt://localhost:7687).
    • BLOODHOUND_USERNAME: Your Neo4j username.
    • BLOODHOUND_PASSWORD: Your Neo4j password.
  • Server Settings: Adjust the command and args to match your environment and tool requirements.
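The server reads these settings from its environment at runtime. A minimal sketch of resolving them with the defaults from the example configuration above (the helper name is an assumption, not part of BloodHound MCP):

```python
import os

def bloodhound_settings() -> dict:
    """Resolve Neo4j connection settings from the environment.

    Assumed helper for illustration. Falls back to the defaults from the
    example configuration; override them via the "env" block of your
    MCP-supported tool.
    """
    return {
        "uri": os.environ.get("BLOODHOUND_URI", "bolt://localhost:7687"),
        "username": os.environ.get("BLOODHOUND_USERNAME", "neo4j"),
        "password": os.environ.get("BLOODHOUND_PASSWORD", "bloodhound"),
    }
```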

Contributing

We welcome contributions to BloodHound MCP! To get involved:

  1. Fork the Repository: Create your own copy on GitHub.
  2. Create a Branch: Work on your feature or fix in a new branch.
  3. Submit a Pull Request: Include a clear description of your changes.

Special Thanks

Custom queries from https://github.com/CompassSecurity/BloodHoundQueries.
