MCP Server: Elasticsearch semantic search tool
Demo repo for: https://j.blaszyk.me/tech-blog/mcp-server-elasticsearch-semantic-search/
Table of Contents
- Overview
- Running the MCP Server
- Integrating with Claude Desktop
- Crawling Search Labs Blog Posts
  - 1. Verify Crawler Setup
  - 2. Configure Elasticsearch
  - 3. Update Index Mapping for Semantic Search
  - 4. Start Crawling
  - 5. Verify Indexed Documents
Overview
This repository provides a Python implementation of an MCP server for semantic search through Search Labs blog posts indexed in Elasticsearch.
It assumes you’ve crawled the blog posts and stored them in the search-labs-posts index using Elastic Open Crawler.
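At its core, the search tool sends a semantic query against the index described below. A minimal sketch of building that request body (the helper name and exact query shape are assumptions for illustration, not the repo's actual code):

```python
# Sketch of the request body the MCP search tool might send to Elasticsearch.
# Helper name and field selection are hypothetical, not taken from the repo.
def build_semantic_query(question: str, size: int = 5) -> dict:
    """Build a semantic query against the semantic_body field."""
    return {
        "size": size,
        "query": {
            "semantic": {
                "field": "semantic_body",  # semantic_text field (see mapping below)
                "query": question,
            }
        },
        "_source": ["title", "url"],  # return only lightweight fields
    }

body = build_semantic_query("How does ELSER work?")
print(body["query"]["semantic"]["field"])  # semantic_body
```

The `semantic` query lets Elasticsearch handle inference against the field's configured model, so the client never deals with embeddings directly.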
Running the MCP Server
Add `ES_URL` and `ES_API_KEY` to the `.env` file (take a look here for generating an API key with minimum permissions).
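A minimal `.env` might look like this (both values are placeholders):

```
ES_URL=https://localhost:9200
ES_API_KEY=<your-encoded-api-key>
```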
Start the server in MCP Inspector:

```shell
make dev
```

Once running, access the MCP Inspector at: http://localhost:5173
Integrating with Claude Desktop
To add the MCP server to Claude Desktop:
```shell
make install-claude-config
```
This updates claude_desktop_config.json in your home directory. On the next restart, the Claude app will detect the server and load the declared tool.
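The entry added to `claude_desktop_config.json` will look roughly like this (the server name, command, and path are illustrative; check the Makefile for what is actually written):

```json
{
  "mcpServers": {
    "elasticsearch-semantic-search": {
      "command": "/path/to/repo/.venv/bin/python",
      "args": ["/path/to/repo/server.py"]
    }
  }
}
```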
Crawling Search Labs Blog Posts
1. Verify Crawler Setup
To check if the Elastic Open Crawler works, run:
```shell
docker run --rm \
  --entrypoint /bin/bash \
  -v "$(pwd)/crawler-config:/app/config" \
  --network host \
  docker.elastic.co/integrations/crawler:latest \
  -c "bin/crawler crawl config/test-crawler.yml"
```
This should print crawled content from a single page.
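For reference, a test crawler config follows this general shape (illustrative only; the real settings live in `crawler-config/test-crawler.yml` in the repo):

```yaml
# Illustrative shape of an Elastic Open Crawler config, not the repo's file.
domains:
  - url: https://www.elastic.co
    seed_urls:
      - https://www.elastic.co/search-labs/blog
output_sink: console   # print crawled docs instead of indexing them
```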
2. Configure Elasticsearch
Set up Elasticsearch URL and API Key.
Generate an API key with minimum crawler permissions:
```
POST /_security/api_key
{
  "name": "crawler-search-labs",
  "role_descriptors": {
    "crawler-search-labs-role": {
      "cluster": ["monitor"],
      "indices": [
        {
          "names": ["search-labs-posts"],
          "privileges": ["all"]
        }
      ]
    }
  },
  "metadata": {
    "application": "crawler"
  }
}
```
Copy the `encoded` value from the response and set it as `API_KEY`.
3. Update Index Mapping for Semantic Search
Ensure the search-labs-posts index exists. If not, create it:
```
PUT search-labs-posts
```
Update the mapping to enable semantic search:
```
PUT search-labs-posts/_mappings
{
  "properties": {
    "body": {
      "type": "text",
      "copy_to": "semantic_body"
    },
    "semantic_body": {
      "type": "semantic_text",
      "inference_id": ".elser-2-elasticsearch"
    }
  }
}
```
The `body` field is copied into `semantic_body`, which is indexed as `semantic_text` using Elasticsearch's ELSER model.
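With this mapping in place, you can run a semantic query against `semantic_body` directly (the question text here is just an example):

```
GET search-labs-posts/_search
{
  "query": {
    "semantic": {
      "field": "semantic_body",
      "query": "How do I set up semantic search in Elasticsearch?"
    }
  }
}
```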
4. Start Crawling
Run the crawler to populate the index:
```shell
docker run --rm \
  --entrypoint /bin/bash \
  -v "$(pwd)/crawler-config:/app/config" \
  --network host \
  docker.elastic.co/integrations/crawler:latest \
  -c "bin/crawler crawl config/elastic-search-labs-crawler.yml"
```
> [!TIP]
> If using a fresh Elasticsearch cluster, wait for the ELSER model to start before indexing.
5. Verify Indexed Documents
Check if the documents were indexed:
```
GET search-labs-posts/_count
```
This will return the total document count in the index. You can also verify in Kibana.
Done! You can now perform semantic searches on Search Labs blog posts.
Project Details
- jedrazb/elastic-semantic-search-mcp-server
- Last Updated: 3/28/2025