Model Context Protocol (MCP) for NASA Earthdata Search (CMR)
This module is a Model Context Protocol (MCP) server for NASA's Earthdata Common Metadata Repository (CMR). The goal of this MCP server is to integrate AI retrievals with NASA's catalog of datasets by way of earthaccess.
Dependencies
- uv - a Rust-based Python package manager
- an LLM client, such as Claude Desktop or ChatGPT Desktop (for consuming the MCP)
Install and Run
Clone the repository to your local environment, or to wherever your LLM client is running.
git clone https://github.com/podaac/cmr-mcp.git
cd cmr-mcp
Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
uv venv
source .venv/bin/activate
Install packages with uv
uv sync
Use the outputs of which uv (UV_LIB) and pwd (CMR_MCP_INSTALL) to update the following configuration.
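For example, you can capture both values from the repository root; the paths shown in the comments are illustrative, not fixed install locations.

# Run from the cmr-mcp checkout, with the virtual environment activated
UV_LIB=$(which uv)          # e.g. /Users/me/.local/bin/uv
CMR_MCP_INSTALL=$(pwd)      # e.g. /Users/me/cmr-mcp
echo "UV_LIB=$UV_LIB  CMR_MCP_INSTALL=$CMR_MCP_INSTALL"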
Adding to AI Framework
In this example we'll use Claude Desktop.
Update the claude_desktop_config.json file (sometimes this must be created). On a Mac, this is often found in ~/Library/Application Support/Claude/claude_desktop_config.json
Add the following configuration, filling in the values of UV_LIB and CMR_MCP_INSTALL - don't use environment variable references here; write out the literal paths.
{
  "mcpServers": {
    "cmr": {
      "command": "$UV_LIB$",
      "args": [
        "--directory",
        "$CMR_MCP_INSTALL$",
        "run",
        "cmr-search.py"
      ]
    }
  }
}
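As a sketch, assuming uv was installed at /Users/me/.local/bin/uv and the repository was cloned to /Users/me/cmr-mcp (substitute your own which uv and pwd outputs), the filled-in file can be written from the shell:

# Illustrative paths only; this overwrites any existing config,
# so merge by hand if you already have other MCP servers configured.
mkdir -p "$HOME/Library/Application Support/Claude"
cat > "$HOME/Library/Application Support/Claude/claude_desktop_config.json" <<'EOF'
{
  "mcpServers": {
    "cmr": {
      "command": "/Users/me/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/me/cmr-mcp",
        "run",
        "cmr-search.py"
      ]
    }
  }
}
EOF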
Use the MCP Server
Simply prompt your agent to search CMR for ... data. Below is a simple example of this in action.
Other prompts that can work:
- Search CMR for datasets from 2024 to 2025
- Search CMR for PO.DAAC datasets from 2020 to 2024 with keyword Climate
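If the agent reports that the server is unavailable, you can sanity-check it outside the client by invoking the same command Claude Desktop runs (paths are illustrative, as above):

# Should start without errors and wait for MCP messages on stdin; Ctrl-C to exit
uv --directory /Users/me/cmr-mcp run cmr-search.py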