Earthdata MCP Server is a Model Context Protocol (MCP) server implementation that provides tools to interact with NASA Earthdata. It enables efficient dataset discovery, retrieval, and analysis for geospatial workflows.
🚀 NEW: This server now includes all Jupyter MCP Server tools through composition, providing a unified interface for both Earth data discovery and analysis in Jupyter Notebooks.
- Efficient Data Retrieval: Search and download Earthdata datasets
- Unified Interface: Combines Earthdata research and Jupyter notebook manipulation tools for analysis
The following demo uses this MCP server to search for datasets and data granules on NASA Earthdata, download the data in Jupyter and run further analysis.
For comprehensive setup instructions, including Streamable HTTP transport and advanced configuration, check out the Jupyter MCP Server documentation. Or, get started quickly with JupyterLab and stdio transport as shown below.
pip install jupyterlab==4.4.1 jupyter-collaboration==4.0.2 ipykernel
pip uninstall -y pycrdt datalayer_pycrdt
pip install datalayer_pycrdt==0.12.17
# or run `make jupyterlab`
jupyter lab --port 8888 --IdentityProvider.token MY_TOKEN --ip 0.0.0.0
Note
Ensure the port in DOCUMENT_URL and RUNTIME_URL matches the one used in the jupyter lab command. The DOCUMENT_ID, which is the path to the notebook you want to connect to, should be relative to the directory where JupyterLab was started. In a basic setup, DOCUMENT_URL and RUNTIME_URL are the same, and DOCUMENT_TOKEN and RUNTIME_TOKEN are both the Jupyter token.
Note
The EARTHDATA_USERNAME and EARTHDATA_PASSWORD environment variables are used for NASA Earthdata authentication to download datasets via the earthaccess library. See NASA Earthdata Authentication for more details.
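As a rough sketch of what this authentication looks like with earthaccess (the server's exact call may differ):

```python
import earthaccess

# Reads EARTHDATA_USERNAME and EARTHDATA_PASSWORD from the environment,
# i.e. the same variables set in the MCP configurations below.
auth = earthaccess.login(strategy="environment")
print("Authenticated:", auth.authenticated)
```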
{
  "mcpServers": {
    "earthdata": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "DOCUMENT_URL",
        "-e",
        "DOCUMENT_TOKEN",
        "-e",
        "DOCUMENT_ID",
        "-e",
        "RUNTIME_URL",
        "-e",
        "RUNTIME_TOKEN",
        "-e",
        "EARTHDATA_USERNAME",
        "-e",
        "EARTHDATA_PASSWORD",
        "datalayer/earthdata-mcp-server:latest"
      ],
      "env": {
        "DOCUMENT_URL": "http://host.docker.internal:8888",
        "DOCUMENT_TOKEN": "MY_TOKEN",
        "DOCUMENT_ID": "notebook.ipynb",
        "RUNTIME_URL": "http://host.docker.internal:8888",
        "RUNTIME_TOKEN": "MY_TOKEN",
        "EARTHDATA_USERNAME": "your_username",
        "EARTHDATA_PASSWORD": "your_password"
      }
    }
  }
}
{
  "mcpServers": {
    "earthdata": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "DOCUMENT_URL",
        "-e",
        "DOCUMENT_TOKEN",
        "-e",
        "DOCUMENT_ID",
        "-e",
        "RUNTIME_URL",
        "-e",
        "RUNTIME_TOKEN",
        "-e",
        "EARTHDATA_USERNAME",
        "-e",
        "EARTHDATA_PASSWORD",
        "--network=host",
        "datalayer/earthdata-mcp-server:latest"
      ],
      "env": {
        "DOCUMENT_URL": "http://localhost:8888",
        "DOCUMENT_TOKEN": "MY_TOKEN",
        "DOCUMENT_ID": "notebook.ipynb",
        "RUNTIME_URL": "http://localhost:8888",
        "RUNTIME_TOKEN": "MY_TOKEN",
        "EARTHDATA_USERNAME": "your_username",
        "EARTHDATA_PASSWORD": "your_password"
      }
    }
  }
}
The server offers 15 tools total: 3 Earthdata-specific tools plus 12 Jupyter notebook manipulation tools.
- Search for datasets on NASA Earthdata.
- Input:
- search_keywords (str): Keywords to search for in the dataset titles.
- count (int): Number of datasets to return.
- temporal (tuple): (Optional) Temporal range in the format (date_from, date_to).
- bounding_box (tuple): (Optional) Bounding box in the format (lower_left_lon, lower_left_lat, upper_right_lon, upper_right_lat).
- Returns: List of dataset abstracts.
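Under the hood this maps closely to an earthaccess dataset search; here is a minimal sketch of an equivalent call, where the keyword, count, dates, and bounding box are illustrative placeholders:

```python
import earthaccess

# Illustrative values only; the MCP tool fills these in from its inputs.
datasets = earthaccess.search_datasets(
    keyword="sea level",                    # search_keywords
    count=3,                                # count
    temporal=("2020-01-01", "2023-12-31"),  # (date_from, date_to)
    bounding_box=(-180, -90, 180, 90),      # (ll_lon, ll_lat, ur_lon, ur_lat)
)
for dataset in datasets:
    print(dataset.abstract())
```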
- Search for data granules on NASA Earthdata.
- Input:
- short_name (str): Short name of the dataset.
- count (int): Number of data granules to return.
- temporal (tuple): (Optional) Temporal range in the format (date_from, date_to).
- bounding_box (tuple): (Optional) Bounding box in the format (lower_left_lon, lower_left_lat, upper_right_lon, upper_right_lat).
- Returns: List of data granules.
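Similarly, a hedged sketch of an equivalent earthaccess granule search; the short name and ranges below are placeholders:

```python
import earthaccess

# Placeholder short name and ranges; substitute a real dataset short name.
granules = earthaccess.search_data(
    short_name="DATASET_SHORT_NAME",
    count=5,
    temporal=("2022-01-01", "2022-03-31"),
    bounding_box=(-10.0, 30.0, 10.0, 50.0),
)
print(len(granules), "granules found")
```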
- Download Earth data granules from NASA Earthdata and integrate them with Jupyter notebooks.
- This tool combines Earthdata search capabilities with Jupyter notebook manipulation to create a seamless download workflow.
- Authentication: Requires NASA Earthdata Login credentials (see Authentication section)
- Input:
- folder_name (str): Local folder name to save the data.
- short_name (str): Short name of the Earth dataset to download.
- count (int): Number of data granules to download.
- temporal (tuple): (Optional) Temporal range in the format (date_from, date_to).
- bounding_box (tuple): (Optional) Bounding box in the format (lower_left_lon, lower_left_lat, upper_right_lon, upper_right_lat).
- Returns: Success message with download code preparation details.
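Conceptually, the download step this tool prepares corresponds to an earthaccess login, search, and download sequence. The snippet below is a sketch with placeholder values, not the server's generated code:

```python
import earthaccess

# Authenticate using EARTHDATA_USERNAME / EARTHDATA_PASSWORD from the environment.
earthaccess.login(strategy="environment")

# Placeholder values mirroring the tool's inputs.
granules = earthaccess.search_data(
    short_name="DATASET_SHORT_NAME",        # short_name
    count=2,                                # count
    temporal=("2022-01-01", "2022-12-31"),  # (date_from, date_to)
)
files = earthaccess.download(granules, "./local_folder")  # folder_name
print(files)
```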
The following Jupyter notebook manipulation tools are available:
- append_markdown_cell: Add markdown cells to notebooks
- insert_markdown_cell: Insert markdown cells at specific positions
- overwrite_cell_source: Modify existing cell content
- append_execute_code_cell: Add and execute code cells
- insert_execute_code_cell: Insert and execute code cells at specific positions
- execute_cell_with_progress: Execute cells with progress monitoring
- execute_cell_simple_timeout: Execute cells with a timeout
- execute_cell_streaming: Execute cells with streaming output
- read_all_cells: Read all notebook cells
- read_cell: Read specific notebook cells
- get_notebook_info: Get notebook metadata
- delete_cell: Delete notebook cells
For detailed documentation of the Jupyter tools, see the Jupyter MCP Server documentation.
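To give an idea of how a client drives these tools over stdio, here is a hedged sketch using the MCP Python SDK. The Docker arguments mirror the configuration above, and only tool names listed in this README are called; adjust paths and tokens to your setup.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirrors the "earthdata" entry from the configuration above.
server = StdioServerParameters(
    command="docker",
    args=[
        "run", "-i", "--rm",
        "-e", "DOCUMENT_URL", "-e", "DOCUMENT_TOKEN", "-e", "DOCUMENT_ID",
        "-e", "RUNTIME_URL", "-e", "RUNTIME_TOKEN",
        "datalayer/earthdata-mcp-server:latest",
    ],
    env={
        "DOCUMENT_URL": "http://host.docker.internal:8888",
        "DOCUMENT_TOKEN": "MY_TOKEN",
        "DOCUMENT_ID": "notebook.ipynb",
        "RUNTIME_URL": "http://host.docker.internal:8888",
        "RUNTIME_TOKEN": "MY_TOKEN",
    },
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the 15 tools exposed by the server.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Read every cell of the configured notebook.
            result = await session.call_tool("read_all_cells", arguments={})
            print(result)

asyncio.run(main())
```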
- download_analyze_global_sea_level 🆕
  - Generate a comprehensive workflow for downloading and analyzing the Global Mean Sea Level Trend dataset.
  - Uses both the Earthdata download tools and Jupyter analysis capabilities.
  - Returns: Detailed prompt for a complete sea level analysis workflow.
- sealevel_rise_dataset
  - Search for datasets related to sea level rise worldwide.
  - Input:
    - start_year (int): Start year to consider.
    - end_year (int): End year to consider.
  - Returns: A correctly formatted prompt.
- ask_datasets_format
  - Ask about the format of the datasets.
  - Returns: A correctly formatted prompt.
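Prompts are fetched through the same client session as the tools; below is a minimal sketch using the MCP Python SDK (the year values are placeholders, and the SDK passes prompt arguments as strings):

```python
from mcp import ClientSession


async def fetch_sea_level_prompt(session: ClientSession) -> None:
    """Fetch the sealevel_rise_dataset prompt from an initialized session."""
    prompt = await session.get_prompt(
        "sealevel_rise_dataset",
        arguments={"start_year": "1993", "end_year": "2023"},  # placeholder years
    )
    for message in prompt.messages:
        print(message.role, message.content)
```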
# or run `docker build -t datalayer/earthdata-mcp-server .`
make build-docker
If you prefer, you can pull the prebuilt images.
make pull-docker