fix(security): path traversal, timing-safe auth, stale credential bindings

Security:
- Remove /mnt/ from _ALLOWED_LOG_PREFIXES to prevent Unraid share exposure
- Add early .. detection for disk/logs and live/log_tail path validation
- Add /boot/ prefix restriction for flash_backup source_path
- Use hmac.compare_digest for timing-safe API key verification in server.py
- Gate include_traceback on DEBUG log level (no tracebacks in production)
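
Worth noting why `hmac.compare_digest` matters here: a plain `==` short-circuits on the first mismatching byte, which can leak key prefixes through response timing. A minimal standalone sketch of the pattern (helper name hypothetical, not the server's actual code):

```python
import hmac

def verify_api_key(presented: str, configured: str) -> bool:
    """Compare tokens in constant time to avoid leaking key bytes via timing."""
    if not configured:
        # An unset key must never grant access, even to an empty bearer token.
        return False
    # compare_digest's runtime depends only on input length, not on where
    # the first mismatching byte occurs.
    return hmac.compare_digest(presented.encode(), configured.encode())

print(verify_api_key("secret", "secret"))  # True
print(verify_api_key("guess", "secret"))   # False
```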

Correctness:
- Re-raise CredentialsNotConfiguredError in health check instead of swallowing it
- Fix ups_device query (remove non-existent nominalPower/currentPower fields)

Best practices (BP-01, BP-05, BP-06):
- Add # noqa: ASYNC109 to timeout params in _handle_live and unraid()
- Fix start_array* → start_array in docstring (not in ARRAY_DESTRUCTIVE)
- Remove from __future__ import annotations from snapshot.py
- Replace import-time UNRAID_API_KEY/URL bindings with _settings.ATTR pattern
  in manager.py, snapshot.py, utils.py, diagnostics.py — fixes stale binding
  after apply_runtime_config() post-elicitation (BP-05)
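
The stale-binding fix follows from how Python imports work: `from settings import X` copies the value at import time, so a later `apply_runtime_config()` that rebinds `settings.X` is invisible to the importer, while `_settings.X` attribute access always reads the current value. A toy illustration (names mirror the pattern only, not the real modules):

```python
import types

# Stand-in for a settings module (hypothetical).
settings = types.ModuleType("settings")
settings.UNRAID_API_KEY = "old-key"

# Import-time binding: copies the current value into a local name,
# equivalent to `from settings import UNRAID_API_KEY`.
stale_key = settings.UNRAID_API_KEY

def apply_runtime_config() -> None:
    # Rebinds the module attribute, e.g. after credential elicitation.
    settings.UNRAID_API_KEY = "new-key"

apply_runtime_config()

print(stale_key)                # old-key  — frozen at import time
print(settings.UNRAID_API_KEY)  # new-key  — attribute access sees the rebind
```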

CI/CD:
- Add .github/workflows/ci.yml (5-job pipeline: lint, typecheck, test, version-sync, audit)
- Add fail_under = 80 to [tool.coverage.report]
- Add version sync check to scripts/validate-marketplace.sh

Documentation:
- Sync plugin.json version 1.1.1 → 1.1.2 with pyproject.toml
- Update CLAUDE.md: 3 tools, system domain count 18, scripts comment fix
- Update README.md: 3 tools, security notes
- Update docs/AUTHENTICATION.md: H1 title fix
- Add UNRAID_CREDENTIALS_DIR to .env.example

Bump: 1.1.1 → 1.1.2

Co-Authored-By: Claude <noreply@anthropic.com>
Jacob Magar
2026-03-23 11:37:05 -04:00
parent d59f8c22a8
commit 2b777be927
17 changed files with 233 additions and 83 deletions

.claude-plugin/plugin.json

@@ -1,7 +1,7 @@
 {
   "name": "unraid",
   "description": "Query and monitor Unraid servers via GraphQL API - array status, disk health, containers, VMs, system monitoring",
-  "version": "1.1.0",
+  "version": "1.1.2",
   "author": {
     "name": "jmagar",
     "email": "jmagar@users.noreply.github.com"

.env.example

@@ -36,6 +36,11 @@ UNRAID_MAX_RECONNECT_ATTEMPTS=10
 # Defaults to standard log if not specified
 # UNRAID_AUTOSTART_LOG_PATH=/custom/path/to/autostart.log
 
+# Credentials Directory Override (Optional)
+# -----------------------------------------
+# Override the credentials directory (default: ~/.unraid-mcp/)
+# UNRAID_CREDENTIALS_DIR=/custom/path/to/credentials
+
 # Google OAuth Protection (Optional)
 # -----------------------------------
 # Protects the MCP HTTP server — clients must authenticate with Google before calling tools.

.github/workflows/ci.yml (new file, 79 additions)

@@ -0,0 +1,79 @@
+name: CI
+
+on:
+  push:
+    branches: ["main", "feat/**", "fix/**"]
+  pull_request:
+    branches: ["main"]
+
+jobs:
+  lint:
+    name: Lint & Format
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: astral-sh/setup-uv@v5
+        with:
+          version: "0.9.25"
+      - name: Install dependencies
+        run: uv sync --group dev
+      - name: Ruff check
+        run: uv run ruff check unraid_mcp/ tests/
+      - name: Ruff format
+        run: uv run ruff format --check unraid_mcp/ tests/
+
+  typecheck:
+    name: Type Check
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: astral-sh/setup-uv@v5
+        with:
+          version: "0.9.25"
+      - name: Install dependencies
+        run: uv sync --group dev
+      - name: ty check
+        run: uv run ty check unraid_mcp/
+
+  test:
+    name: Test
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: astral-sh/setup-uv@v5
+        with:
+          version: "0.9.25"
+      - name: Install dependencies
+        run: uv sync --group dev
+      - name: Run tests (excluding integration/slow)
+        run: uv run pytest -m "not slow and not integration" --tb=short -q
+      - name: Check coverage
+        run: uv run pytest -m "not slow and not integration" --cov=unraid_mcp --cov-report=term-missing --tb=short -q
+
+  version-sync:
+    name: Version Sync Check
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - name: Check pyproject.toml and plugin.json versions match
+        run: |
+          TOML_VER=$(grep '^version = ' pyproject.toml | sed 's/version = "//;s/"//')
+          PLUGIN_VER=$(python3 -c "import json; print(json.load(open('.claude-plugin/plugin.json'))['version'])")
+          echo "pyproject.toml: $TOML_VER"
+          echo "plugin.json: $PLUGIN_VER"
+          if [ "$TOML_VER" != "$PLUGIN_VER" ]; then
+            echo "ERROR: Version mismatch! Update .claude-plugin/plugin.json to match pyproject.toml"
+            exit 1
+          fi
+          echo "Versions in sync: $TOML_VER"
+
+  audit:
+    name: Security Audit
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: astral-sh/setup-uv@v5
+        with:
+          version: "0.9.25"
+      - name: Dependency audit
+        run: uv audit

CLAUDE.md

@@ -54,12 +54,28 @@ docker compose down
 ```
 
 ### Environment Setup
 
-- Copy `.env.example` to `.env` and configure:
-  - `UNRAID_API_URL`: Unraid GraphQL endpoint (required)
-  - `UNRAID_API_KEY`: Unraid API key (required)
-  - `UNRAID_MCP_TRANSPORT`: Transport type (default: streamable-http)
-  - `UNRAID_MCP_PORT`: Server port (default: 6970)
-  - `UNRAID_MCP_HOST`: Server host (default: 0.0.0.0)
+Copy `.env.example` to `.env` and configure:
+
+**Required:**
+- `UNRAID_API_URL`: Unraid GraphQL endpoint
+- `UNRAID_API_KEY`: Unraid API key
+
+**Server:**
+- `UNRAID_MCP_TRANSPORT`: Transport type (default: streamable-http)
+- `UNRAID_MCP_PORT`: Server port (default: 6970)
+- `UNRAID_MCP_HOST`: Server host (default: 0.0.0.0)
+- `UNRAID_MCP_LOG_LEVEL`: Log verbosity (default: INFO)
+- `UNRAID_MCP_LOG_FILE`: Log filename in logs/ (default: unraid-mcp.log)
+
+**SSL/TLS:**
+- `UNRAID_VERIFY_SSL`: SSL verification (default: true; set `false` for self-signed certs)
+
+**Subscriptions:**
+- `UNRAID_AUTO_START_SUBSCRIPTIONS`: Auto-start live subscriptions on startup (default: true)
+- `UNRAID_MAX_RECONNECT_ATTEMPTS`: WebSocket reconnect limit (default: 10)
+
+**Credentials override:**
+- `UNRAID_CREDENTIALS_DIR`: Override the `~/.unraid-mcp/` credentials directory path
 
 ### Authentication (Optional — protects the HTTP server)

@@ -119,13 +135,16 @@ python3 -c "import secrets; print(secrets.token_hex(32))"
 while the subscription starts — callers should retry in a moment. When
 `UNRAID_AUTO_START_SUBSCRIPTIONS=false`, resources fall back to on-demand `subscribe_once`.
 
-### Tool Categories (1 Tool, ~107 Subactions)
+### Tool Categories (3 Tools: 1 Primary + 2 Diagnostic)
 
-The server registers a **single consolidated `unraid` tool** with `action` (domain) + `subaction` (operation) routing. Call it as `unraid(action="docker", subaction="list")`.
+The server registers **3 MCP tools**:
+- **`unraid`** — primary tool with `action` (domain) + `subaction` (operation) routing, 107 subactions. Call it as `unraid(action="docker", subaction="list")`.
+- **`diagnose_subscriptions`** — inspect subscription connection states, errors, and WebSocket URLs.
+- **`test_subscription_query`** — test a specific GraphQL subscription query (allowlisted fields only).
 
 | action | subactions |
 |--------|-----------|
-| **system** (19) | overview, array, network, registration, variables, metrics, services, display, config, online, owner, settings, server, servers, flash, ups_devices, ups_device, ups_config |
+| **system** (18) | overview, array, network, registration, variables, metrics, services, display, config, online, owner, settings, server, servers, flash, ups_devices, ups_device, ups_config |
 | **health** (4) | check, test_connection, diagnose, setup |
 | **array** (13) | parity_status, parity_history, parity_start, parity_pause, parity_resume, parity_cancel, start_array, stop_array*, add_disk, remove_disk*, mount_disk, unmount_disk, clear_disk_stats* |
 | **disk** (6) | shares, disks, disk_details, log_files, logs, flash_backup* |

@@ -211,7 +230,7 @@ uv run pytest -x # Fail fast on first error
 ### Scripts
 
 ```bash
-# HTTP smoke-test against a live server (11 tools, all non-destructive actions)
+# HTTP smoke-test against a live server (non-destructive actions, all domains)
 ./tests/mcporter/test-actions.sh [MCP_URL]  # default: http://localhost:6970/mcp
 
 # stdio smoke-test, no running server needed (good for CI)

README.md

@@ -8,7 +8,7 @@
 ## ✨ Features
 
-- 🔧 **1 Tool, ~108 Actions**: Complete Unraid management through a single consolidated MCP tool
+- 🔧 **1 primary tool + 2 diagnostic tools, 107 subactions**: Complete Unraid management through a consolidated MCP tool
 - 🏗️ **Modular Architecture**: Clean, maintainable, and extensible codebase
 - ⚡ **High Performance**: Async/concurrent operations with optimized timeouts
 - 🔄 **Real-time Data**: WebSocket subscriptions for live metrics, logs, array state, and more

@@ -46,7 +46,7 @@
 ```
 
 This provides instant access to Unraid monitoring and management through Claude Code with:
-- **1 MCP tool** (`unraid`) exposing **~108 actions** via `action` + `subaction` routing
+- **1 primary MCP tool** (`unraid`) exposing **107 subactions** via `action` + `subaction` routing, plus `diagnose_subscriptions` and `test_subscription_query` diagnostic tools
 - Real-time system metrics and health monitoring
 - Docker container and VM lifecycle management
 - Disk health monitoring and storage management

@@ -140,7 +140,7 @@ unraid-mcp/ # ${CLAUDE_PLUGIN_ROOT}
 └── scripts/            # Validation and helper scripts
 ```
 
-- **MCP Server**: 1 `unraid` tool with ~108 actions via GraphQL API
+- **MCP Server**: 3 tools — `unraid` (107 subactions) + `diagnose_subscriptions` + `test_subscription_query`
 - **Skill**: `/unraid` skill for monitoring and queries
 - **Entry Point**: `unraid-mcp-server` defined in pyproject.toml

@@ -233,7 +233,7 @@ UNRAID_AUTO_START_SUBSCRIPTIONS=true # Auto-start WebSocket subscriptions on st
 UNRAID_MAX_RECONNECT_ATTEMPTS=10 # Max WebSocket reconnection attempts (default: 10)
 
 # Optional: Log Stream Configuration
-# UNRAID_AUTOSTART_LOG_PATH=/var/log/syslog # Path for log streaming resource (unraid://logs/stream)
+# UNRAID_AUTOSTART_LOG_PATH=/var/log/syslog # Override log path for unraid://logs/stream (auto-detects /var/log/syslog if unset)
 ```
 
 ### Transport Options

@@ -290,7 +290,7 @@ Omit both to run without authentication (default — open server).
 The single `unraid` tool uses `action` (domain) + `subaction` (operation) routing to expose all operations via one MCP tool, minimizing context window usage. Destructive actions require `confirm=True`.
 
-### Single Tool, 15 Domains, ~108 Actions
+### Primary Tool: 15 Domains, 107 Subactions
 
 Call pattern: `unraid(action="<domain>", subaction="<operation>")`

@@ -343,6 +343,8 @@ The server exposes two classes of MCP resources backed by persistent WebSocket c
 >
 > **`log_tail` and `notification_feed`** are accessible as tool subactions (`unraid(action="live", subaction="log_tail")`) but are not registered as MCP resources — they use transient one-shot subscriptions and require parameters.
+>
+> **Security note**: The `disk/logs` and `live/log_tail` subactions allow reading files under `/var/log/` and `/boot/logs/` on the Unraid server. Authenticated MCP clients can stream any log file within these directories.
 
 ---

@@ -372,7 +374,7 @@ unraid-mcp/
 │   │   ├── queries.py       # Subscription query constants
 │   │   ├── diagnostics.py   # Diagnostic tools
 │   │   └── utils.py         # Subscription utility functions
-│   └── tools/               # Single consolidated tool (~108 actions)
+│   └── tools/               # Consolidated tools (unraid: 107 subactions + 2 diagnostic tools)
 │       └── unraid.py        # All 15 domains in one file
 ├── tests/                   # Test suite
 │   ├── conftest.py          # Shared fixtures

docs/AUTHENTICATION.md

@@ -1,6 +1,6 @@
-# Google OAuth Setup Guide
+# Authentication Setup Guide
 
-This document explains how to protect the Unraid MCP HTTP server with Google OAuth 2.0 authentication using FastMCP's built-in `GoogleProvider`.
+This document covers both Google OAuth 2.0 and API key bearer token authentication for the Unraid MCP HTTP server. It explains how to protect the server using FastMCP's built-in `GoogleProvider` for OAuth, or a static bearer token for headless/machine access.
 
 ---

pyproject.toml

@@ -10,7 +10,7 @@ build-backend = "hatchling.build"
 # ============================================================================
 [project]
 name = "unraid-mcp"
-version = "1.1.1"
+version = "1.1.2"
 description = "MCP Server for Unraid API - provides tools to interact with an Unraid server's GraphQL API"
 readme = "README.md"
 license = {file = "LICENSE"}

@@ -258,6 +258,7 @@ omit = [
 ]
 
 [tool.coverage.report]
+fail_under = 80
 precision = 2
 show_missing = true
 skip_covered = false

scripts/validate-marketplace.sh

@@ -70,6 +70,20 @@ else
   echo -e "Checking: Plugin source path is valid... ${RED}✗${NC} (plugin not found in marketplace)"
 fi
 
+# Check version sync between pyproject.toml and plugin.json
+echo "Checking version sync..."
+TOML_VER=$(grep '^version = ' pyproject.toml | sed 's/version = "//;s/"//')
+PLUGIN_VER=$(python3 -c "import json; print(json.load(open('.claude-plugin/plugin.json'))['version'])" 2>/dev/null || echo "ERROR_READING")
+if [ "$TOML_VER" != "$PLUGIN_VER" ]; then
+  echo -e "${RED}FAIL: Version mismatch — pyproject.toml=$TOML_VER, plugin.json=$PLUGIN_VER${NC}"
+  CHECKS=$((CHECKS + 1))
+  FAILED=$((FAILED + 1))
+else
+  echo -e "${GREEN}PASS: Versions in sync ($TOML_VER)${NC}"
+  CHECKS=$((CHECKS + 1))
+  PASSED=$((PASSED + 1))
+fi
+
 echo ""
 echo "=== Results ==="
 echo -e "Total checks: $CHECKS"

@@ -5,7 +5,7 @@ from unittest.mock import MagicMock, patch
 
 import pytest
 
-from unraid_mcp.server import ApiKeyVerifier, _build_auth
+import unraid_mcp.server as srv
 
 # ---------------------------------------------------------------------------

@@ -16,7 +16,7 @@ from unraid_mcp.server import ApiKeyVerifier, _build_auth
 @pytest.mark.asyncio
 async def test_api_key_verifier_accepts_correct_key():
     """Returns AccessToken when the presented token matches the configured key."""
-    verifier = ApiKeyVerifier("secret-key-abc123")
+    verifier = srv.ApiKeyVerifier("secret-key-abc123")
     result = await verifier.verify_token("secret-key-abc123")
 
     assert result is not None

@@ -27,7 +27,7 @@ async def test_api_key_verifier_accepts_correct_key():
 @pytest.mark.asyncio
 async def test_api_key_verifier_rejects_wrong_key():
     """Returns None when the token does not match."""
-    verifier = ApiKeyVerifier("secret-key-abc123")
+    verifier = srv.ApiKeyVerifier("secret-key-abc123")
     result = await verifier.verify_token("wrong-key")
 
     assert result is None

@@ -36,7 +36,7 @@ async def test_api_key_verifier_rejects_wrong_key():
 @pytest.mark.asyncio
 async def test_api_key_verifier_rejects_empty_token():
     """Returns None for an empty string token."""
-    verifier = ApiKeyVerifier("secret-key-abc123")
+    verifier = srv.ApiKeyVerifier("secret-key-abc123")
     result = await verifier.verify_token("")
 
     assert result is None

@@ -50,7 +50,7 @@ async def test_api_key_verifier_empty_key_rejects_empty_token():
     should not be instantiated in that case. But if it is, it must not
     grant access via an empty bearer token.
     """
-    verifier = ApiKeyVerifier("")
+    verifier = srv.ApiKeyVerifier("")
     result = await verifier.verify_token("")
 
     assert result is None

@@ -72,7 +72,7 @@ def test_build_auth_returns_none_when_nothing_configured(monkeypatch):
     importlib.reload(s)
 
-    result = _build_auth()
+    result = srv._build_auth()
     assert result is None

@@ -87,8 +87,8 @@ def test_build_auth_returns_api_key_verifier_when_only_api_key_set(monkeypatch):
     importlib.reload(s)
 
-    result = _build_auth()
-    assert isinstance(result, ApiKeyVerifier)
+    result = srv._build_auth()
+    assert isinstance(result, srv.ApiKeyVerifier)
 
 
 def test_build_auth_returns_google_provider_when_only_oauth_set(monkeypatch):

@@ -105,7 +105,7 @@ def test_build_auth_returns_google_provider_when_only_oauth_set(monkeypatch):
     mock_provider = MagicMock()
     with patch("unraid_mcp.server.GoogleProvider", return_value=mock_provider):
-        result = _build_auth()
+        result = srv._build_auth()
 
     assert result is mock_provider

@@ -126,14 +126,14 @@ def test_build_auth_returns_multi_auth_when_both_configured(monkeypatch):
     mock_provider = MagicMock()
     with patch("unraid_mcp.server.GoogleProvider", return_value=mock_provider):
-        result = _build_auth()
+        result = srv._build_auth()
 
     assert isinstance(result, MultiAuth)
     # Server is the Google provider
     assert result.server is mock_provider
     # One additional verifier — the ApiKeyVerifier
     assert len(result.verifiers) == 1
-    assert isinstance(result.verifiers[0], ApiKeyVerifier)
+    assert isinstance(result.verifiers[0], srv.ApiKeyVerifier)
 
 
 def test_build_auth_multi_auth_api_key_verifier_uses_correct_key(monkeypatch):

@@ -149,7 +149,7 @@ def test_build_auth_multi_auth_api_key_verifier_uses_correct_key(monkeypatch):
     importlib.reload(s)
 
     with patch("unraid_mcp.server.GoogleProvider", return_value=MagicMock()):
-        result = _build_auth()
+        result = srv._build_auth()
 
     verifier = result.verifiers[0]
     assert verifier._api_key == "super-secret-token"

@@ -8,9 +8,10 @@ from unraid_mcp.server import _build_google_auth
 def test_build_google_auth_returns_none_when_unconfigured(monkeypatch):
     """Returns None when Google OAuth env vars are absent."""
-    monkeypatch.delenv("GOOGLE_CLIENT_ID", raising=False)
-    monkeypatch.delenv("GOOGLE_CLIENT_SECRET", raising=False)
-    monkeypatch.delenv("UNRAID_MCP_BASE_URL", raising=False)
+    # Use explicit empty values so dotenv reload cannot re-inject from ~/.unraid-mcp/.env.
+    monkeypatch.setenv("GOOGLE_CLIENT_ID", "")
+    monkeypatch.setenv("GOOGLE_CLIENT_SECRET", "")
+    monkeypatch.setenv("UNRAID_MCP_BASE_URL", "")
 
     import unraid_mcp.config.settings as s

@@ -42,6 +43,8 @@ def test_build_google_auth_returns_provider_when_configured(monkeypatch):
         client_id="test-id.apps.googleusercontent.com",
         client_secret="GOCSPX-test-secret",
         base_url="http://10.1.0.2:6970",
+        extra_authorize_params={"access_type": "online", "prompt": "consent"},
+        require_authorization_consent=False,
         jwt_signing_key="x" * 32,
     )

@@ -95,7 +98,7 @@ def test_mcp_instance_has_no_auth_by_default():
     import os
 
     for var in ("GOOGLE_CLIENT_ID", "GOOGLE_CLIENT_SECRET", "UNRAID_MCP_BASE_URL"):
-        os.environ.pop(var, None)
+        os.environ[var] = ""
 
     import importlib

@@ -64,12 +64,12 @@ class TestStorageValidation:
     async def test_logs_rejects_path_traversal(self, _mock_graphql: AsyncMock) -> None:
         tool_fn = _make_tool()
 
-        # Traversal that escapes /var/log/ to reach /etc/shadow
-        with pytest.raises(ToolError, match="log_path must start with"):
+        # Traversal that escapes /var/log/ — detected by early .. check
+        with pytest.raises(ToolError, match="log_path"):
             await tool_fn(action="disk", subaction="logs", log_path="/var/log/../../etc/shadow")
 
-        # Traversal that escapes /mnt/ to reach /etc/passwd
-        with pytest.raises(ToolError, match="log_path must start with"):
-            await tool_fn(action="disk", subaction="logs", log_path="/mnt/../etc/passwd")
+        # Traversal via .. — detected by early .. check
+        with pytest.raises(ToolError, match="log_path"):
+            await tool_fn(action="disk", subaction="logs", log_path="/var/log/../etc/passwd")
 
     async def test_logs_allows_valid_paths(self, _mock_graphql: AsyncMock) -> None:
         _mock_graphql.return_value = {"logFile": {"path": "/var/log/syslog", "content": "ok"}}

unraid_mcp/server.py

@@ -18,6 +18,7 @@ from fastmcp.server.middleware.response_limiting import ResponseLimitingMiddlewa
 from .config.logging import logger
 from .config.settings import (
+    LOG_LEVEL_STR,
     UNRAID_MCP_HOST,
     UNRAID_MCP_PORT,
     UNRAID_MCP_TRANSPORT,

@@ -44,7 +45,7 @@ _logging_middleware = LoggingMiddleware(
 # Tracks error_counts per (exception_type:method) for health diagnose.
 _error_middleware = ErrorHandlingMiddleware(
     logger=logger,
-    include_traceback=True,
+    include_traceback=LOG_LEVEL_STR == "DEBUG",
 )
 
 # 3. Unraid API rate limit: 100 requests per 10 seconds.

@@ -93,7 +94,9 @@ class ApiKeyVerifier(TokenVerifier):
         self._api_key = api_key
 
     async def verify_token(self, token: str) -> AccessToken | None:
-        if self._api_key and token == self._api_key:
+        import hmac
+
+        if self._api_key and hmac.compare_digest(token.encode(), self._api_key.encode()):
             return AccessToken(
                 token=token,
                 client_id="api-key-client",

@@ -131,6 +134,14 @@ def _build_google_auth() -> "GoogleProvider | None":
         "client_id": GOOGLE_CLIENT_ID,
         "client_secret": GOOGLE_CLIENT_SECRET,
         "base_url": UNRAID_MCP_BASE_URL,
+        # Prefer short-lived access tokens without refresh-token rotation churn.
+        # This reduces reconnect instability in MCP clients that re-auth frequently.
+        "extra_authorize_params": {"access_type": "online", "prompt": "consent"},
+        # Skip the FastMCP consent page — goes directly to Google.
+        # The consent page has a CSRF double-load race: two concurrent GET requests
+        # each regenerate the CSRF token, the second overwrites the first in the
+        # transaction store, and the POST fails with "Invalid or expired consent token".
+        "require_authorization_consent": False,
     }
     if UNRAID_MCP_JWT_SIGNING_KEY:
         kwargs["jwt_signing_key"] = UNRAID_MCP_JWT_SIGNING_KEY

unraid_mcp/subscriptions/diagnostics.py

@@ -15,8 +15,8 @@ import websockets
 from fastmcp import FastMCP
 from websockets.typing import Subprotocol
 
+from ..config import settings as _settings
 from ..config.logging import logger
-from ..config.settings import UNRAID_API_KEY, UNRAID_API_URL
 from ..core.exceptions import ToolError
 from ..core.utils import safe_display_url
 from .manager import subscription_manager

@@ -130,7 +130,7 @@ def register_diagnostic_tools(mcp: FastMCP) -> None:
                 json.dumps(
                     {
                         "type": "connection_init",
-                        "payload": {"x-api-key": UNRAID_API_KEY},
+                        "payload": {"x-api-key": _settings.UNRAID_API_KEY},
                     }
                 )
             )

@@ -203,7 +203,7 @@ def register_diagnostic_tools(mcp: FastMCP) -> None:
         # Calculate WebSocket URL
         ws_url_display: str | None = None
-        if UNRAID_API_URL:
+        if _settings.UNRAID_API_URL:
             try:
                 ws_url_display = build_ws_url()
             except ValueError:

@@ -215,8 +215,8 @@ def register_diagnostic_tools(mcp: FastMCP) -> None:
             "environment": {
                 "auto_start_enabled": subscription_manager.auto_start_enabled,
                 "max_reconnect_attempts": subscription_manager.max_reconnect_attempts,
-                "unraid_api_url": safe_display_url(UNRAID_API_URL),
-                "api_key_configured": bool(UNRAID_API_KEY),
+                "unraid_api_url": safe_display_url(_settings.UNRAID_API_URL),
+                "api_key_configured": bool(_settings.UNRAID_API_KEY),
                 "websocket_url": ws_url_display,
             },
             "subscriptions": status,

unraid_mcp/subscriptions/manager.py

@@ -15,8 +15,8 @@ from typing import Any
 import websockets
 from websockets.typing import Subprotocol
 
+from ..config import settings as _settings
 from ..config.logging import logger
-from ..config.settings import UNRAID_API_KEY
 from ..core.client import redact_sensitive
 from ..core.types import SubscriptionData
 from .utils import build_ws_ssl_context, build_ws_url

@@ -250,7 +250,7 @@ class SubscriptionManager:
         ws_url = build_ws_url()
         logger.debug(f"[WEBSOCKET:{subscription_name}] Connecting to: {ws_url}")
         logger.debug(
-            f"[WEBSOCKET:{subscription_name}] API Key present: {'Yes' if UNRAID_API_KEY else 'No'}"
+            f"[WEBSOCKET:{subscription_name}] API Key present: {'Yes' if _settings.UNRAID_API_KEY else 'No'}"
         )
 
         ssl_context = build_ws_ssl_context(ws_url)

@@ -287,10 +287,10 @@ class SubscriptionManager:
         init_type = "connection_init"
 
         init_payload: dict[str, Any] = {"type": init_type}
-        if UNRAID_API_KEY:
+        if _settings.UNRAID_API_KEY:
             logger.debug(f"[AUTH:{subscription_name}] Adding authentication payload")
             # Use graphql-ws connectionParams format (direct key, not nested headers)
-            init_payload["payload"] = {"x-api-key": UNRAID_API_KEY}
+            init_payload["payload"] = {"x-api-key": _settings.UNRAID_API_KEY}
         else:
             logger.warning(
                 f"[AUTH:{subscription_name}] No API key available for authentication"

unraid_mcp/subscriptions/snapshot.py

@@ -11,8 +11,6 @@ WebSocket per call. This is intentional: MCP tools are request-response.
 Use the SubscriptionManager for long-lived monitoring resources.
 """
 
-from __future__ import annotations
-
 import asyncio
 import json
 from typing import Any

@@ -20,8 +18,8 @@ from typing import Any
 import websockets
 from websockets.typing import Subprotocol
 
+from ..config import settings as _settings
 from ..config.logging import logger
-from ..config.settings import UNRAID_API_KEY
 from ..core.exceptions import ToolError
 from .utils import build_ws_ssl_context, build_ws_url

@@ -51,8 +49,8 @@ async def subscribe_once(
         # Handshake
         init: dict[str, Any] = {"type": "connection_init"}
-        if UNRAID_API_KEY:
-            init["payload"] = {"x-api-key": UNRAID_API_KEY}
+        if _settings.UNRAID_API_KEY:
+            init["payload"] = {"x-api-key": _settings.UNRAID_API_KEY}
         await ws.send(json.dumps(init))
 
         raw = await asyncio.wait_for(ws.recv(), timeout=timeout)

@@ -126,8 +124,8 @@ async def subscribe_collect(
         sub_id = "snapshot-1"
 
         init: dict[str, Any] = {"type": "connection_init"}
-        if UNRAID_API_KEY:
-            init["payload"] = {"x-api-key": UNRAID_API_KEY}
+        if _settings.UNRAID_API_KEY:
+            init["payload"] = {"x-api-key": _settings.UNRAID_API_KEY}
         await ws.send(json.dumps(init))
 
         raw = await asyncio.wait_for(ws.recv(), timeout=timeout)

View File

@@ -3,11 +3,11 @@
 import ssl as _ssl
 from typing import Any
-from ..config.settings import UNRAID_API_URL, UNRAID_VERIFY_SSL
+from ..config import settings as _settings
 def build_ws_url() -> str:
-    """Build a WebSocket URL from the configured UNRAID_API_URL.
+    """Build a WebSocket URL from the configured UNRAID_API_URL setting.
     Converts http(s) scheme to ws(s) and ensures /graphql path suffix.
@@ -17,19 +17,19 @@ def build_ws_url() -> str:
     Raises:
         ValueError: If UNRAID_API_URL is not configured or has an unrecognised scheme.
     """
-    if not UNRAID_API_URL:
+    if not _settings.UNRAID_API_URL:
         raise ValueError("UNRAID_API_URL is not configured")
-    if UNRAID_API_URL.startswith("https://"):
-        ws_url = "wss://" + UNRAID_API_URL[len("https://") :]
-    elif UNRAID_API_URL.startswith("http://"):
-        ws_url = "ws://" + UNRAID_API_URL[len("http://") :]
-    elif UNRAID_API_URL.startswith(("ws://", "wss://")):
-        ws_url = UNRAID_API_URL  # Already a WebSocket URL
+    if _settings.UNRAID_API_URL.startswith("https://"):
+        ws_url = "wss://" + _settings.UNRAID_API_URL[len("https://") :]
+    elif _settings.UNRAID_API_URL.startswith("http://"):
+        ws_url = "ws://" + _settings.UNRAID_API_URL[len("http://") :]
+    elif _settings.UNRAID_API_URL.startswith(("ws://", "wss://")):
+        ws_url = _settings.UNRAID_API_URL  # Already a WebSocket URL
     else:
         raise ValueError(
             f"UNRAID_API_URL must start with http://, https://, ws://, or wss://. "
-            f"Got: {UNRAID_API_URL[:20]}..."
+            f"Got: {_settings.UNRAID_API_URL[:20]}..."
         )
     if not ws_url.endswith("/graphql"):
@@ -45,13 +45,13 @@ def build_ws_ssl_context(ws_url: str) -> _ssl.SSLContext | None:
         ws_url: The WebSocket URL to connect to.
     Returns:
-        An SSLContext configured per UNRAID_VERIFY_SSL, or None for non-TLS URLs.
+        An SSLContext configured per _settings.UNRAID_VERIFY_SSL, or None for non-TLS URLs.
     """
     if not ws_url.startswith("wss://"):
         return None
-    if isinstance(UNRAID_VERIFY_SSL, str):
-        return _ssl.create_default_context(cafile=UNRAID_VERIFY_SSL)
-    if UNRAID_VERIFY_SSL:
+    if isinstance(_settings.UNRAID_VERIFY_SSL, str):
+        return _ssl.create_default_context(cafile=_settings.UNRAID_VERIFY_SSL)
+    if _settings.UNRAID_VERIFY_SSL:
         return _ssl.create_default_context()
     # Explicitly disable verification (equivalent to verify=False)
     ctx = _ssl.SSLContext(_ssl.PROTOCOL_TLS_CLIENT)
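The scheme rewrite above is pure string manipulation, which makes it easy to check in isolation. A condensed standalone sketch of the same conversion (not the module's actual function, which reads the URL from settings):

```python
def to_ws_url(api_url: str) -> str:
    """Convert an http(s) GraphQL endpoint to its ws(s) equivalent."""
    if not api_url:
        raise ValueError("API URL is not configured")
    if api_url.startswith("https://"):
        ws_url = "wss://" + api_url[len("https://"):]
    elif api_url.startswith("http://"):
        ws_url = "ws://" + api_url[len("http://"):]
    elif api_url.startswith(("ws://", "wss://")):
        ws_url = api_url  # already a WebSocket URL
    else:
        raise ValueError(f"Unsupported scheme: {api_url[:20]}...")
    if not ws_url.endswith("/graphql"):
        ws_url = ws_url.rstrip("/") + "/graphql"
    return ws_url
```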

View File

@@ -21,7 +21,6 @@ Actions:
     live - Real-time WebSocket subscription snapshots (11 subactions)
 """
-import asyncio
 import datetime
 import os
 import re
@@ -32,7 +31,7 @@ from fastmcp import Context, FastMCP
 from ..config.logging import logger
 from ..core.client import DISK_TIMEOUT, make_graphql_request
-from ..core.exceptions import ToolError, tool_error_handler
+from ..core.exceptions import CredentialsNotConfiguredError, ToolError, tool_error_handler
 from ..core.guards import gate_destructive_action
 from ..core.setup import elicit_and_configure, elicit_reset_confirmation
 from ..core.utils import format_bytes, format_kb, safe_get
@@ -110,7 +109,7 @@ _SYSTEM_QUERIES: dict[str, str] = {
     "servers": "query GetServers { servers { id name status wanip lanip localurl remoteurl } }",
     "flash": "query GetFlash { flash { id vendor product } }",
     "ups_devices": "query GetUpsDevices { upsDevices { id name model status battery { chargeLevel estimatedRuntime health } power { loadPercentage inputVoltage outputVoltage } } }",
-    "ups_device": "query GetUpsDevice($id: String!) { upsDeviceById(id: $id) { id name model status battery { chargeLevel estimatedRuntime health } power { loadPercentage inputVoltage outputVoltage nominalPower currentPower } } }",
+    "ups_device": "query GetUpsDevice($id: String!) { upsDeviceById(id: $id) { id name model status battery { chargeLevel estimatedRuntime health } power { loadPercentage inputVoltage outputVoltage } } }",
     "ups_config": "query GetUpsConfig { upsConfiguration { service upsCable upsType device batteryLevel minutes timeout killUps upsName } }",
 }
@@ -505,6 +504,8 @@ async def _comprehensive_health_check() -> dict[str, Any]:
         }
         return health_info
+    except CredentialsNotConfiguredError:
+        raise  # Let tool_error_handler convert to setup instructions
     except Exception as e:
         logger.error(f"Health check failed: {e}", exc_info=True)
         return {
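Placing the narrow `except` before the broad one is what lets the credentials error escape the health check instead of being swallowed into a generic "unhealthy" result. A minimal demonstration of the pattern, with stand-in names:

```python
class CredentialsNotConfiguredError(Exception):
    """Stand-in for the project's credentials error."""


def health_check(configured: bool) -> dict[str, str]:
    try:
        if not configured:
            raise CredentialsNotConfiguredError("no credentials")
        return {"status": "healthy"}
    except CredentialsNotConfiguredError:
        raise  # let the outer handler turn this into setup instructions
    except Exception as e:  # everything else degrades to an unhealthy report
        return {"status": "unhealthy", "error": str(e)}
```

Python matches `except` clauses in order, so the specific handler must come first; swapping the two clauses would silently reintroduce the original bug.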
@@ -625,7 +626,7 @@ _DISK_MUTATIONS: dict[str, str] = {
 _DISK_SUBACTIONS: set[str] = set(_DISK_QUERIES) | set(_DISK_MUTATIONS)
 _DISK_DESTRUCTIVE: set[str] = {"flash_backup"}
-_ALLOWED_LOG_PREFIXES = ("/var/log/", "/boot/logs/", "/mnt/")
+_ALLOWED_LOG_PREFIXES = ("/var/log/", "/boot/logs/")
 _MAX_TAIL_LINES = 10_000
@@ -662,7 +663,10 @@ async def _handle_disk(
         raise ToolError(f"tail_lines must be between 1 and {_MAX_TAIL_LINES}, got {tail_lines}")
     if not log_path:
         raise ToolError("log_path is required for disk/logs")
-    normalized = await asyncio.to_thread(os.path.realpath, log_path)
+    # Validate without filesystem access — path is consumed on the remote Unraid server
+    if ".." in log_path:
+        raise ToolError("log_path must not contain path traversal sequences (../)")
+    normalized = os.path.normpath(log_path)  # noqa: ASYNC240 — pure string normalization, no I/O
     if not any(normalized.startswith(p) for p in _ALLOWED_LOG_PREFIXES):
         raise ToolError(f"log_path must start with one of: {', '.join(_ALLOWED_LOG_PREFIXES)}")
     log_path = normalized
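`os.path.normpath` collapses `../` segments lexically, with no filesystem access, so the prefix check runs against the path an attacker would actually reach; the explicit `..` pre-check additionally rejects any traversal attempt outright, even one that would normalize back inside an allowed prefix. A self-contained demonstration of the same two-step validation (POSIX paths assumed):

```python
import os

# Mirrors the allow-list above, after /mnt/ was removed
ALLOWED = ("/var/log/", "/boot/logs/")


def is_allowed_log_path(path: str) -> bool:
    if ".." in path:
        return False  # reject traversal attempts before normalization
    normalized = os.path.normpath(path)
    return any(normalized.startswith(p) for p in ALLOWED)
```

A naive check on the raw path would pass `/var/log/../../etc/passwd` (it starts with `/var/log/`), which is exactly what normalization first catches.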
@@ -674,6 +678,14 @@ async def _handle_disk(
         raise ToolError("source_path is required for disk/flash_backup")
     if not destination_path:
         raise ToolError("destination_path is required for disk/flash_backup")
+    # Validate paths — flash backup source must come from /boot/ only
+    if ".." in source_path:
+        raise ToolError("source_path must not contain path traversal sequences (../)")
+    _norm_src = os.path.normpath(source_path)  # noqa: ASYNC240 — pure string normalization, no I/O
+    if not (_norm_src == "/boot" or _norm_src.startswith("/boot/")):
+        raise ToolError("source_path must start with /boot/ (flash drive only)")
+    if ".." in destination_path:
+        raise ToolError("destination_path must not contain path traversal sequences (../)")
     input_data: dict[str, Any] = {
         "remoteName": remote_name,
         "sourcePath": source_path,
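The compound condition `_norm_src == "/boot" or _norm_src.startswith("/boot/")` is deliberate: a bare `startswith("/boot")` would also accept sibling directories such as `/bootleg`. A small sketch of the boundary check in isolation:

```python
import os


def is_flash_source(path: str) -> bool:
    """Accept /boot itself or anything strictly under /boot/."""
    if ".." in path:
        return False
    norm = os.path.normpath(path)
    return norm == "/boot" or norm.startswith("/boot/")
```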
@@ -1629,11 +1641,14 @@ async def _handle_user(subaction: str) -> dict[str, Any]:
 # ===========================================================================
 # LIVE (subscriptions)
 # ===========================================================================
-_LIVE_ALLOWED_LOG_PREFIXES = ("/var/log/", "/boot/logs/", "/mnt/")
+_LIVE_ALLOWED_LOG_PREFIXES = _ALLOWED_LOG_PREFIXES
 async def _handle_live(
-    subaction: str, path: str | None, collect_for: float, timeout: float
+    subaction: str,
+    path: str | None,
+    collect_for: float,
+    timeout: float,  # noqa: ASYNC109
 ) -> dict[str, Any]:
     from ..subscriptions.queries import COLLECT_ACTIONS, EVENT_DRIVEN_ACTIONS, SNAPSHOT_ACTIONS
     from ..subscriptions.snapshot import subscribe_collect, subscribe_once
@@ -1647,7 +1662,10 @@ async def _handle_live(
     if subaction == "log_tail":
         if not path:
             raise ToolError("path is required for live/log_tail")
-        normalized = await asyncio.to_thread(os.path.realpath, path)
+        # Validate without filesystem access — path is consumed on the remote Unraid server
+        if ".." in path:
+            raise ToolError("path must not contain path traversal sequences (../)")
+        normalized = os.path.normpath(path)  # noqa: ASYNC240 — pure string normalization, no I/O
         if not any(normalized.startswith(p) for p in _LIVE_ALLOWED_LOG_PREFIXES):
             raise ToolError(f"path must start with one of: {', '.join(_LIVE_ALLOWED_LOG_PREFIXES)}")
         path = normalized
@@ -1787,7 +1805,7 @@ def register_unraid_tool(mcp: FastMCP) -> None:
         # live
         path: str | None = None,
         collect_for: float = 5.0,
-        timeout: float = 10.0,
+        timeout: float = 10.0,  # noqa: ASYNC109
     ) -> dict[str, Any] | str:
         """Interact with an Unraid server's GraphQL API.
@@ -1804,7 +1822,7 @@ def register_unraid_tool(mcp: FastMCP) -> None:
         │ health          │ check, test_connection, diagnose, setup                             │
         ├─────────────────┼─────────────────────────────────────────────────────────────────────┤
         │ array           │ parity_status, parity_history, parity_start, parity_pause,          │
-        │                 │ parity_resume, parity_cancel, start_array*, stop_array*,            │
+        │                 │ parity_resume, parity_cancel, start_array, stop_array*,             │
         │                 │ add_disk, remove_disk*, mount_disk, unmount_disk, clear_disk_stats* │
         ├─────────────────┼─────────────────────────────────────────────────────────────────────┤
         │ disk            │ shares, disks, disk_details, log_files, logs, flash_backup*         │