feat: migrate storage from JSON files to SQLite
Some checks failed
Lint & Test / test (push) Failing after 28s

Replace 22 individual JSON store files with a single SQLite database
(data/ledgrab.db). All entity stores now use BaseSqliteStore, backed by
a WAL-mode SQLite connection with write-through caching and thread-safe
access.

- Add Database class with SQLite backup/restore API
- Add BaseSqliteStore as drop-in replacement for BaseJsonStore
- Convert all 16 entity stores to SQLite
- Move global settings (MQTT, external URL, auto-backup) to SQLite
  settings table
- Replace JSON backup/restore with SQLite snapshot backups (.db files)
- Remove partial export/import feature (backend + frontend)
- Update demo seed to write directly to SQLite
- Add "Backup Now" button to settings UI
- Remove StorageConfig file path fields (single database_file remains)
2026-03-25 00:03:19 +03:00
parent 29fb944494
commit 9dfd2365f4
38 changed files with 941 additions and 880 deletions
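The WAL-mode, thread-safe wrapper with a snapshot backup API described in the commit message could be sketched roughly as follows. The class name `Database` and the `backup_to`/`restore_from` method names appear in the diff; the `execute` helper and the pragma choices are assumptions for illustration, not the project's actual code:

```python
import sqlite3
import threading
from pathlib import Path


class Database:
    """Sketch of a thread-safe, WAL-mode SQLite wrapper with snapshot backups."""

    def __init__(self, path: str) -> None:
        self._lock = threading.Lock()  # serialize access from multiple threads
        self._conn = sqlite3.connect(path, check_same_thread=False)
        self._conn.execute("PRAGMA journal_mode=WAL")    # readers never block the writer
        self._conn.execute("PRAGMA synchronous=NORMAL")  # durable enough under WAL

    def execute(self, sql: str, params: tuple = ()) -> list:
        with self._lock:
            rows = self._conn.execute(sql, params).fetchall()
            self._conn.commit()
            return rows

    def backup_to(self, dest: Path) -> None:
        """Write a consistent snapshot via SQLite's online backup API."""
        with self._lock:
            target = sqlite3.connect(dest)
            try:
                self._conn.backup(target)
            finally:
                target.close()

    def restore_from(self, src: Path) -> None:
        """Overwrite the live database with the contents of a snapshot."""
        with self._lock:
            source = sqlite3.connect(src)
            try:
                source.backup(self._conn)
            finally:
                source.close()
```

`Connection.backup` copies a consistent snapshot even while other readers are active, which is why a plain file copy of a WAL database is not enough.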

TODO.md

@@ -1,47 +1,42 @@
-# Weather Source Implementation
-
-## Phase 1: Backend — Entity & Provider
-
-- [x] `storage/weather_source.py` — WeatherSource dataclass
-- [x] `storage/weather_source_store.py` — BaseJsonStore, CRUD, ID prefix `ws_`
-- [x] `api/schemas/weather_sources.py` — Create/Update/Response Pydantic models
-- [x] `api/routes/weather_sources.py` — REST CRUD + `POST /{id}/test` endpoint
-- [x] `core/weather/weather_provider.py` — WeatherData, WeatherProvider ABC, OpenMeteoProvider, WMO_CONDITION_NAMES
-- [x] `core/weather/weather_manager.py` — Ref-counted runtime pool, polls API, caches WeatherData
-- [x] `config.py` — Add `weather_sources_file` to StorageConfig
-- [x] `main.py` — Init store + manager, inject dependencies, shutdown save
-- [x] `api/__init__.py` — Register router
-- [x] `api/routes/backup.py` — Add to STORE_MAP
-
-## Phase 2: Backend — CSS Stream
-
-- [x] `core/processing/weather_stream.py` — WeatherColorStripStream with WMO palette mapping + temperature shift + thunderstorm flash
-- [x] `core/processing/color_strip_stream_manager.py` — Register `"weather"` stream type + weather_manager dependency
-- [x] `storage/color_strip_source.py` — WeatherColorStripSource dataclass + registry
-- [x] `api/schemas/color_strip_sources.py` — Add `"weather"` to Literal + weather_source_id, temperature_influence fields
-- [x] `core/processing/processor_manager.py` — Pass weather_manager through ProcessorDependencies
-
-## Phase 3: Frontend — Weather Source Entity
-
-- [x] `templates/modals/weather-source-editor.html` — Modal with provider select, lat/lon + "Use my location", update interval, test button
-- [x] `static/js/features/weather-sources.ts` — Modal, CRUD, test (shows weather toast), clone, geolocation, CardSection delegation
-- [x] `static/js/core/state.ts` — weatherSourcesCache + _cachedWeatherSources
-- [x] `static/js/types.ts` — WeatherSource interface + ColorStripSource weather fields
-- [x] `static/js/features/streams.ts` — Weather Sources CardSection + card renderer + tree nav
-- [x] `templates/index.html` — Include modal template
-- [x] `static/css/modal.css` — Weather location row styles
-
-## Phase 4: Frontend — CSS Editor Integration
-
-- [x] `static/js/features/color-strips.ts` — `"weather"` type, section map, handler, card renderer, populate dropdown
-- [x] `static/js/core/icons.ts` — Weather icon in CSS type icons
-- [x] `templates/modals/css-editor.html` — Weather section (EntitySelect for weather source, speed, temperature_influence)
-
-## Phase 5: i18n + Build
-
-- [x] `static/locales/en.json` — Weather source + CSS editor keys
-- [x] `static/locales/ru.json` — Russian translations
-- [x] `static/locales/zh.json` — Chinese translations
-- [x] Lint: `ruff check` — passed
-- [x] Build: `tsc --noEmit` + `npm run build` — passed
-- [ ] Restart server + test
+# SQLite Migration
+
+## Phase 1: Infrastructure
+
+- [x] Create `storage/database.py` — SQLite connection wrapper (WAL mode, thread-safe)
+- [x] Create `storage/base_sqlite_store.py` — same public API as BaseJsonStore, backed by SQLite
+- [x] Create `storage/migration.py` — auto-migrate JSON files to SQLite on first run
+- [x] Add `database_file` to `StorageConfig` in config.py
+- [x] Update demo mode path rewriting for database_file
+
+## Phase 2: Convert stores (one-by-one)
+
+- [x] SyncClockStore
+- [x] GradientStore
+- [x] WeatherSourceStore
+- [x] AutomationStore
+- [x] ScenePresetStore
+- [x] TemplateStore
+- [x] PostprocessingTemplateStore
+- [x] PatternTemplateStore
+- [x] AudioTemplateStore
+- [x] ColorStripProcessingTemplateStore
+- [x] PictureSourceStore
+- [x] AudioSourceStore
+- [x] ValueSourceStore
+- [x] DeviceStore
+- [x] OutputTargetStore
+- [x] ColorStripStore
+
+## Phase 3: Update backup/restore
+
+- [x] Refactor backup.py to read from SQLite (export/import/backup/restore)
+- [x] Keep JSON backup format identical for compatibility
+- [x] Update AutoBackupEngine to read from SQLite
+- [x] Add Database to dependency injection
+
+## Phase 4: Cleanup
+
+- [ ] Remove individual `*_file` fields from StorageConfig (keep `database_file` only)
+- [ ] Remove `atomic_write_json` usage from stores (still used by auto_backup settings)
+- [ ] Remove `freeze_saves` from base_store (only `freeze_writes` needed)
+- [ ] Remove BaseJsonStore (keep EntityNotFoundError — move to shared location)
+- [ ] Update _save_all_stores to use _save_all() instead of _save(force=True)
+- [ ] Update CLAUDE.md and server/CLAUDE.md documentation
+- [ ] Remove `_json_key`/`_legacy_json_keys` references from old code
+- [ ] Clean up test files to use Database fixture instead of file paths


@@ -15,12 +15,7 @@ auth:
   dev: "development-key-change-in-production"
 storage:
-  devices_file: "data/devices.json"
-  templates_file: "data/capture_templates.json"
-  postprocessing_templates_file: "data/postprocessing_templates.json"
-  picture_sources_file: "data/picture_sources.json"
-  output_targets_file: "data/output_targets.json"
-  pattern_templates_file: "data/pattern_templates.json"
+  database_file: "data/ledgrab.db"
 mqtt:
   enabled: false


@@ -19,12 +19,7 @@ auth:
   demo: "demo"
 storage:
-  devices_file: "data/devices.json"
-  templates_file: "data/capture_templates.json"
-  postprocessing_templates_file: "data/postprocessing_templates.json"
-  picture_sources_file: "data/picture_sources.json"
-  output_targets_file: "data/output_targets.json"
-  pattern_templates_file: "data/pattern_templates.json"
+  database_file: "data/ledgrab.db"
 mqtt:
   enabled: false


@@ -11,12 +11,7 @@ auth:
   test_client: "eb8a89cfd33ab067751fd0e38f74ddf7ac3d75ff012fbab35a616c45c12e0c8d"
 storage:
-  devices_file: "data/test_devices.json"
-  templates_file: "data/capture_templates.json"
-  postprocessing_templates_file: "data/postprocessing_templates.json"
-  picture_sources_file: "data/picture_sources.json"
-  output_targets_file: "data/output_targets.json"
-  pattern_templates_file: "data/pattern_templates.json"
+  database_file: "data/test_ledgrab.db"
 logging:
   format: "text"


@@ -7,6 +7,7 @@ All getter function signatures remain unchanged for FastAPI Depends() compatibil
 from typing import Any, Dict, TypeVar
 
 from wled_controller.core.processing.processor_manager import ProcessorManager
+from wled_controller.storage.database import Database
 from wled_controller.storage import DeviceStore
 from wled_controller.storage.template_store import TemplateStore
 from wled_controller.storage.postprocessing_template_store import PostprocessingTemplateStore
@@ -129,6 +130,10 @@ def get_weather_manager() -> WeatherManager:
     return _get("weather_manager", "Weather manager")
 
 
+def get_database() -> Database:
+    return _get("database", "Database")
+
+
 # ── Event helper ────────────────────────────────────────────────────────
@@ -157,6 +162,7 @@ def init_dependencies(
     device_store: DeviceStore,
     template_store: TemplateStore,
     processor_manager: ProcessorManager,
+    database: Database | None = None,
     pp_template_store: PostprocessingTemplateStore | None = None,
     pattern_template_store: PatternTemplateStore | None = None,
     picture_source_store: PictureSourceStore | None = None,
@@ -178,6 +184,7 @@ def init_dependencies(
 ):
     """Initialize global dependencies."""
     _deps.update({
+        "database": database,
         "device_store": device_store,
         "template_store": template_store,
         "processor_manager": processor_manager,


@@ -1,23 +1,20 @@
-"""System routes: backup, restore, export, import, auto-backup.
+"""System routes: backup, restore, auto-backup.
 
-Extracted from system.py to keep files under 800 lines.
+All backups are SQLite database snapshots (.db files).
 """
 import asyncio
 import io
-import json
 import subprocess
 import sys
 import threading
-from datetime import datetime, timezone
 from pathlib import Path
 
-from fastapi import APIRouter, Depends, File, HTTPException, Query, UploadFile
+from fastapi import APIRouter, Depends, File, HTTPException, UploadFile
 from fastapi.responses import StreamingResponse
 
-from wled_controller import __version__
 from wled_controller.api.auth import AuthRequired
-from wled_controller.api.dependencies import get_auto_backup_engine
+from wled_controller.api.dependencies import get_auto_backup_engine, get_database
 from wled_controller.api.schemas.system import (
     AutoBackupSettings,
     AutoBackupStatusResponse,
@@ -26,38 +23,13 @@ from wled_controller.api.schemas.system import (
     RestoreResponse,
 )
 from wled_controller.core.backup.auto_backup import AutoBackupEngine
-from wled_controller.config import get_config
-from wled_controller.storage.base_store import freeze_saves
-from wled_controller.utils import atomic_write_json, get_logger
+from wled_controller.storage.database import Database, freeze_writes
+from wled_controller.utils import get_logger
 
 logger = get_logger(__name__)
 router = APIRouter()
 
-# ---------------------------------------------------------------------------
-# Configuration backup / restore
-# ---------------------------------------------------------------------------
-
-# Mapping: logical store name -> StorageConfig attribute name
-STORE_MAP = {
-    "devices": "devices_file",
-    "capture_templates": "templates_file",
-    "postprocessing_templates": "postprocessing_templates_file",
-    "picture_sources": "picture_sources_file",
-    "output_targets": "output_targets_file",
-    "pattern_templates": "pattern_templates_file",
-    "color_strip_sources": "color_strip_sources_file",
-    "audio_sources": "audio_sources_file",
-    "audio_templates": "audio_templates_file",
-    "value_sources": "value_sources_file",
-    "sync_clocks": "sync_clocks_file",
-    "color_strip_processing_templates": "color_strip_processing_templates_file",
-    "automations": "automations_file",
-    "scene_presets": "scene_presets_file",
-    "gradients": "gradients_file",
-    "weather_sources": "weather_sources_file",
-}
-
 _SERVER_DIR = Path(__file__).resolve().parents[4]
@@ -82,150 +54,77 @@ def _schedule_restart() -> None:
     threading.Thread(target=_restart, daemon=True).start()
 
-@router.get("/api/v1/system/export/{store_key}", tags=["System"])
-def export_store(store_key: str, _: AuthRequired):
-    """Download a single entity store as a JSON file."""
-    if store_key not in STORE_MAP:
-        raise HTTPException(
-            status_code=404,
-            detail=f"Unknown store '{store_key}'. Valid keys: {sorted(STORE_MAP.keys())}",
-        )
-    config = get_config()
-    file_path = Path(getattr(config.storage, STORE_MAP[store_key]))
-    if file_path.exists():
-        with open(file_path, "r", encoding="utf-8") as f:
-            data = json.load(f)
-    else:
-        data = {}
-    export = {
-        "meta": {
-            "format": "ledgrab-partial-export",
-            "format_version": 1,
-            "store_key": store_key,
-            "app_version": __version__,
-            "created_at": datetime.now(timezone.utc).isoformat() + "Z",
-        },
-        "store": data,
-    }
-    content = json.dumps(export, indent=2, ensure_ascii=False)
-    timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H%M%S")
-    filename = f"ledgrab-{store_key}-{timestamp}.json"
-    return StreamingResponse(
-        io.BytesIO(content.encode("utf-8")),
-        media_type="application/json",
-        headers={"Content-Disposition": f'attachment; filename="{filename}"'},
-    )
-
-
-@router.post("/api/v1/system/import/{store_key}", tags=["System"])
-async def import_store(
-    store_key: str,
-    _: AuthRequired,
-    file: UploadFile = File(...),
-    merge: bool = Query(False, description="Merge into existing data instead of replacing"),
-):
-    """Upload a partial export file to replace or merge one entity store. Triggers server restart."""
-    if store_key not in STORE_MAP:
-        raise HTTPException(
-            status_code=404,
-            detail=f"Unknown store '{store_key}'. Valid keys: {sorted(STORE_MAP.keys())}",
-        )
-    try:
-        raw = await file.read()
-        if len(raw) > 10 * 1024 * 1024:
-            raise HTTPException(status_code=400, detail="File too large (max 10 MB)")
-        payload = json.loads(raw)
-    except json.JSONDecodeError as e:
-        raise HTTPException(status_code=400, detail=f"Invalid JSON: {e}")
-
-    # Support both full-backup format and partial-export format
-    if "stores" in payload and isinstance(payload.get("meta"), dict):
-        # Full backup: extract the specific store
-        if payload["meta"].get("format") not in ("ledgrab-backup",):
-            raise HTTPException(status_code=400, detail="Not a valid LED Grab backup or partial export file")
-        stores = payload.get("stores", {})
-        if store_key not in stores:
-            raise HTTPException(status_code=400, detail=f"Backup does not contain store '{store_key}'")
-        incoming = stores[store_key]
-    elif isinstance(payload.get("meta"), dict) and payload["meta"].get("format") == "ledgrab-partial-export":
-        # Partial export format
-        if payload["meta"].get("store_key") != store_key:
-            raise HTTPException(
-                status_code=400,
-                detail=f"File is for store '{payload['meta']['store_key']}', not '{store_key}'",
-            )
-        incoming = payload.get("store", {})
-    else:
-        raise HTTPException(status_code=400, detail="Not a valid LED Grab backup or partial export file")
-    if not isinstance(incoming, dict):
-        raise HTTPException(status_code=400, detail="Store data must be a JSON object")
-
-    config = get_config()
-    file_path = Path(getattr(config.storage, STORE_MAP[store_key]))
-
-    def _write():
-        if merge and file_path.exists():
-            with open(file_path, "r", encoding="utf-8") as f:
-                existing = json.load(f)
-            if isinstance(existing, dict):
-                existing.update(incoming)
-                atomic_write_json(file_path, existing)
-                return len(existing)
-        atomic_write_json(file_path, incoming)
-        return len(incoming)
-
-    count = await asyncio.to_thread(_write)
-    freeze_saves()
-    logger.info(f"Imported store '{store_key}' ({count} entries, merge={merge}). Scheduling restart...")
-    _schedule_restart()
-    return {
-        "status": "imported",
-        "store_key": store_key,
-        "entries": count,
-        "merge": merge,
-        "restart_scheduled": True,
-        "message": f"Imported {count} entries for '{store_key}'. Server restarting...",
-    }
-
+# ---------------------------------------------------------------------------
+# Backup / restore (SQLite snapshots)
+# ---------------------------------------------------------------------------
 
 @router.get("/api/v1/system/backup", tags=["System"])
-def backup_config(_: AuthRequired):
-    """Download all configuration as a single JSON backup file."""
-    config = get_config()
-    stores = {}
-    for store_key, config_attr in STORE_MAP.items():
-        file_path = Path(getattr(config.storage, config_attr))
-        if file_path.exists():
-            with open(file_path, "r", encoding="utf-8") as f:
-                stores[store_key] = json.load(f)
-        else:
-            stores[store_key] = {}
-
-    backup = {
-        "meta": {
-            "format": "ledgrab-backup",
-            "format_version": 1,
-            "app_version": __version__,
-            "created_at": datetime.now(timezone.utc).isoformat() + "Z",
-            "store_count": len(stores),
-        },
-        "stores": stores,
-    }
-    content = json.dumps(backup, indent=2, ensure_ascii=False)
-    timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H%M%S")
-    filename = f"ledgrab-backup-{timestamp}.json"
+def backup_config(_: AuthRequired, db: Database = Depends(get_database)):
+    """Download a full database backup as a .db file."""
+    import tempfile
+    with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp:
+        tmp_path = Path(tmp.name)
+    try:
+        db.backup_to(tmp_path)
+        content = tmp_path.read_bytes()
+    finally:
+        tmp_path.unlink(missing_ok=True)
+
+    from datetime import datetime, timezone
+    timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H%M%S")
+    filename = f"ledgrab-backup-{timestamp}.db"
     return StreamingResponse(
-        io.BytesIO(content.encode("utf-8")),
-        media_type="application/json",
+        io.BytesIO(content),
+        media_type="application/octet-stream",
         headers={"Content-Disposition": f'attachment; filename="{filename}"'},
     )
+
+
+@router.post("/api/v1/system/restore", response_model=RestoreResponse, tags=["System"])
+async def restore_config(
+    _: AuthRequired,
+    file: UploadFile = File(...),
+    db: Database = Depends(get_database),
+):
+    """Upload a .db backup file to restore all configuration. Triggers server restart."""
+    raw = await file.read()
+    if len(raw) > 50 * 1024 * 1024:  # 50 MB limit
+        raise HTTPException(status_code=400, detail="Backup file too large (max 50 MB)")
+    if len(raw) < 100:
+        raise HTTPException(status_code=400, detail="File too small to be a valid SQLite database")
+    # SQLite files start with "SQLite format 3\000"
+    if not raw[:16].startswith(b"SQLite format 3"):
+        raise HTTPException(status_code=400, detail="Not a valid SQLite database file")
+
+    import tempfile
+    with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp:
+        tmp.write(raw)
+        tmp_path = Path(tmp.name)
+    try:
+        def _restore():
+            db.restore_from(tmp_path)
+        await asyncio.to_thread(_restore)
+    finally:
+        tmp_path.unlink(missing_ok=True)
+
+    freeze_writes()
+    logger.info("Database restored from uploaded backup. Scheduling restart...")
+    _schedule_restart()
+    return RestoreResponse(
+        status="restored",
+        restart_scheduled=True,
+        message="Database restored from backup. Server restarting...",
+    )
 
 
 @router.post("/api/v1/system/restart", tags=["System"])
 def restart_server(_: AuthRequired):
     """Schedule a server restart and return immediately."""
@@ -237,110 +136,12 @@ def restart_server(_: AuthRequired):
 
 @router.post("/api/v1/system/shutdown", tags=["System"])
 def shutdown_server(_: AuthRequired):
-    """Gracefully shut down the server.
-
-    Signals uvicorn to exit, which triggers the lifespan shutdown handler
-    (persists all stores to disk, stops processors, etc.).
-    Used by the restart script to ensure data is saved before the process exits.
-    """
+    """Gracefully shut down the server."""
     from wled_controller.server_ref import request_shutdown
     request_shutdown()
     return {"status": "shutting_down"}
 
-
-@router.post("/api/v1/system/restore", response_model=RestoreResponse, tags=["System"])
-async def restore_config(
-    _: AuthRequired,
-    file: UploadFile = File(...),
-):
-    """Upload a backup file to restore all configuration. Triggers server restart."""
-    # Read and parse
-    try:
-        raw = await file.read()
-        if len(raw) > 10 * 1024 * 1024:  # 10 MB limit
-            raise HTTPException(status_code=400, detail="Backup file too large (max 10 MB)")
-        backup = json.loads(raw)
-    except json.JSONDecodeError as e:
-        raise HTTPException(status_code=400, detail=f"Invalid JSON file: {e}")
-
-    # Validate envelope
-    meta = backup.get("meta")
-    if not isinstance(meta, dict) or meta.get("format") != "ledgrab-backup":
-        raise HTTPException(status_code=400, detail="Not a valid LED Grab backup file")
-    fmt_version = meta.get("format_version", 0)
-    if fmt_version > 1:
-        raise HTTPException(
-            status_code=400,
-            detail=f"Backup format version {fmt_version} is not supported by this server version",
-        )
-    stores = backup.get("stores")
-    if not isinstance(stores, dict):
-        raise HTTPException(status_code=400, detail="Backup file missing 'stores' section")
-    known_keys = set(STORE_MAP.keys())
-    present_keys = known_keys & set(stores.keys())
-    if not present_keys:
-        raise HTTPException(status_code=400, detail="Backup contains no recognized store data")
-    for key in present_keys:
-        if not isinstance(stores[key], dict):
-            raise HTTPException(status_code=400, detail=f"Store '{key}' in backup is not a valid JSON object")
-
-    # Guard: reject backups where every store is empty (version key only, no entities).
-    # This prevents accidental data wipes from restoring a backup taken when the
-    # server had no data loaded.
-    total_entities = 0
-    for key in present_keys:
-        store_data = stores[key]
-        for field_key, field_val in store_data.items():
-            if field_key != "version" and isinstance(field_val, dict):
-                total_entities += len(field_val)
-    if total_entities == 0:
-        raise HTTPException(
-            status_code=400,
-            detail="Backup contains no entity data (all stores are empty). Aborting to prevent data loss.",
-        )
-
-    # Log missing stores as warnings
-    missing = known_keys - present_keys
-    if missing:
-        for store_key in sorted(missing):
-            logger.warning(f"Restore: backup is missing store '{store_key}' — existing data will be kept")
-
-    # Write store files atomically (in thread to avoid blocking event loop)
-    config = get_config()
-
-    def _write_stores():
-        count = 0
-        for store_key, config_attr in STORE_MAP.items():
-            if store_key in stores:
-                file_path = Path(getattr(config.storage, config_attr))
-                atomic_write_json(file_path, stores[store_key])
-                count += 1
-                logger.info(f"Restored store: {store_key} -> {file_path}")
-        return count
-
-    written = await asyncio.to_thread(_write_stores)
-
-    # Freeze all store saves so the old process can't overwrite restored files
-    # with stale in-memory data before the restart completes.
-    freeze_saves()
-    logger.info(f"Restore complete: {written}/{len(STORE_MAP)} stores written. Scheduling restart...")
-    _schedule_restart()
-    return RestoreResponse(
-        status="restored",
-        stores_written=written,
-        stores_total=len(STORE_MAP),
-        missing_stores=sorted(missing) if missing else [],
-        restart_scheduled=True,
-        message=f"Restored {written} stores. Server restarting...",
-    )
-
 # ---------------------------------------------------------------------------
 # Auto-backup settings & saved backups
 # ---------------------------------------------------------------------------
@@ -419,7 +220,7 @@ def download_saved_backup(
     content = path.read_bytes()
     return StreamingResponse(
         io.BytesIO(content),
-        media_type="application/json",
+        media_type="application/octet-stream",
         headers={"Content-Disposition": f'attachment; filename="{filename}"'},
     )
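The restore endpoint above validates uploads by size and by the 16-byte SQLite magic header before touching the live database. That check can be reproduced standalone (the function name here is illustrative, not from the diff):

```python
# Every SQLite 3 database begins with this 16-byte magic string.
SQLITE_MAGIC = b"SQLite format 3\x00"


def looks_like_sqlite(raw: bytes) -> bool:
    """Mirror of the upload validation: minimum-size floor plus magic-header check.

    100 bytes is a safe lower bound; a real SQLite file is at least one page.
    """
    return len(raw) >= 100 and raw.startswith(SQLITE_MAGIC)
```

Checking the header before writing the temp file means a JSON backup from the pre-migration format is rejected with a clear 400 instead of failing inside the restore.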


@@ -45,8 +45,7 @@ from wled_controller.core.capture.screen_capture import get_available_displays
 from wled_controller.utils import get_logger
 from wled_controller.storage.base_store import EntityNotFoundError
 
-# Re-export STORE_MAP and load_external_url so existing callers still work
-from wled_controller.api.routes.backup import STORE_MAP  # noqa: F401
+# Re-export load_external_url so existing callers still work
 from wled_controller.api.routes.system_settings import load_external_url  # noqa: F401
 
 logger = get_logger(__name__)


@@ -4,15 +4,14 @@ Extracted from system.py to keep files under 800 lines.
 """
 import asyncio
-import json
 import logging
 import re
-from pathlib import Path
 
-from fastapi import APIRouter, HTTPException, Query, WebSocket, WebSocketDisconnect
+from fastapi import APIRouter, Depends, HTTPException, Query, WebSocket, WebSocketDisconnect
 from pydantic import BaseModel
 
 from wled_controller.api.auth import AuthRequired
+from wled_controller.api.dependencies import get_database
 from wled_controller.api.schemas.system import (
     ExternalUrlRequest,
     ExternalUrlResponse,
@@ -22,6 +21,7 @@ from wled_controller.api.schemas.system import (
     MQTTSettingsResponse,
 )
 from wled_controller.config import get_config
+from wled_controller.storage.database import Database
 from wled_controller.utils import get_logger
 
 logger = get_logger(__name__)
@@ -33,21 +33,9 @@ router = APIRouter()
 # MQTT settings
 # ---------------------------------------------------------------------------
 
-_MQTT_SETTINGS_FILE: Path | None = None
-
-
-def _get_mqtt_settings_path() -> Path:
-    global _MQTT_SETTINGS_FILE
-    if _MQTT_SETTINGS_FILE is None:
-        cfg = get_config()
-        # Derive the data directory from any known storage file path
-        data_dir = Path(cfg.storage.devices_file).parent
-        _MQTT_SETTINGS_FILE = data_dir / "mqtt_settings.json"
-    return _MQTT_SETTINGS_FILE
-
-
-def _load_mqtt_settings() -> dict:
-    """Load MQTT settings: YAML config defaults overridden by JSON overrides file."""
+def _load_mqtt_settings(db: Database) -> dict:
+    """Load MQTT settings: YAML config defaults overridden by DB settings."""
     cfg = get_config()
     defaults = {
         "enabled": cfg.mqtt.enabled,
@@ -58,31 +46,20 @@ def _load_mqtt_settings() -> dict:
         "client_id": cfg.mqtt.client_id,
         "base_topic": cfg.mqtt.base_topic,
     }
-    path = _get_mqtt_settings_path()
-    if path.exists():
-        try:
-            with open(path, "r", encoding="utf-8") as f:
-                overrides = json.load(f)
-            defaults.update(overrides)
-        except Exception as e:
-            logger.warning(f"Failed to load MQTT settings override file: {e}")
+    overrides = db.get_setting("mqtt")
+    if overrides:
+        defaults.update(overrides)
     return defaults
 
-
-def _save_mqtt_settings(settings: dict) -> None:
-    """Persist MQTT settings to the JSON override file."""
-    from wled_controller.utils import atomic_write_json
-    atomic_write_json(_get_mqtt_settings_path(), settings)
-
 
 @router.get(
     "/api/v1/system/mqtt/settings",
     response_model=MQTTSettingsResponse,
     tags=["System"],
 )
-async def get_mqtt_settings(_: AuthRequired):
+async def get_mqtt_settings(_: AuthRequired, db: Database = Depends(get_database)):
     """Get current MQTT broker settings. Password is masked."""
-    s = _load_mqtt_settings()
+    s = _load_mqtt_settings(db)
     return MQTTSettingsResponse(
         enabled=s["enabled"],
         broker_host=s["broker_host"],
@@ -99,9 +76,9 @@ async def get_mqtt_settings(_: AuthRequired):
     response_model=MQTTSettingsResponse,
     tags=["System"],
 )
-async def update_mqtt_settings(_: AuthRequired, body: MQTTSettingsRequest):
+async def update_mqtt_settings(_: AuthRequired, body: MQTTSettingsRequest, db: Database = Depends(get_database)):
     """Update MQTT broker settings. If password is empty string, the existing password is preserved."""
-    current = _load_mqtt_settings()
+    current = _load_mqtt_settings(db)
     # If caller sends an empty password, keep the existing one
     password = body.password if body.password else current.get("password", "")
@@ -115,7 +92,7 @@ async def update_mqtt_settings(_: AuthRequired, body: MQTTSettingsRequest):
         "client_id": body.client_id,
         "base_topic": body.base_topic,
     }
-    _save_mqtt_settings(new_settings)
+    db.set_setting("mqtt", new_settings)
     logger.info("MQTT settings updated")
 
     return MQTTSettingsResponse(
@@ -133,44 +110,25 @@ async def update_mqtt_settings(_: AuthRequired, body: MQTTSettingsRequest):
 # External URL setting
 # ---------------------------------------------------------------------------
 
-_EXTERNAL_URL_FILE: Path | None = None
-
-
-def _get_external_url_path() -> Path:
-    global _EXTERNAL_URL_FILE
-    if _EXTERNAL_URL_FILE is None:
-        cfg = get_config()
-        data_dir = Path(cfg.storage.devices_file).parent
-        _EXTERNAL_URL_FILE = data_dir / "external_url.json"
-    return _EXTERNAL_URL_FILE
-
-
-def load_external_url() -> str:
+def load_external_url(db: Database | None = None) -> str:
     """Load the external URL setting. Returns empty string if not set."""
-    path = _get_external_url_path()
-    if path.exists():
-        try:
-            with open(path, "r", encoding="utf-8") as f:
-                data = json.load(f)
-            return data.get("external_url", "")
-        except Exception:
-            pass
+    if db is None:
+        from wled_controller.api.dependencies import get_database
+        db = get_database()
+    data = db.get_setting("external_url")
+    if data:
+        return data.get("external_url", "")
     return ""
 
-
-def _save_external_url(url: str) -> None:
-    from wled_controller.utils import atomic_write_json
-    atomic_write_json(_get_external_url_path(), {"external_url": url})
-
 
 @router.get(
     "/api/v1/system/external-url",
     response_model=ExternalUrlResponse,
     tags=["System"],
 )
-async def get_external_url(_: AuthRequired):
+async def get_external_url(_: AuthRequired, db: Database = Depends(get_database)):
     """Get the configured external base URL."""
-    return ExternalUrlResponse(external_url=load_external_url())
+    return ExternalUrlResponse(external_url=load_external_url(db))
 
 
 @router.put(
@@ -178,10 +136,10 @@ async def get_external_url(_: AuthRequired):
     response_model=ExternalUrlResponse,
     tags=["System"],
 )
-async def update_external_url(_: AuthRequired, body: ExternalUrlRequest):
+async def update_external_url(_: AuthRequired, body: ExternalUrlRequest, db: Database = Depends(get_database)):
    """Set the external base URL used in webhook URLs and other user-visible URLs."""
     url = body.external_url.strip().rstrip("/")
-    _save_external_url(url)
+    db.set_setting("external_url", {"external_url": url})
     logger.info("External URL updated: %s", url or "(cleared)")
     return ExternalUrlResponse(external_url=url)


@@ -75,12 +75,9 @@ class PerformanceResponse(BaseModel):
 class RestoreResponse(BaseModel):
-    """Response after restoring configuration backup."""
+    """Response after restoring database backup."""
     status: str = Field(description="Status of restore operation")
-    stores_written: int = Field(description="Number of stores successfully written")
-    stores_total: int = Field(description="Total number of known stores")
-    missing_stores: List[str] = Field(default_factory=list, description="Store keys not found in backup")
     restart_scheduled: bool = Field(description="Whether server restart was scheduled")
     message: str = Field(description="Human-readable status message")


@@ -27,22 +27,7 @@ class AuthConfig(BaseSettings):
 class StorageConfig(BaseSettings):
     """Storage configuration."""
-    devices_file: str = "data/devices.json"
-    templates_file: str = "data/capture_templates.json"
-    postprocessing_templates_file: str = "data/postprocessing_templates.json"
-    picture_sources_file: str = "data/picture_sources.json"
-    output_targets_file: str = "data/output_targets.json"
-    pattern_templates_file: str = "data/pattern_templates.json"
-    color_strip_sources_file: str = "data/color_strip_sources.json"
-    audio_sources_file: str = "data/audio_sources.json"
-    audio_templates_file: str = "data/audio_templates.json"
-    value_sources_file: str = "data/value_sources.json"
-    automations_file: str = "data/automations.json"
-    scene_presets_file: str = "data/scene_presets.json"
-    color_strip_processing_templates_file: str = "data/color_strip_processing_templates.json"
-    sync_clocks_file: str = "data/sync_clocks.json"
-    gradients_file: str = "data/gradients.json"
-    weather_sources_file: str = "data/weather_sources.json"
+    database_file: str = "data/ledgrab.db"
 class MQTTConfig(BaseSettings):


@@ -1,14 +1,13 @@
-"""Auto-backup engine — periodic background backups of all configuration stores."""
+"""Auto-backup engine — periodic SQLite snapshot backups."""
 import asyncio
-import json
 import os
 from datetime import datetime, timedelta, timezone
 from pathlib import Path
-from typing import Any, Dict, List, Optional
-from wled_controller import __version__
-from wled_controller.utils import atomic_write_json, get_logger
+from typing import List, Optional
+from wled_controller.storage.database import Database
+from wled_controller.utils import get_logger
 logger = get_logger(__name__)
@@ -19,25 +18,21 @@ DEFAULT_SETTINGS = {
 }
 # Skip the immediate-on-start backup if a recent backup exists within this window.
-# Prevents rapid restarts from flooding the backup directory and rotating out
-# good backups.
 _STARTUP_BACKUP_COOLDOWN = timedelta(minutes=5)
+_BACKUP_EXT = ".db"
 class AutoBackupEngine:
-    """Creates periodic backups of all configuration stores."""
+    """Creates periodic SQLite snapshot backups of the database."""
     def __init__(
         self,
-        settings_path: Path,
         backup_dir: Path,
-        store_map: Dict[str, str],
-        storage_config: Any,
+        db: Database,
     ):
-        self._settings_path = Path(settings_path)
         self._backup_dir = Path(backup_dir)
-        self._store_map = store_map
-        self._storage_config = storage_config
+        self._db = db
         self._task: Optional[asyncio.Task] = None
         self._last_backup_time: Optional[datetime] = None
@@ -47,17 +42,13 @@ class AutoBackupEngine:
     # ─── Settings persistence ──────────────────────────────────
     def _load_settings(self) -> dict:
-        if self._settings_path.exists():
-            try:
-                with open(self._settings_path, "r", encoding="utf-8") as f:
-                    data = json.load(f)
-                return {**DEFAULT_SETTINGS, **data}
-            except Exception as e:
-                logger.warning(f"Failed to load auto-backup settings: {e}")
+        data = self._db.get_setting("auto_backup")
+        if data:
+            return {**DEFAULT_SETTINGS, **data}
         return dict(DEFAULT_SETTINGS)
     def _save_settings(self) -> None:
-        atomic_write_json(self._settings_path, {
+        self._db.set_setting("auto_backup", {
             "enabled": self._settings["enabled"],
             "interval_hours": self._settings["interval_hours"],
             "max_backups": self._settings["max_backups"],
@@ -90,7 +81,7 @@ class AutoBackupEngine:
     def _most_recent_backup_age(self) -> timedelta | None:
         """Return the age of the newest backup file, or None if no backups exist."""
-        files = list(self._backup_dir.glob("*.json"))
+        files = list(self._backup_dir.glob(f"*{_BACKUP_EXT}"))
         if not files:
             return None
         newest = max(files, key=lambda p: p.stat().st_mtime)
@@ -99,9 +90,6 @@ class AutoBackupEngine:
     async def _backup_loop(self) -> None:
         try:
-            # Skip immediate backup if a recent one already exists.
-            # Prevents rapid restarts (crashes, restores) from flooding the
-            # backup directory and rotating out good backups.
             age = self._most_recent_backup_age()
             if age is None or age > _STARTUP_BACKUP_COOLDOWN:
                 await self._perform_backup()
@@ -125,44 +113,22 @@ class AutoBackupEngine:
     # ─── Backup operations ─────────────────────────────────────
     async def _perform_backup(self) -> None:
-        loop = asyncio.get_event_loop()
-        await loop.run_in_executor(None, self._perform_backup_sync)
+        await asyncio.to_thread(self._perform_backup_sync)
     def _perform_backup_sync(self) -> None:
-        stores = {}
-        for store_key, config_attr in self._store_map.items():
-            file_path = Path(getattr(self._storage_config, config_attr))
-            if file_path.exists():
-                with open(file_path, "r", encoding="utf-8") as f:
-                    stores[store_key] = json.load(f)
-            else:
-                stores[store_key] = {}
         now = datetime.now(timezone.utc)
-        backup = {
-            "meta": {
-                "format": "ledgrab-backup",
-                "format_version": 1,
-                "app_version": __version__,
-                "created_at": now.isoformat(),
-                "store_count": len(stores),
-                "auto_backup": True,
-            },
-            "stores": stores,
-        }
         timestamp = now.strftime("%Y-%m-%dT%H%M%S")
-        filename = f"ledgrab-autobackup-{timestamp}.json"
+        filename = f"ledgrab-backup-{timestamp}{_BACKUP_EXT}"
         file_path = self._backup_dir / filename
-        atomic_write_json(file_path, backup)
+        self._db.backup_to(file_path)
         self._last_backup_time = now
-        logger.info(f"Auto-backup created: (unknown)")
+        logger.info(f"Backup created: (unknown)")
     def _prune_old_backups(self) -> None:
         max_backups = self._settings["max_backups"]
-        files = sorted(self._backup_dir.glob("*.json"), key=lambda p: p.stat().st_mtime)
+        files = sorted(self._backup_dir.glob(f"*{_BACKUP_EXT}"), key=lambda p: p.stat().st_mtime)
         excess = len(files) - max_backups
         if excess > 0:
             for f in files[:excess]:
@@ -195,7 +161,6 @@ class AutoBackupEngine:
         self._settings["max_backups"] = max_backups
         self._save_settings()
-        # Restart or stop the loop
         if enabled:
             self._start_loop()
             logger.info(
@@ -205,14 +170,12 @@ class AutoBackupEngine:
             self._cancel_loop()
             logger.info("Auto-backup disabled")
-        # Prune if max_backups was reduced
         self._prune_old_backups()
         return self.get_settings()
     def list_backups(self) -> List[dict]:
         backups = []
-        for f in sorted(self._backup_dir.glob("*.json"), key=lambda p: p.stat().st_mtime, reverse=True):
+        for f in sorted(self._backup_dir.glob(f"*{_BACKUP_EXT}"), key=lambda p: p.stat().st_mtime, reverse=True):
             stat = f.stat()
             backups.append({
                 "filename": f.name,
@@ -226,7 +189,6 @@ class AutoBackupEngine:
         if not filename or os.sep in filename or "/" in filename or ".." in filename:
             raise ValueError("Invalid filename")
         target = (self._backup_dir / filename).resolve()
-        # Ensure resolved path is still inside the backup directory
         if not target.is_relative_to(self._backup_dir.resolve()):
             raise ValueError("Invalid filename")
         return target
@@ -235,7 +197,6 @@ class AutoBackupEngine:
         """Manually trigger a backup and prune old ones. Returns the created backup info."""
         await self._perform_backup()
         self._prune_old_backups()
-        # Return the most recent backup entry
         backups = self.list_backups()
         return backups[0] if backups else {}
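The `self._db.backup_to(file_path)` call above replaces the old read-all-stores-and-dump-JSON approach. Its body is not shown in this diff, but the standard way to snapshot a live SQLite database is the online backup API, which is safe even while other connections are writing (unlike copying the file, which can tear a WAL-mode database mid-transaction). A sketch under that assumption:

```python
import sqlite3
from pathlib import Path


def backup_to(src_path: str, dest_path: Path) -> None:
    """Snapshot a live SQLite database to dest_path.

    Uses sqlite3's online backup API (Connection.backup), which takes a
    consistent copy of all pages without blocking concurrent readers.
    """
    dest_path.parent.mkdir(parents=True, exist_ok=True)
    src = sqlite3.connect(src_path)
    try:
        dest = sqlite3.connect(dest_path)
        try:
            src.backup(dest)  # copies the whole database, WAL included
        finally:
            dest.close()
    finally:
        src.close()
```

The resulting `.db` file is itself a complete database, so restore is just "stop, swap the file, restart" rather than replaying per-store JSON payloads.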


@@ -1,15 +1,14 @@
 """Seed data generator for demo mode.
-Populates the demo data directory with sample entities on first run,
+Populates the demo SQLite database with sample entities on first run,
 giving new users a realistic out-of-the-box experience without needing
 real hardware.
 """
 import json
 from datetime import datetime, timezone
-from pathlib import Path
-from wled_controller.config import StorageConfig
+from wled_controller.storage.database import Database
 from wled_controller.utils import get_logger
 logger = get_logger(__name__)
@@ -50,63 +49,48 @@ _SCENE_ID = "scene_demo0001"
 _NOW = datetime.now(timezone.utc).isoformat()
-def _write_store(path: Path, json_key: str, items: dict) -> None:
-    """Write a store JSON file with version wrapper."""
-    path.parent.mkdir(parents=True, exist_ok=True)
-    data = {
-        "version": "1.0.0",
-        json_key: items,
-    }
-    path.write_text(json.dumps(data, indent=2), encoding="utf-8")
-    logger.info(f"Seeded {len(items)} {json_key} -> {path}")
+def _insert_entities(db: Database, table: str, items: dict) -> None:
+    """Insert entity dicts into a SQLite table."""
+    rows = []
+    for entity_id, entity_data in items.items():
+        name = entity_data.get("name", "")
+        data_json = json.dumps(entity_data, ensure_ascii=False)
+        rows.append((entity_id, name, data_json))
+    if rows:
+        db.bulk_insert(table, rows)
+    logger.info(f"Seeded {len(rows)} entities into {table}")
-def _has_data(storage_config: StorageConfig) -> bool:
-    """Check if any demo store file already has entities."""
-    for field_name in storage_config.model_fields:
-        value = getattr(storage_config, field_name)
-        if not isinstance(value, str):
-            continue
-        p = Path(value)
-        if p.exists() and p.stat().st_size > 20:
-            # File exists and is non-trivial — check if it has entities
-            try:
-                raw = json.loads(p.read_text(encoding="utf-8"))
-                for key, val in raw.items():
-                    if key != "version" and isinstance(val, dict) and val:
-                        return True
-            except Exception:
-                pass
-    return False
-def seed_demo_data(storage_config: StorageConfig) -> None:
-    """Populate demo data directory with sample entities.
-    Only runs when the demo data directory is empty (no existing entities).
+def seed_demo_data(db: Database) -> None:
+    """Populate demo database with sample entities.
+    Only runs when the database has no entities in any table.
     Must be called BEFORE store constructors run so they load the seeded data.
     """
-    if _has_data(storage_config):
-        logger.info("Demo data already exists — skipping seed")
-        return
+    # Check if any table already has data
+    for table in ["devices", "output_targets", "color_strip_sources",
+                  "picture_sources", "audio_sources", "scene_presets"]:
+        if db.table_exists_with_data(table):
+            logger.info("Demo data already exists — skipping seed")
+            return
     logger.info("Seeding demo data for first-run experience")
-    _seed_devices(Path(storage_config.devices_file))
-    _seed_capture_templates(Path(storage_config.templates_file))
-    _seed_output_targets(Path(storage_config.output_targets_file))
-    _seed_picture_sources(Path(storage_config.picture_sources_file))
-    _seed_color_strip_sources(Path(storage_config.color_strip_sources_file))
-    _seed_audio_sources(Path(storage_config.audio_sources_file))
-    _seed_scene_presets(Path(storage_config.scene_presets_file))
+    _insert_entities(db, "devices", _build_devices())
+    _insert_entities(db, "capture_templates", _build_capture_templates())
+    _insert_entities(db, "output_targets", _build_output_targets())
+    _insert_entities(db, "picture_sources", _build_picture_sources())
+    _insert_entities(db, "color_strip_sources", _build_color_strip_sources())
+    _insert_entities(db, "audio_sources", _build_audio_sources())
+    _insert_entities(db, "scene_presets", _build_scene_presets())
     logger.info("Demo seed data complete")
 # ── Devices ────────────────────────────────────────────────────────
-def _seed_devices(path: Path) -> None:
-    devices = {
+def _build_devices() -> dict:
+    return {
         _DEVICE_IDS["strip"]: {
             "id": _DEVICE_IDS["strip"],
             "name": "Demo LED Strip",
@@ -138,13 +122,12 @@ def _seed_devices(path: Path) -> None:
             "updated_at": _NOW,
         },
     }
-    _write_store(path, "devices", devices)
 # ── Capture Templates ──────────────────────────────────────────────
-def _seed_capture_templates(path: Path) -> None:
-    templates = {
+def _build_capture_templates() -> dict:
+    return {
         _TPL_ID: {
             "id": _TPL_ID,
             "name": "Demo Capture",
@@ -156,13 +139,12 @@ def _seed_capture_templates(path: Path) -> None:
             "updated_at": _NOW,
         },
     }
-    _write_store(path, "templates", templates)
 # ── Output Targets ─────────────────────────────────────────────────
-def _seed_output_targets(path: Path) -> None:
-    targets = {
+def _build_output_targets() -> dict:
+    return {
         _TARGET_IDS["strip"]: {
             "id": _TARGET_IDS["strip"],
             "name": "Strip — Gradient",
@@ -200,13 +182,12 @@ def _seed_output_targets(path: Path) -> None:
             "updated_at": _NOW,
         },
     }
-    _write_store(path, "output_targets", targets)
 # ── Picture Sources ────────────────────────────────────────────────
-def _seed_picture_sources(path: Path) -> None:
-    sources = {
+def _build_picture_sources() -> dict:
+    return {
         _PS_IDS["main"]: {
             "id": _PS_IDS["main"],
             "name": "Demo Display 1080p",
@@ -218,7 +199,6 @@ def _seed_picture_sources(path: Path) -> None:
             "tags": ["demo"],
             "created_at": _NOW,
             "updated_at": _NOW,
-            # Nulls for non-applicable subclass fields
             "source_stream_id": None,
             "postprocessing_template_id": None,
             "image_source": None,
@@ -253,13 +233,12 @@ def _seed_picture_sources(path: Path) -> None:
             "clock_id": None,
         },
     }
-    _write_store(path, "picture_sources", sources)
 # ── Color Strip Sources ────────────────────────────────────────────
-def _seed_color_strip_sources(path: Path) -> None:
-    sources = {
+def _build_color_strip_sources() -> dict:
+    return {
         _CSS_IDS["gradient"]: {
             "id": _CSS_IDS["gradient"],
             "name": "Rainbow Gradient",
@@ -338,13 +317,12 @@ def _seed_color_strip_sources(path: Path) -> None:
             "updated_at": _NOW,
         },
     }
-    _write_store(path, "color_strip_sources", sources)
 # ── Audio Sources ──────────────────────────────────────────────────
-def _seed_audio_sources(path: Path) -> None:
-    sources = {
+def _build_audio_sources() -> dict:
+    return {
         _AS_IDS["system"]: {
             "id": _AS_IDS["system"],
             "name": "Demo System Audio",
@@ -356,7 +334,6 @@ def _seed_audio_sources(path: Path) -> None:
             "tags": ["demo"],
             "created_at": _NOW,
             "updated_at": _NOW,
-            # Forward-compat null fields
             "audio_source_id": None,
             "channel": None,
         },
@@ -370,19 +347,17 @@ def _seed_audio_sources(path: Path) -> None:
             "tags": ["demo"],
             "created_at": _NOW,
             "updated_at": _NOW,
-            # Forward-compat null fields
             "device_index": None,
             "is_loopback": None,
             "audio_template_id": None,
         },
     }
-    _write_store(path, "audio_sources", sources)
 # ── Scene Presets ──────────────────────────────────────────────────
-def _seed_scene_presets(path: Path) -> None:
-    presets = {
+def _build_scene_presets() -> dict:
+    return {
         _SCENE_ID: {
             "id": _SCENE_ID,
             "name": "Demo Ambient",
@@ -409,4 +384,3 @@ def _seed_scene_presets(path: Path) -> None:
             "updated_at": _NOW,
         },
     }
-    _write_store(path, "scene_presets", presets)
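The seed now hands `(id, name, data_json)` tuples to `db.bulk_insert(table, rows)`. That method's body is not part of this diff; a plausible sketch (the column layout `id, name, data` and the `INSERT OR REPLACE` policy are assumptions inferred from the tuples built above) would batch the rows through `executemany`:

```python
import json
import sqlite3


def bulk_insert(conn: sqlite3.Connection, table: str,
                rows: list[tuple[str, str, str]]) -> None:
    """Insert (id, name, data_json) rows in one transaction.

    SQLite cannot bind a table name as a parameter, so the name is
    validated before being interpolated into the statement.
    """
    if not table.isidentifier():
        raise ValueError(f"invalid table name: {table!r}")
    conn.executemany(
        f"INSERT OR REPLACE INTO {table} (id, name, data) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()
```

Batching through `executemany` inside a single transaction keeps first-run seeding fast even with WAL mode, since each row does not pay its own fsync.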


@@ -41,7 +41,7 @@ from wled_controller.core.mqtt.mqtt_service import MQTTService
 from wled_controller.core.devices.mqtt_client import set_mqtt_service
 from wled_controller.core.backup.auto_backup import AutoBackupEngine
 from wled_controller.core.processing.os_notification_listener import OsNotificationListener
-from wled_controller.api.routes.system import STORE_MAP
+from wled_controller.storage.database import Database
 from wled_controller.utils import setup_logging, get_logger, install_broadcast_handler
 # Initialize logging
@@ -52,29 +52,32 @@ logger = get_logger(__name__)
 # Get configuration
 config = get_config()
-# Seed demo data before stores are loaded (first-run only)
+# Initialize SQLite database
+db = Database(config.storage.database_file)
+# Seed demo data after DB is ready (first-run only)
 if config.demo:
     from wled_controller.core.demo_seed import seed_demo_data
-    seed_demo_data(config.storage)
+    seed_demo_data(db)
 # Initialize storage and processing
-device_store = DeviceStore(config.storage.devices_file)
-template_store = TemplateStore(config.storage.templates_file)
-pp_template_store = PostprocessingTemplateStore(config.storage.postprocessing_templates_file)
-picture_source_store = PictureSourceStore(config.storage.picture_sources_file)
-output_target_store = OutputTargetStore(config.storage.output_targets_file)
-pattern_template_store = PatternTemplateStore(config.storage.pattern_templates_file)
-color_strip_store = ColorStripStore(config.storage.color_strip_sources_file)
-audio_source_store = AudioSourceStore(config.storage.audio_sources_file)
-audio_template_store = AudioTemplateStore(config.storage.audio_templates_file)
-value_source_store = ValueSourceStore(config.storage.value_sources_file)
-automation_store = AutomationStore(config.storage.automations_file)
-scene_preset_store = ScenePresetStore(config.storage.scene_presets_file)
-sync_clock_store = SyncClockStore(config.storage.sync_clocks_file)
-cspt_store = ColorStripProcessingTemplateStore(config.storage.color_strip_processing_templates_file)
-gradient_store = GradientStore(config.storage.gradients_file)
+device_store = DeviceStore(db)
+template_store = TemplateStore(db)
+pp_template_store = PostprocessingTemplateStore(db)
+picture_source_store = PictureSourceStore(db)
+output_target_store = OutputTargetStore(db)
+pattern_template_store = PatternTemplateStore(db)
+color_strip_store = ColorStripStore(db)
+audio_source_store = AudioSourceStore(db)
+audio_template_store = AudioTemplateStore(db)
+value_source_store = ValueSourceStore(db)
+automation_store = AutomationStore(db)
+scene_preset_store = ScenePresetStore(db)
+sync_clock_store = SyncClockStore(db)
+cspt_store = ColorStripProcessingTemplateStore(db)
+gradient_store = GradientStore(db)
 gradient_store.migrate_palette_references(color_strip_store)
-weather_source_store = WeatherSourceStore(config.storage.weather_sources_file)
+weather_source_store = WeatherSourceStore(db)
 sync_clock_manager = SyncClockManager(sync_clock_store)
 weather_manager = WeatherManager(weather_source_store)
@@ -156,34 +159,18 @@ async def lifespan(app: FastAPI):
         device_store=device_store,
     )
-    # Create auto-backup engine — derive paths from storage config so that
+    # Create auto-backup engine — derive paths from database location so that
     # demo mode auto-backups go to data/demo/ instead of data/.
-    _data_dir = Path(config.storage.devices_file).parent
+    _data_dir = Path(config.storage.database_file).parent
     auto_backup_engine = AutoBackupEngine(
-        settings_path=_data_dir / "auto_backup_settings.json",
         backup_dir=_data_dir / "backups",
-        store_map=STORE_MAP,
-        storage_config=config.storage,
+        db=db,
     )
-    # Verify STORE_MAP covers all StorageConfig file fields.
-    # Catches missed additions early (at startup) rather than silently
-    # excluding new stores from backups.
-    storage_attrs = {
-        attr for attr in config.storage.model_fields
-        if attr.endswith("_file")
-    }
-    mapped_attrs = set(STORE_MAP.values())
-    unmapped = storage_attrs - mapped_attrs
-    if unmapped:
-        logger.warning(
-            f"StorageConfig fields not in STORE_MAP (missing from backups): "
-            f"{sorted(unmapped)}"
-        )
     # Initialize API dependencies
     init_dependencies(
         device_store, template_store, processor_manager,
+        database=db,
        pp_template_store=pp_template_store,
        pattern_template_store=pattern_template_store,
        picture_source_store=picture_source_store,


@@ -191,10 +191,9 @@ import {
 import {
     openSettingsModal, closeSettingsModal, switchSettingsTab,
     downloadBackup, handleRestoreFileSelected,
-    saveAutoBackupSettings, restoreSavedBackup, downloadSavedBackup, deleteSavedBackup,
+    saveAutoBackupSettings, triggerBackupNow, restoreSavedBackup, downloadSavedBackup, deleteSavedBackup,
     restartServer, saveMqttSettings,
     loadApiKeysList,
-    downloadPartialExport, handlePartialImportFileSelected,
     connectLogViewer, disconnectLogViewer, clearLogViewer, applyLogFilter,
     openLogOverlay, closeLogOverlay,
     loadLogLevel, setLogLevel,
@@ -536,21 +535,20 @@ Object.assign(window, {
     openCommandPalette,
     closeCommandPalette,
-    // settings (tabs / backup / restore / auto-backup / MQTT / partial export-import / api keys / log level)
+    // settings (tabs / backup / restore / auto-backup / MQTT / api keys / log level)
     openSettingsModal,
     closeSettingsModal,
     switchSettingsTab,
     downloadBackup,
     handleRestoreFileSelected,
     saveAutoBackupSettings,
+    triggerBackupNow,
     restoreSavedBackup,
     downloadSavedBackup,
     deleteSavedBackup,
     restartServer,
     saveMqttSettings,
     loadApiKeysList,
-    downloadPartialExport,
-    handlePartialImportFileSelected,
     connectLogViewer,
     disconnectLogViewer,
     clearLogViewer,


@@ -419,6 +419,22 @@ export async function saveAutoBackupSettings(): Promise<void> {
     }
 }
+export async function triggerBackupNow(): Promise<void> {
+    try {
+        const resp = await fetchWithAuth('/system/auto-backup/trigger', { method: 'POST' });
+        if (!resp.ok) {
+            const err = await resp.json().catch(() => ({}));
+            throw new Error(err.detail || `HTTP ${resp.status}`);
+        }
+        showToast(t('settings.auto_backup.backup_created'), 'success');
+        loadBackupList();
+        loadAutoBackupSettings();
+    } catch (err) {
+        console.error('Backup failed:', err);
+        showToast(t('settings.auto_backup.backup_error') + ': ' + err.message, 'error');
+    }
+}
 // ─── Saved backup list ────────────────────────────────────
 export async function loadBackupList(): Promise<void> {
@@ -566,76 +582,6 @@ export async function loadApiKeysList(): Promise<void> {
     }
 }
-// ─── Partial Export / Import ───────────────────────────────────
-export async function downloadPartialExport(): Promise<void> {
-    const storeKey = (document.getElementById('settings-partial-store') as HTMLSelectElement).value;
-    try {
-        const resp = await fetchWithAuth(`/system/export/${encodeURIComponent(storeKey)}`, { timeout: 30000 });
-        if (!resp.ok) {
-            const err = await resp.json().catch(() => ({}));
-            throw new Error(err.detail || `HTTP ${resp.status}`);
-        }
-        const blob = await resp.blob();
-        const disposition = resp.headers.get('Content-Disposition') || '';
-        const match = disposition.match(/filename="(.+?)"/);
-        const filename = match ? match[1] : `ledgrab-${storeKey}.json`;
-        const a = document.createElement('a');
-        a.href = URL.createObjectURL(blob);
-        a.download = filename;
-        document.body.appendChild(a);
-        a.click();
-        a.remove();
-        URL.revokeObjectURL(a.href);
-        showToast(t('settings.partial.export_success'), 'success');
-    } catch (err) {
-        console.error('Partial export failed:', err);
-        showToast(t('settings.partial.export_error') + ': ' + err.message, 'error');
-    }
-}
-export async function handlePartialImportFileSelected(input: HTMLInputElement): Promise<void> {
-    const file = input.files![0];
-    input.value = '';
-    if (!file) return;
-    const storeKey = (document.getElementById('settings-partial-store') as HTMLSelectElement).value;
-    const merge = (document.getElementById('settings-partial-merge') as HTMLInputElement).checked;
-    const confirmMsg = merge
-        ? t('settings.partial.import_confirm_merge').replace('{store}', storeKey)
-        : t('settings.partial.import_confirm_replace').replace('{store}', storeKey);
-    const confirmed = await showConfirm(confirmMsg);
-    if (!confirmed) return;
-    try {
-        const formData = new FormData();
-        formData.append('file', file);
-        const url = `${API_BASE}/system/import/${encodeURIComponent(storeKey)}?merge=${merge}`;
-        const resp = await fetch(url, {
-            method: 'POST',
-            headers: { 'Authorization': `Bearer ${apiKey}` },
-            body: formData,
-        });
-        if (!resp.ok) {
-            const err = await resp.json().catch(() => ({}));
-            throw new Error(err.detail || `HTTP ${resp.status}`);
-        }
-        const data = await resp.json();
-        showToast(data.message || t('settings.partial.import_success'), 'success');
-        settingsModal.forceClose();
-    } catch (err) {
-        console.error('Partial import failed:', err);
-        showToast(t('settings.partial.import_error') + ': ' + err.message, 'error');
-    }
-}
 // ─── Log Level ────────────────────────────────────────────────
 export async function loadLogLevel(): Promise<void> {


@@ -358,14 +358,13 @@ interface Window {
     downloadBackup: (...args: any[]) => any;
     handleRestoreFileSelected: (...args: any[]) => any;
     saveAutoBackupSettings: (...args: any[]) => any;
+    triggerBackupNow: (...args: any[]) => any;
     restoreSavedBackup: (...args: any[]) => any;
     downloadSavedBackup: (...args: any[]) => any;
     deleteSavedBackup: (...args: any[]) => any;
     restartServer: (...args: any[]) => any;
     saveMqttSettings: (...args: any[]) => any;
     loadApiKeysList: (...args: any[]) => any;
-    downloadPartialExport: (...args: any[]) => any;
-    handlePartialImportFileSelected: (...args: any[]) => any;
     connectLogViewer: (...args: any[]) => any;
     disconnectLogViewer: (...args: any[]) => any;
     clearLogViewer: (...args: any[]) => any;


@@ -1591,6 +1591,9 @@
     "settings.auto_backup.save": "Save Settings",
     "settings.auto_backup.saved": "Auto-backup settings saved",
     "settings.auto_backup.save_error": "Failed to save auto-backup settings",
+    "settings.auto_backup.backup_now": "Backup Now",
+    "settings.auto_backup.backup_created": "Backup created",
+    "settings.auto_backup.backup_error": "Backup failed",
     "settings.auto_backup.last_backup": "Last backup",
     "settings.auto_backup.never": "Never",
     "settings.saved_backups.label": "Saved Backups",

@@ -1518,6 +1518,9 @@
   "settings.auto_backup.save": "Сохранить настройки",
   "settings.auto_backup.saved": "Настройки авто-бэкапа сохранены",
   "settings.auto_backup.save_error": "Не удалось сохранить настройки авто-бэкапа",
+  "settings.auto_backup.backup_now": "Создать бэкап",
+  "settings.auto_backup.backup_created": "Бэкап создан",
+  "settings.auto_backup.backup_error": "Ошибка создания бэкапа",
   "settings.auto_backup.last_backup": "Последний бэкап",
   "settings.auto_backup.never": "Никогда",
   "settings.saved_backups.label": "Сохранённые копии",

@@ -1518,6 +1518,9 @@
   "settings.auto_backup.save": "保存设置",
   "settings.auto_backup.saved": "自动备份设置已保存",
   "settings.auto_backup.save_error": "保存自动备份设置失败",
+  "settings.auto_backup.backup_now": "立即备份",
+  "settings.auto_backup.backup_created": "备份已创建",
+  "settings.auto_backup.backup_error": "备份失败",
   "settings.auto_backup.last_backup": "上次备份",
   "settings.auto_backup.never": "从未",
   "settings.saved_backups.label": "已保存的备份",

@@ -1,4 +1,4 @@
-"""Audio source storage using JSON files."""
+"""Audio source storage using SQLite."""
 
 import uuid
 from datetime import datetime, timezone
@@ -11,7 +11,8 @@ from wled_controller.storage.audio_source import (
     MonoAudioSource,
     MultichannelAudioSource,
 )
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.utils import resolve_ref
 from wled_controller.utils import get_logger
 
@@ -29,18 +30,18 @@ class ResolvedAudioSource(NamedTuple):
     freq_high: Optional[float] = None
 
 
-class AudioSourceStore(BaseJsonStore[AudioSource]):
+class AudioSourceStore(BaseSqliteStore[AudioSource]):
     """Persistent storage for audio sources."""
 
-    _json_key = "audio_sources"
+    _table_name = "audio_sources"
     _entity_name = "Audio source"
 
-    def __init__(self, file_path: str):
-        super().__init__(file_path, AudioSource.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, AudioSource.from_dict)
 
     # Backward-compatible aliases
-    get_all_sources = BaseJsonStore.get_all
-    get_source = BaseJsonStore.get
+    get_all_sources = BaseSqliteStore.get_all
+    get_source = BaseSqliteStore.get
 
     def get_mono_sources(self) -> List[MonoAudioSource]:
         """Return only mono audio sources (for CSS dropdown)."""
@@ -111,7 +112,7 @@ class AudioSourceStore(BaseJsonStore[AudioSource]):
         )
 
         self._items[sid] = source
-        self._save()
+        self._save_item(sid, source)
         logger.info(f"Created audio source: {name} ({sid}, type={source_type})")
         return source
@@ -185,7 +186,7 @@ class AudioSourceStore(BaseJsonStore[AudioSource]):
         source.freq_high = freq_high
         source.updated_at = datetime.now(timezone.utc)
 
-        self._save()
+        self._save_item(source_id, source)
         logger.info(f"Updated audio source: {source_id}")
         return source
@@ -207,7 +208,7 @@ class AudioSourceStore(BaseJsonStore[AudioSource]):
         )
 
         del self._items[source_id]
-        self._save()
+        self._delete_item(source_id)
         logger.info(f"Deleted audio source: {source_id}")

@@ -1,4 +1,4 @@
-"""Audio template storage using JSON files."""
+"""Audio template storage using SQLite."""
 
 import uuid
 from datetime import datetime, timezone
@@ -6,30 +6,31 @@ from typing import Any, Dict, List, Optional
 
 from wled_controller.core.audio.factory import AudioEngineRegistry
 from wled_controller.storage.audio_template import AudioCaptureTemplate
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.utils import get_logger
 
 logger = get_logger(__name__)
 
 
-class AudioTemplateStore(BaseJsonStore[AudioCaptureTemplate]):
+class AudioTemplateStore(BaseSqliteStore[AudioCaptureTemplate]):
     """Storage for audio capture templates.
 
-    All templates are persisted to the JSON file.
+    All templates are persisted to the database.
     On startup, if no templates exist, one is auto-created using the
     highest-priority available engine.
     """
 
-    _json_key = "templates"
+    _table_name = "audio_templates"
     _entity_name = "Audio capture template"
 
-    def __init__(self, file_path: str):
-        super().__init__(file_path, AudioCaptureTemplate.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, AudioCaptureTemplate.from_dict)
         self._ensure_initial_template()
 
     # Backward-compatible aliases
-    get_all_templates = BaseJsonStore.get_all
-    get_template = BaseJsonStore.get
+    get_all_templates = BaseSqliteStore.get_all
+    get_template = BaseSqliteStore.get
 
     def _ensure_initial_template(self) -> None:
        """Auto-create a template if none exist, using the best available engine."""
@@ -93,7 +94,7 @@ class AudioTemplateStore(BaseJsonStore[AudioCaptureTemplate]):
         )
 
         self._items[template_id] = template
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Created audio template: {name} ({template_id})")
         return template
@@ -121,7 +122,7 @@ class AudioTemplateStore(BaseJsonStore[AudioCaptureTemplate]):
         template.tags = tags
         template.updated_at = datetime.now(timezone.utc)
 
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Updated audio template: {template_id}")
         return template
@@ -152,5 +153,5 @@ class AudioTemplateStore(BaseJsonStore[AudioCaptureTemplate]):
         )
 
         del self._items[template_id]
-        self._save()
+        self._delete_item(template_id)
         logger.info(f"Deleted audio template: {template_id}")

@@ -1,27 +1,28 @@
-"""Automation storage using JSON files."""
+"""Automation storage using SQLite."""
 
 import uuid
 from datetime import datetime, timezone
 from typing import List, Optional
 
 from wled_controller.storage.automation import Automation, Condition
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.utils import get_logger
 
 logger = get_logger(__name__)
 
 
-class AutomationStore(BaseJsonStore[Automation]):
-    _json_key = "automations"
+class AutomationStore(BaseSqliteStore[Automation]):
+    _table_name = "automations"
     _entity_name = "Automation"
 
-    def __init__(self, file_path: str):
-        super().__init__(file_path, Automation.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, Automation.from_dict)
 
     # Backward-compatible aliases
-    get_all_automations = BaseJsonStore.get_all
-    get_automation = BaseJsonStore.get
-    delete_automation = BaseJsonStore.delete
+    get_all_automations = BaseSqliteStore.get_all
+    get_automation = BaseSqliteStore.get
+    delete_automation = BaseSqliteStore.delete
 
     def create_automation(
         self,
@@ -56,7 +57,7 @@ class AutomationStore(BaseJsonStore[Automation]):
         )
 
         self._items[automation_id] = automation
-        self._save()
+        self._save_item(automation_id, automation)
         logger.info(f"Created automation: {name} ({automation_id})")
         return automation
@@ -93,6 +94,6 @@
         automation.tags = tags
         automation.updated_at = datetime.now(timezone.utc)
 
-        self._save()
+        self._save_item(automation_id, automation)
         logger.info(f"Updated automation: {automation_id}")
         return automation

@@ -0,0 +1,168 @@
+"""Base class for SQLite-backed entity stores.
+
+Drop-in replacement for BaseJsonStore with the same public API.
+Each store keeps an in-memory cache (``_items``) for fast reads;
+writes go through to SQLite immediately (write-through cache).
+"""
+
+import asyncio
+import json
+import threading
+from typing import Callable, Dict, Generic, List, TypeVar
+
+from wled_controller.storage.database import Database
+from wled_controller.utils import get_logger
+
+T = TypeVar("T")
+
+logger = get_logger(__name__)
+
+
+class BaseSqliteStore(Generic[T]):
+    """SQLite-backed entity store with the same API as BaseJsonStore.
+
+    Subclasses must set class attributes:
+
+    - ``_table_name``: SQL table name (e.g. ``"sync_clocks"``)
+    - ``_entity_name``: human label for errors (e.g. ``"Sync clock"``)
+    """
+
+    _table_name: str
+    _entity_name: str
+
+    def __init__(self, db: Database, deserializer: Callable[[dict], T]):
+        self._db = db
+        self._items: Dict[str, T] = {}
+        self._deserializer = deserializer
+        self._lock = threading.RLock()
+        self._load()
+
+    # -- I/O -----------------------------------------------------------------
+
+    def _load(self) -> None:
+        """Load all rows from SQLite into the in-memory cache."""
+        rows = self._db.load_all(self._table_name)
+        loaded = 0
+        for item_dict in rows:
+            item_id = item_dict.get("id")
+            if not item_id:
+                logger.error(f"Skipping {self._entity_name} row with no id")
+                continue
+            try:
+                self._items[item_id] = self._deserializer(item_dict)
+                loaded += 1
+            except Exception as e:
+                logger.error(
+                    f"Failed to load {self._entity_name} {item_id}: {e}",
+                    exc_info=True,
+                )
+        if loaded > 0:
+            logger.info(f"Loaded {loaded} {self._table_name} from database")
+        logger.info(
+            f"{self._entity_name} store initialized with {len(self._items)} items"
+        )
+
+    def _save_item(self, item_id: str, item: T) -> None:
+        """Persist a single item to SQLite (write-through)."""
+        data = item.to_dict()
+        name = data.get("name", "")
+        self._db.upsert(self._table_name, item_id, name, data)
+
+    def _delete_item(self, item_id: str) -> None:
+        """Delete a single item from SQLite."""
+        self._db.delete_row(self._table_name, item_id)
+
+    def _save_all(self, *, force: bool = False) -> None:
+        """Persist all items to SQLite.
+
+        Used during shutdown to ensure in-memory state is flushed.
+        When ``force`` is True, bypasses the frozen-writes check.
+        """
+        from wled_controller.storage.database import _writes_frozen
+
+        if _writes_frozen and not force:
+            logger.warning(f"Save blocked (frozen after restore): {self._table_name}")
+            return
+        items_to_write = []
+        with self._lock:
+            for item_id, item in self._items.items():
+                data = item.to_dict()
+                items_to_write.append((
+                    item_id,
+                    data.get("name", ""),
+                    json.dumps(data, ensure_ascii=False),
+                ))
+        if items_to_write:
+            # Use a transaction for atomicity: clear + re-insert
+            with self._db.transaction() as conn:
+                conn.execute(f"DELETE FROM [{self._table_name}]")
+                conn.executemany(
+                    f"INSERT INTO [{self._table_name}] (id, name, data) VALUES (?, ?, ?)",
+                    items_to_write,
+                )
+
+    # -- Backward compat: _save() used by subclass create/update methods -----
+
+    def _save(self, *, force: bool = False) -> None:
+        """Compatibility shim: save all items.
+
+        Subclasses that call ``self._save()`` after mutating ``self._items``
+        will trigger a full flush. For better performance, prefer calling
+        ``self._save_item(id, item)`` for single-entity mutations.
+        """
+        self._save_all(force=force)
+
+    async def _save_async(self) -> None:
+        """Async wrapper — runs ``_save()`` in a thread."""
+        await asyncio.to_thread(self._save)
+
+    # -- Common CRUD (identical API to BaseJsonStore) ------------------------
+
+    def get_all(self) -> List[T]:
+        with self._lock:
+            return list(self._items.values())
+
+    def get(self, item_id: str) -> T:
+        with self._lock:
+            if item_id not in self._items:
+                from wled_controller.storage.base_store import EntityNotFoundError
+
+                raise EntityNotFoundError(f"{self._entity_name} not found: {item_id}")
+            return self._items[item_id]
+
+    def delete(self, item_id: str) -> None:
+        with self._lock:
+            if item_id not in self._items:
+                from wled_controller.storage.base_store import EntityNotFoundError
+
+                raise EntityNotFoundError(f"{self._entity_name} not found: {item_id}")
+            del self._items[item_id]
+            self._delete_item(item_id)
+        logger.info(f"Deleted {self._entity_name}: {item_id}")
+
+    async def async_delete(self, item_id: str) -> None:
+        """Async version of ``delete()``."""
+        with self._lock:
+            if item_id not in self._items:
+                from wled_controller.storage.base_store import EntityNotFoundError
+
+                raise EntityNotFoundError(f"{self._entity_name} not found: {item_id}")
+            del self._items[item_id]
+        await asyncio.to_thread(self._delete_item, item_id)
+        logger.info(f"Deleted {self._entity_name}: {item_id}")
+
+    def count(self) -> int:
+        with self._lock:
+            return len(self._items)
+
+    # -- Helpers -------------------------------------------------------------
+
+    def _check_name_unique(self, name: str, exclude_id: str | None = None) -> None:
+        """Raise ValueError if *name* is empty or already taken.
+
+        Must be called while holding ``self._lock``.
+        """
+        if not name or not name.strip():
+            raise ValueError("Name is required")
+        for item_id, item in self._items.items():
+            if item_id != exclude_id and getattr(item, "name", None) == name:
+                raise ValueError(
+                    f"{self._entity_name} with name '{name}' already exists"
+                )
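
The store above implements a write-through cache: every mutation updates the in-memory `_items` dict and immediately upserts one row in the shared `(id, name, data)` schema, so reads never touch the database after startup. A standalone sketch of that pattern against plain `sqlite3` (the `notes` table and helper names are illustrative, not part of this codebase):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute(
    "CREATE TABLE notes (id TEXT PRIMARY KEY, "
    "name TEXT NOT NULL DEFAULT '', data TEXT NOT NULL)"
)

cache = {}  # in-memory view, plays the role of BaseSqliteStore._items


def save_item(item_id, item):
    # Write-through: update the cache and persist the row in one step.
    cache[item_id] = item
    conn.execute(
        "INSERT OR REPLACE INTO notes (id, name, data) VALUES (?, ?, ?)",
        (item_id, item.get("name", ""), json.dumps(item, ensure_ascii=False)),
    )
    conn.commit()


def load_all():
    # Cold start: rebuild the cache from the JSON blobs.
    return {
        row["id"]: json.loads(row["data"])
        for row in conn.execute("SELECT id, data FROM notes")
    }


save_item("n1", {"id": "n1", "name": "first", "tags": ["a"]})
save_item("n1", {"id": "n1", "name": "renamed", "tags": []})  # upsert replaces the row
print(load_all()["n1"]["name"])  # renamed
```

The freeze-writes guard used by the restore flow would sit at the top of `save_item`; it is omitted here for brevity.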

@@ -1,4 +1,4 @@
-"""Color strip processing template storage using JSON files."""
+"""Color strip processing template storage using SQLite."""
 
 import uuid
 from datetime import datetime, timezone
@@ -6,32 +6,33 @@ from typing import List, Optional
 
 from wled_controller.core.filters.filter_instance import FilterInstance
 from wled_controller.core.filters.registry import FilterRegistry
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
 from wled_controller.storage.color_strip_processing_template import ColorStripProcessingTemplate
+from wled_controller.storage.database import Database
 from wled_controller.utils import get_logger
 
 logger = get_logger(__name__)
 
 
-class ColorStripProcessingTemplateStore(BaseJsonStore[ColorStripProcessingTemplate]):
+class ColorStripProcessingTemplateStore(BaseSqliteStore[ColorStripProcessingTemplate]):
     """Storage for color strip processing templates.
 
-    All templates are persisted to the JSON file.
+    All templates are persisted to the database.
     On startup, if no templates exist, a default one is auto-created.
     """
 
-    _json_key = "color_strip_processing_templates"
+    _table_name = "color_strip_processing_templates"
     _entity_name = "Color strip processing template"
     _version = "1.0.0"
 
-    def __init__(self, file_path: str):
-        super().__init__(file_path, ColorStripProcessingTemplate.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, ColorStripProcessingTemplate.from_dict)
         self._ensure_initial_template()
 
     # Backward-compatible aliases
-    get_all_templates = BaseJsonStore.get_all
-    get_template = BaseJsonStore.get
-    delete_template = BaseJsonStore.delete
+    get_all_templates = BaseSqliteStore.get_all
+    get_template = BaseSqliteStore.get
+    delete_template = BaseSqliteStore.delete
 
     def _ensure_initial_template(self) -> None:
         """Auto-create a default color strip processing template if none exist."""
@@ -96,7 +97,7 @@ class ColorStripProcessingTemplateStore(BaseJsonStore[ColorStripProcessingTempla
         )
 
         self._items[template_id] = template
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Created color strip processing template: {name} ({template_id})")
         return template
@@ -123,7 +124,7 @@ class ColorStripProcessingTemplateStore(BaseJsonStore[ColorStripProcessingTempla
         template.tags = tags
         template.updated_at = datetime.now(timezone.utc)
 
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Updated color strip processing template: {template_id}")
         return template

@@ -1,10 +1,11 @@
-"""Color strip source storage using JSON files."""
+"""Color strip source storage using SQLite."""
 
 import uuid
 from datetime import datetime, timezone
 from typing import List
 
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.utils import resolve_ref
 from wled_controller.storage.color_strip_source import (
     ColorStripSource,
@@ -17,18 +18,18 @@ from wled_controller.utils import get_logger
 
 logger = get_logger(__name__)
 
 
-class ColorStripStore(BaseJsonStore[ColorStripSource]):
+class ColorStripStore(BaseSqliteStore[ColorStripSource]):
     """Persistent storage for color strip sources."""
 
-    _json_key = "color_strip_sources"
+    _table_name = "color_strip_sources"
     _entity_name = "Color strip source"
 
-    def __init__(self, file_path: str):
-        super().__init__(file_path, ColorStripSource.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, ColorStripSource.from_dict)
 
     # Backward-compatible aliases
-    get_all_sources = BaseJsonStore.get_all
-    delete_source = BaseJsonStore.delete
+    get_all_sources = BaseSqliteStore.get_all
+    delete_source = BaseSqliteStore.delete
 
     def get_source(self, source_id: str) -> ColorStripSource:
         """Get a color strip source by ID (alias for get())."""
@@ -67,7 +68,7 @@ class ColorStripStore(BaseJsonStore[ColorStripSource]):
         )
 
         self._items[source_id] = source
-        self._save()
+        self._save_item(source_id, source)
         logger.info(f"Created color strip source: {name} ({source_id}, type={source_type})")
         return source
@@ -110,7 +111,7 @@ class ColorStripStore(BaseJsonStore[ColorStripSource]):
         source.apply_update(**kwargs)
         source.updated_at = datetime.now(timezone.utc)
 
-        self._save()
+        self._save_item(source_id, source)
         logger.info(f"Updated color strip source: {source_id}")
         return source

@@ -0,0 +1,322 @@
+"""SQLite database connection wrapper.
+
+Provides a thread-safe, WAL-mode SQLite connection shared by all stores.
+Each entity table uses the same schema: indexed columns for common queries
+plus a JSON blob for the full entity data.
+"""
+
+import json
+import sqlite3
+import threading
+from contextlib import contextmanager
+from pathlib import Path
+from typing import Any, Dict, List, Tuple
+
+from wled_controller.utils import get_logger
+
+logger = get_logger(__name__)
+
+# When True, all database writes are suppressed. Set by the restore flow
+# to prevent the old server process from overwriting freshly-restored data
+# with stale in-memory state before the restart completes.
+_writes_frozen = False
+
+
+def freeze_writes() -> None:
+    """Block all database writes until the process exits (used after restore)."""
+    global _writes_frozen
+    _writes_frozen = True
+    logger.info("Database writes frozen - awaiting server restart")
+
+
+def is_writes_frozen() -> bool:
+    """Check whether writes are currently frozen."""
+    return _writes_frozen
+
+
+# Schema version — bump when tables change
+_SCHEMA_VERSION = 1
+
+# All entity tables share this structure
+_ENTITY_TABLES = [
+    "devices",
+    "capture_templates",
+    "postprocessing_templates",
+    "picture_sources",
+    "output_targets",
+    "pattern_templates",
+    "color_strip_sources",
+    "audio_sources",
+    "audio_templates",
+    "value_sources",
+    "automations",
+    "scene_presets",
+    "sync_clocks",
+    "color_strip_processing_templates",
+    "gradients",
+    "weather_sources",
+]
+
+
+class Database:
+    """Thread-safe SQLite connection wrapper with WAL mode.
+
+    All stores share a single Database instance. The connection uses
+    WAL journaling for concurrent read access and a single writer lock.
+    """
+
+    def __init__(self, db_path: str | Path):
+        self._path = Path(db_path)
+        self._path.parent.mkdir(parents=True, exist_ok=True)
+        self._conn = sqlite3.connect(
+            str(self._path),
+            check_same_thread=False,
+        )
+        self._conn.row_factory = sqlite3.Row
+        self._conn.execute("PRAGMA journal_mode=WAL")
+        self._conn.execute("PRAGMA busy_timeout=5000")
+        self._lock = threading.RLock()
+        self._ensure_schema()
+        logger.info(f"Database opened: {self._path}")
+
+    # -- Schema management ---------------------------------------------------
+
+    def _ensure_schema(self) -> None:
+        """Create tables if they don't exist."""
+        with self._lock:
+            # Schema version tracking
+            self._conn.execute("""
+                CREATE TABLE IF NOT EXISTS schema_version (
+                    version INTEGER PRIMARY KEY,
+                    applied_at TEXT NOT NULL
+                )
+            """)
+            # Key-value settings table
+            self._conn.execute("""
+                CREATE TABLE IF NOT EXISTS settings (
+                    key TEXT PRIMARY KEY,
+                    value TEXT NOT NULL
+                )
+            """)
+            # Create entity tables
+            for table in _ENTITY_TABLES:
+                self._conn.execute(f"""
+                    CREATE TABLE IF NOT EXISTS [{table}] (
+                        id TEXT PRIMARY KEY,
+                        name TEXT NOT NULL DEFAULT '',
+                        data TEXT NOT NULL
+                    )
+                """)
+                self._conn.execute(
+                    f"CREATE INDEX IF NOT EXISTS idx_{table}_name ON [{table}](name)"
+                )
+            # Record schema version
+            existing = self._conn.execute(
+                "SELECT version FROM schema_version WHERE version = ?",
+                (_SCHEMA_VERSION,),
+            ).fetchone()
+            if not existing:
+                from datetime import datetime, timezone
+
+                self._conn.execute(
+                    "INSERT OR IGNORE INTO schema_version (version, applied_at) VALUES (?, ?)",
+                    (_SCHEMA_VERSION, datetime.now(timezone.utc).isoformat()),
+                )
+            self._conn.commit()
+
+    # -- Low-level operations ------------------------------------------------
+
+    def execute(self, sql: str, params: Tuple = ()) -> sqlite3.Cursor:
+        """Execute a single SQL statement (auto-commits)."""
+        with self._lock:
+            cursor = self._conn.execute(sql, params)
+            self._conn.commit()
+            return cursor
+
+    def execute_many(self, sql: str, params_list: List[Tuple]) -> None:
+        """Execute a parameterized statement for each params tuple."""
+        with self._lock:
+            self._conn.executemany(sql, params_list)
+            self._conn.commit()
+
+    @contextmanager
+    def transaction(self):
+        """Context manager for multi-statement transactions.
+
+        Usage::
+
+            with db.transaction() as conn:
+                conn.execute("INSERT ...", (...))
+                conn.execute("DELETE ...", (...))
+            # auto-committed on exit, rolled back on exception
+        """
+        with self._lock:
+            try:
+                yield self._conn
+                self._conn.commit()
+            except Exception:
+                self._conn.rollback()
+                raise
+
+    # -- Entity helpers (used by BaseSqliteStore) ----------------------------
+
+    def load_all(self, table: str) -> List[Dict[str, Any]]:
+        """Load all rows from an entity table.
+
+        Returns list of dicts parsed from the ``data`` JSON column.
+        """
+        with self._lock:
+            rows = self._conn.execute(
+                f"SELECT id, data FROM [{table}]"
+            ).fetchall()
+        result = []
+        for row in rows:
+            try:
+                item = json.loads(row["data"])
+                result.append(item)
+            except json.JSONDecodeError as e:
+                logger.error(f"Corrupt JSON in {table}/{row['id']}: {e}")
+        return result
+
+    def upsert(self, table: str, item_id: str, name: str, data: dict) -> None:
+        """Insert or replace a single entity row.
+
+        Skipped silently when writes are frozen.
+        """
+        if _writes_frozen:
+            return
+        json_data = json.dumps(data, ensure_ascii=False)
+        with self._lock:
+            self._conn.execute(
+                f"INSERT OR REPLACE INTO [{table}] (id, name, data) VALUES (?, ?, ?)",
+                (item_id, name, json_data),
+            )
+            self._conn.commit()
+
+    def delete_row(self, table: str, item_id: str) -> None:
+        """Delete a single entity row.
+
+        Skipped silently when writes are frozen.
+        """
+        if _writes_frozen:
+            return
+        with self._lock:
+            self._conn.execute(
+                f"DELETE FROM [{table}] WHERE id = ?", (item_id,)
+            )
+            self._conn.commit()
+
+    def delete_all(self, table: str) -> None:
+        """Delete all rows from an entity table.
+
+        Skipped silently when writes are frozen.
+        """
+        if _writes_frozen:
+            return
+        with self._lock:
+            self._conn.execute(f"DELETE FROM [{table}]")
+            self._conn.commit()
+
+    def bulk_insert(self, table: str, items: List[Tuple[str, str, str]]) -> None:
+        """Bulk insert rows: list of (id, name, data_json) tuples.
+
+        Skipped silently when writes are frozen.
+        """
+        if _writes_frozen:
+            return
+        with self._lock:
+            self._conn.executemany(
+                f"INSERT OR REPLACE INTO [{table}] (id, name, data) VALUES (?, ?, ?)",
+                items,
+            )
+            self._conn.commit()
+
+    def count(self, table: str) -> int:
+        """Count rows in an entity table."""
+        with self._lock:
+            row = self._conn.execute(
+                f"SELECT COUNT(*) as cnt FROM [{table}]"
+            ).fetchone()
+            return row["cnt"]
+
+    def table_exists_with_data(self, table: str) -> bool:
+        """Check if a table exists and has at least one row."""
+        with self._lock:
+            try:
+                row = self._conn.execute(
+                    f"SELECT COUNT(*) as cnt FROM [{table}]"
+                ).fetchone()
+                return row["cnt"] > 0
+            except sqlite3.OperationalError:
+                return False
+
+    # -- Settings (key-value) ------------------------------------------------
+
+    def get_setting(self, key: str) -> dict | None:
+        """Read a setting by key. Returns parsed JSON dict, or None if not found."""
+        with self._lock:
+            row = self._conn.execute(
+                "SELECT value FROM settings WHERE key = ?", (key,)
+            ).fetchone()
+        if row is None:
+            return None
+        try:
+            return json.loads(row["value"])
+        except json.JSONDecodeError:
+            return None
+
+    def set_setting(self, key: str, value: dict) -> None:
+        """Write a setting (upsert). Skipped when writes are frozen."""
+        if _writes_frozen:
+            return
+        json_value = json.dumps(value, ensure_ascii=False)
+        with self._lock:
+            self._conn.execute(
+                "INSERT OR REPLACE INTO settings (key, value) VALUES (?, ?)",
+                (key, json_value),
+            )
+            self._conn.commit()
+
+    # -- Backup --------------------------------------------------------------
+
+    def backup_to(self, dest_path: str | Path) -> None:
+        """Create a consistent snapshot of the database using SQLite's backup API.
+
+        Safe to call while the database is in use — SQLite handles locking.
+        """
+        dest_path = Path(dest_path)
+        dest_path.parent.mkdir(parents=True, exist_ok=True)
+        with self._lock:
+            dest = sqlite3.connect(str(dest_path))
+            try:
+                self._conn.backup(dest)
+            finally:
+                dest.close()
+
+    def restore_from(self, src_path: str | Path) -> None:
+        """Replace the database contents from a backup file.
+
+        The caller must restart the server after calling this — in-memory
+        caches in stores will be stale.
+        """
+        src_path = Path(src_path)
+        if not src_path.exists():
+            raise FileNotFoundError(f"Backup file not found: {src_path}")
+        with self._lock:
+            src = sqlite3.connect(str(src_path))
+            try:
+                src.backup(self._conn)
+            finally:
+                src.close()
+
+    # -- Lifecycle -----------------------------------------------------------
+
+    def close(self) -> None:
+        """Close the database connection."""
+        with self._lock:
+            self._conn.close()
+        logger.info("Database connection closed")
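
The `backup_to`/`restore_from` pair above is a thin wrapper around SQLite's online backup API (`sqlite3.Connection.backup`), which copies a consistent snapshot page by page under SQLite's own locking, even while the source is being written. A minimal standalone demonstration (in-memory databases stand in for `data/ledgrab.db` and a backup file):

```python
import sqlite3

# Source database standing in for the live ledgrab.db.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT NOT NULL)")
src.execute("INSERT INTO settings VALUES ('mqtt', '{\"host\": \"localhost\"}')")
src.commit()

# Online backup: SQLite copies every page to the destination connection.
snapshot = sqlite3.connect(":memory:")
src.backup(snapshot)

# The snapshot is an independent copy; later writes to src do not touch it.
src.execute("DELETE FROM settings")
src.commit()

row = snapshot.execute("SELECT value FROM settings WHERE key = 'mqtt'").fetchone()
print(row[0])  # {"host": "localhost"}
```

Restoring is the same call in the opposite direction (`backup_file_conn.backup(live_conn)`), which is why the restore flow freezes further writes and forces a restart: the stores' in-memory caches no longer match the database after the copy.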

@@ -1,12 +1,11 @@
-"""Device storage using JSON files."""
+"""Device storage using SQLite."""
 
-import json
 import uuid
 from datetime import datetime, timezone
-from pathlib import Path
 from typing import List, Optional
 
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.utils import get_logger
 
 logger = get_logger(__name__)
@@ -190,14 +189,14 @@ _UPDATABLE_FIELDS: frozenset[str] = frozenset({
 })
 
 
-class DeviceStore(BaseJsonStore[Device]):
+class DeviceStore(BaseSqliteStore[Device]):
     """Persistent storage for WLED devices."""
 
-    _json_key = "devices"
+    _table_name = "devices"
     _entity_name = "Device"
 
-    def __init__(self, storage_file: str | Path):
-        super().__init__(file_path=str(storage_file), deserializer=Device.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, Device.from_dict)
         logger.info(f"Device store initialized with {len(self._items)} devices")
 
     # ── Backward-compat aliases ──────────────────────────────────
@@ -278,7 +277,7 @@ class DeviceStore(BaseJsonStore[Device]):
         )
 
         self._items[device_id] = device
-        self._save()
+        self._save_item(device_id, device)
         logger.info(f"Created device {device_id}: {name}")
         return device
@@ -316,7 +315,7 @@ class DeviceStore(BaseJsonStore[Device]):
         new_device = Device(**device_fields)
 
         self._items[device_id] = new_device
-        self._save()
+        self._save_item(device_id, new_device)
         logger.info(f"Updated device {device_id}")
         return new_device
@@ -330,15 +329,5 @@ class DeviceStore(BaseJsonStore[Device]):
     def clear(self):
         """Clear all devices (for testing)."""
         self._items.clear()
-        self._save()
+        self._db.delete_all(self._table_name)
         logger.warning("Cleared all devices from storage")
-
-    def load_raw(self) -> dict:
-        """Load raw JSON data from storage (for migration)."""
-        if not self.file_path.exists():
-            return {}
-        try:
-            with open(self.file_path, "r") as f:
-                return json.load(f)
-        except Exception:
-            return {}

View File
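These store conversions all write through a shared `Database` object (`storage/database.py`), which this diff does not show. Below is a rough, self-contained sketch of what the calls used above imply — `delete_all`, per-row upserts behind `_save_item`, WAL mode, and the snapshot backups named in the commit message. Method names other than `delete_all` are assumptions, not taken from the real class:

```python
import sqlite3
import threading
from pathlib import Path


class Database:
    """Sketch of a thread-safe SQLite wrapper (the real one is not in this diff)."""

    def __init__(self, db_file: str):
        Path(db_file).parent.mkdir(parents=True, exist_ok=True)
        # check_same_thread=False plus an explicit lock allows use from worker threads.
        self._conn = sqlite3.connect(db_file, check_same_thread=False)
        self._lock = threading.Lock()
        # WAL lets readers proceed while a write is in progress.
        self._conn.execute("PRAGMA journal_mode=WAL")

    def ensure_table(self, table: str) -> None:
        with self._lock:
            self._conn.execute(
                f'CREATE TABLE IF NOT EXISTS "{table}" (id TEXT PRIMARY KEY, data TEXT NOT NULL)'
            )
            self._conn.commit()

    def upsert(self, table: str, item_id: str, payload: str) -> None:
        # One row per entity: _save_item only rewrites the touched row.
        with self._lock:
            self._conn.execute(
                f'INSERT INTO "{table}" (id, data) VALUES (?, ?) '
                "ON CONFLICT(id) DO UPDATE SET data = excluded.data",
                (item_id, payload),
            )
            self._conn.commit()

    def delete_all(self, table: str) -> None:
        with self._lock:
            self._conn.execute(f'DELETE FROM "{table}"')
            self._conn.commit()

    def backup_to(self, dest_file: str) -> None:
        # sqlite3's backup API yields a consistent snapshot even in WAL mode.
        with self._lock, sqlite3.connect(dest_file) as dest:
            self._conn.backup(dest)
```

The per-row upsert is what makes `_save_item(...)` a drop-in replacement for the old whole-file `_save()`.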

@@ -1,6 +1,6 @@
 """Gradient storage with built-in seeding.
-Provides CRUD for gradient entities. On first run (empty/missing file),
+Provides CRUD for gradient entities. On first run (empty/missing data),
 seeds 8 built-in gradients matching the legacy hardcoded palettes.
 Built-in gradients are read-only and cannot be deleted or modified.
 """
@@ -9,7 +9,8 @@ import uuid
 from datetime import datetime, timezone
 from typing import List, Optional
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.gradient import Gradient
 from wled_controller.utils import get_logger
@@ -43,12 +44,12 @@ def _tuples_to_stops(tuples: list) -> list:
     return [{"position": t[0], "color": [t[1], t[2], t[3]]} for t in tuples]
-class GradientStore(BaseJsonStore[Gradient]):
+class GradientStore(BaseSqliteStore[Gradient]):
-    _json_key = "gradients"
+    _table_name = "gradients"
     _entity_name = "Gradient"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, Gradient.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, Gradient.from_dict)
         if not self._items:
             self._seed_builtins()
@@ -70,7 +71,7 @@ class GradientStore(BaseJsonStore[Gradient]):
         logger.info(f"Seeded {len(_BUILTIN_DEFS)} built-in gradients")
     # Aliases
-    get_all_gradients = BaseJsonStore.get_all
+    get_all_gradients = BaseSqliteStore.get_all
     def get_gradient(self, gradient_id: str) -> Gradient:
         return self.get(gradient_id)
@@ -104,7 +105,7 @@ class GradientStore(BaseJsonStore[Gradient]):
             tags=tags or [],
         )
         self._items[gid] = gradient
-        self._save()
+        self._save_item(gid, gradient)
         logger.info(f"Created gradient: {name} ({gid})")
         return gradient
@@ -129,7 +130,7 @@ class GradientStore(BaseJsonStore[Gradient]):
         if tags is not None:
             gradient.tags = tags
         gradient.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(gradient_id, gradient)
         logger.info(f"Updated gradient: {gradient_id}")
         return gradient

View File
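`BaseSqliteStore` itself is also absent from this excerpt. The calls above — `_save_item`, `_delete_item`, the `_table_name` class attribute, and aliases bound directly to `BaseSqliteStore.get_all`/`get` — suggest a write-through in-memory cache over one table per entity type. A simplified sketch, using a bare `sqlite3.Connection` instead of the real `Database` wrapper so it stays self-contained:

```python
import json
import sqlite3
from typing import Callable, Dict, Generic, List, TypeVar

T = TypeVar("T")


class BaseSqliteStore(Generic[T]):
    """Sketch of the drop-in store: one table per entity, write-through dict cache."""

    _table_name: str = ""
    _entity_name: str = "Item"

    def __init__(self, conn: sqlite3.Connection, deserializer: Callable[[dict], T]):
        self._conn = conn
        conn.execute(
            f'CREATE TABLE IF NOT EXISTS "{self._table_name}" '
            "(id TEXT PRIMARY KEY, data TEXT NOT NULL)"
        )
        # Load everything once at startup; get()/get_all() never hit the database again.
        self._items: Dict[str, T] = {
            row_id: deserializer(json.loads(blob))
            for row_id, blob in conn.execute(f'SELECT id, data FROM "{self._table_name}"')
        }

    def get(self, item_id: str) -> T:
        return self._items[item_id]

    def get_all(self) -> List[T]:
        return list(self._items.values())

    def _save_item(self, item_id: str, item) -> None:
        # Write-through: only the touched row is rewritten, unlike the old
        # BaseJsonStore._save(), which reserialized the whole store every time.
        self._conn.execute(
            f'INSERT INTO "{self._table_name}" (id, data) VALUES (?, ?) '
            "ON CONFLICT(id) DO UPDATE SET data = excluded.data",
            (item_id, json.dumps(item.to_dict())),
        )
        self._conn.commit()

    def _delete_item(self, item_id: str) -> None:
        self._conn.execute(f'DELETE FROM "{self._table_name}" WHERE id = ?', (item_id,))
        self._conn.commit()
```

Because the public surface (`get`, `get_all`, `delete`, the `_items` dict) is unchanged, each store conversion in this commit is mostly a matter of swapping `_save()` calls for `_save_item(...)`.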

@@ -1,10 +1,11 @@
-"""Output target storage using JSON files."""
+"""Output target storage using SQLite."""
 import uuid
 from datetime import datetime, timezone
 from typing import List, Optional
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.output_target import OutputTarget
 from wled_controller.storage.wled_output_target import WledOutputTarget
 from wled_controller.storage.key_colors_output_target import (
@@ -18,20 +19,19 @@ logger = get_logger(__name__)
 DEFAULT_STATE_CHECK_INTERVAL = 30  # seconds
-class OutputTargetStore(BaseJsonStore[OutputTarget]):
+class OutputTargetStore(BaseSqliteStore[OutputTarget]):
     """Persistent storage for output targets."""
-    _json_key = "output_targets"
+    _table_name = "output_targets"
     _entity_name = "Output target"
-    _legacy_json_keys = ["picture_targets"]
-    def __init__(self, file_path: str):
-        super().__init__(file_path, OutputTarget.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, OutputTarget.from_dict)
     # Backward-compatible aliases
-    get_all_targets = BaseJsonStore.get_all
-    get_target = BaseJsonStore.get
-    delete_target = BaseJsonStore.delete
+    get_all_targets = BaseSqliteStore.get_all
+    get_target = BaseSqliteStore.get
+    delete_target = BaseSqliteStore.delete
     def create_target(
         self,
@@ -101,7 +101,7 @@ class OutputTargetStore(BaseJsonStore[OutputTarget]):
         target.tags = tags or []
         self._items[target_id] = target
-        self._save()
+        self._save_item(target_id, target)
         logger.info(f"Created output target: {name} ({target_id}, type={target_type})")
         return target
@@ -156,7 +156,7 @@ class OutputTargetStore(BaseJsonStore[OutputTarget]):
         )
         target.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(target_id, target)
         logger.info(f"Updated output target: {target_id}")
         return target

View File

@@ -1,10 +1,11 @@
-"""Pattern template storage using JSON files."""
+"""Pattern template storage using SQLite."""
 import uuid
 from datetime import datetime, timezone
 from typing import List, Optional
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.key_colors_output_target import KeyColorRectangle
 from wled_controller.storage.pattern_template import PatternTemplate
 from wled_controller.utils import get_logger
@@ -12,24 +13,24 @@ from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class PatternTemplateStore(BaseJsonStore[PatternTemplate]):
+class PatternTemplateStore(BaseSqliteStore[PatternTemplate]):
     """Storage for pattern templates (rectangle layouts for key color extraction).
-    All templates are persisted to the JSON file.
+    All templates are persisted to the database.
     On startup, if no templates exist, a default one is auto-created.
     """
-    _json_key = "pattern_templates"
+    _table_name = "pattern_templates"
     _entity_name = "Pattern template"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, PatternTemplate.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, PatternTemplate.from_dict)
         self._ensure_initial_template()
     # Backward-compatible aliases
-    get_all_templates = BaseJsonStore.get_all
-    get_template = BaseJsonStore.get
-    delete_template = BaseJsonStore.delete
+    get_all_templates = BaseSqliteStore.get_all
+    get_template = BaseSqliteStore.get
+    delete_template = BaseSqliteStore.delete
     def _ensure_initial_template(self) -> None:
         """Auto-create a default pattern template if none exist."""
@@ -80,7 +81,7 @@ class PatternTemplateStore(BaseJsonStore[PatternTemplate]):
         )
         self._items[template_id] = template
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Created pattern template: {name} ({template_id})")
         return template
@@ -106,7 +107,7 @@ class PatternTemplateStore(BaseJsonStore[PatternTemplate]):
         template.tags = tags
         template.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Updated pattern template: {template_id}")
         return template

View File

@@ -1,10 +1,11 @@
-"""Picture source storage using JSON files."""
+"""Picture source storage using SQLite."""
 import uuid
 from datetime import datetime, timezone
 from typing import List, Optional, Set
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.utils import resolve_ref
 from wled_controller.storage.picture_source import (
     PictureSource,
@@ -18,26 +19,26 @@ from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class PictureSourceStore(BaseJsonStore[PictureSource]):
+class PictureSourceStore(BaseSqliteStore[PictureSource]):
     """Storage for picture sources.
     Supports raw and processed stream types with cycle detection
     for processed streams that reference other streams.
     """
-    _json_key = "picture_sources"
+    _table_name = "picture_sources"
     _entity_name = "Picture source"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, PictureSource.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, PictureSource.from_dict)
     # Backward-compatible aliases
-    get_all_sources = BaseJsonStore.get_all
-    get_source = BaseJsonStore.get
+    get_all_sources = BaseSqliteStore.get_all
+    get_source = BaseSqliteStore.get
     # Legacy aliases (old code used "stream" naming)
-    get_all_streams = BaseJsonStore.get_all
-    get_stream = BaseJsonStore.get
+    get_all_streams = BaseSqliteStore.get_all
+    get_stream = BaseSqliteStore.get
     # ── Helpers ───────────────────────────────────────────────────────
@@ -171,7 +172,7 @@ class PictureSourceStore(BaseJsonStore[PictureSource]):
         )
         self._items[stream_id] = stream
-        self._save()
+        self._save_item(stream_id, stream)
         logger.info(f"Created picture source: {name} ({stream_id}, type={stream_type})")
         return stream
@@ -255,7 +256,7 @@ class PictureSourceStore(BaseJsonStore[PictureSource]):
         stream.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(stream_id, stream)
         logger.info(f"Updated picture source: {stream_id}")
         return stream
@@ -278,7 +279,7 @@ class PictureSourceStore(BaseJsonStore[PictureSource]):
         )
         del self._items[stream_id]
-        self._save()
+        self._delete_item(stream_id)
         logger.info(f"Deleted picture source: {stream_id}")

View File
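The commit description also mentions `storage/migration.py`, which auto-imports the legacy JSON files into SQLite on first run; that file is not part of this excerpt. A hypothetical one-shot migration for a single store, assuming the legacy `BaseJsonStore` layout of `{"<collection>": {"<id>": {...}, ...}}` (the layout is a guess inferred from the removed `load_raw()` helper, not confirmed by this diff):

```python
import json
from pathlib import Path


def migrate_json_store(json_file: Path, conn, table: str) -> int:
    """Import a legacy BaseJsonStore file into a SQLite table; returns item count."""
    if not json_file.exists():
        return 0
    raw = json.loads(json_file.read_text())
    # Legacy files keyed all items under one collection name (e.g. "devices").
    items = next(iter(raw.values()), {}) if raw else {}
    conn.execute(
        f'CREATE TABLE IF NOT EXISTS "{table}" (id TEXT PRIMARY KEY, data TEXT NOT NULL)'
    )
    for item_id, item in items.items():
        # INSERT OR REPLACE keeps the migration idempotent across restarts.
        conn.execute(
            f'INSERT OR REPLACE INTO "{table}" (id, data) VALUES (?, ?)',
            (item_id, json.dumps(item)),
        )
    conn.commit()
    return len(items)
```

A real migration would also need to handle the `_legacy_json_keys` renames seen above (e.g. `picture_targets` → `output_targets`) and archive or delete the source files afterwards.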

@@ -1,4 +1,4 @@
-"""Postprocessing template storage using JSON files."""
+"""Postprocessing template storage using SQLite."""
 import uuid
 from datetime import datetime, timezone
@@ -6,7 +6,8 @@ from typing import List, Optional
 from wled_controller.core.filters.filter_instance import FilterInstance
 from wled_controller.core.filters.registry import FilterRegistry
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.picture_source import ProcessedPictureSource
 from wled_controller.storage.postprocessing_template import PostprocessingTemplate
 from wled_controller.utils import get_logger
@@ -14,25 +15,25 @@ from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class PostprocessingTemplateStore(BaseJsonStore[PostprocessingTemplate]):
+class PostprocessingTemplateStore(BaseSqliteStore[PostprocessingTemplate]):
     """Storage for postprocessing templates.
-    All templates are persisted to the JSON file.
+    All templates are persisted to the database.
     On startup, if no templates exist, a default one is auto-created.
     """
-    _json_key = "postprocessing_templates"
+    _table_name = "postprocessing_templates"
     _entity_name = "Postprocessing template"
     _version = "2.0.0"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, PostprocessingTemplate.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, PostprocessingTemplate.from_dict)
         self._ensure_initial_template()
     # Backward-compatible aliases
-    get_all_templates = BaseJsonStore.get_all
-    get_template = BaseJsonStore.get
-    delete_template = BaseJsonStore.delete
+    get_all_templates = BaseSqliteStore.get_all
+    get_template = BaseSqliteStore.get
+    delete_template = BaseSqliteStore.delete
     def _ensure_initial_template(self) -> None:
         """Auto-create a default postprocessing template if none exist."""
@@ -90,7 +91,7 @@ class PostprocessingTemplateStore(BaseJsonStore[PostprocessingTemplate]):
         )
         self._items[template_id] = template
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Created postprocessing template: {name} ({template_id})")
         return template
@@ -120,7 +121,7 @@ class PostprocessingTemplateStore(BaseJsonStore[PostprocessingTemplate]):
         template.tags = tags
         template.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Updated postprocessing template: {template_id}")
         return template

View File

@@ -1,27 +1,28 @@
-"""Scene preset storage using JSON files."""
+"""Scene preset storage using SQLite."""
 from datetime import datetime, timezone
 from typing import List, Optional
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.scene_preset import ScenePreset, TargetSnapshot
 from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class ScenePresetStore(BaseJsonStore[ScenePreset]):
+class ScenePresetStore(BaseSqliteStore[ScenePreset]):
     """Persistent storage for scene presets."""
-    _json_key = "scene_presets"
+    _table_name = "scene_presets"
     _entity_name = "Scene preset"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, ScenePreset.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, ScenePreset.from_dict)
     # Backward-compatible aliases
-    get_preset = BaseJsonStore.get
-    delete_preset = BaseJsonStore.delete
+    get_preset = BaseSqliteStore.get
+    delete_preset = BaseSqliteStore.delete
     def get_all_presets(self) -> List[ScenePreset]:
         """Get all presets sorted by order field."""
@@ -35,7 +36,7 @@ class ScenePresetStore(BaseJsonStore[ScenePreset]):
         self._check_name_unique(preset.name)
         self._items[preset.id] = preset
-        self._save()
+        self._save_item(preset.id, preset)
         logger.info(f"Created scene preset: {preset.name} ({preset.id})")
         return preset
@@ -63,7 +64,7 @@ class ScenePresetStore(BaseJsonStore[ScenePreset]):
         preset.tags = tags
         preset.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(preset_id, preset)
         logger.info(f"Updated scene preset: {preset_id}")
         return preset
@@ -73,6 +74,6 @@ class ScenePresetStore(BaseJsonStore[ScenePreset]):
         existing.targets = preset.targets
         existing.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(preset_id, existing)
         logger.info(f"Recaptured scene preset: {preset_id}")
         return existing

View File

@@ -1,27 +1,28 @@
-"""Synchronization clock storage using JSON files."""
+"""Synchronization clock storage."""
 import uuid
 from datetime import datetime, timezone
 from typing import List, Optional
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.sync_clock import SyncClock
 from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class SyncClockStore(BaseJsonStore[SyncClock]):
+class SyncClockStore(BaseSqliteStore[SyncClock]):
-    _json_key = "sync_clocks"
+    _table_name = "sync_clocks"
     _entity_name = "Sync clock"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, SyncClock.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, SyncClock.from_dict)
     # Backward-compatible aliases
-    get_all_clocks = BaseJsonStore.get_all
-    get_clock = BaseJsonStore.get
-    delete_clock = BaseJsonStore.delete
+    get_all_clocks = BaseSqliteStore.get_all
+    get_clock = BaseSqliteStore.get
+    delete_clock = BaseSqliteStore.delete
     def create_clock(
         self,
@@ -45,7 +46,7 @@ class SyncClockStore(BaseJsonStore[SyncClock]):
         )
         self._items[cid] = clock
-        self._save()
+        self._save_item(cid, clock)
         logger.info(f"Created sync clock: {name} ({cid}, speed={clock.speed})")
         return clock
@@ -70,6 +71,6 @@ class SyncClockStore(BaseJsonStore[SyncClock]):
         clock.tags = tags
         clock.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(clock_id, clock)
         logger.info(f"Updated sync clock: {clock_id}")
         return clock

View File

@@ -1,36 +1,37 @@
-"""Template storage using JSON files."""
+"""Template storage using SQLite."""
 import uuid
 from datetime import datetime, timezone
 from typing import Any, Dict, List, Optional
 from wled_controller.core.capture_engines.factory import EngineRegistry
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.template import CaptureTemplate
 from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class TemplateStore(BaseJsonStore[CaptureTemplate]):
+class TemplateStore(BaseSqliteStore[CaptureTemplate]):
     """Storage for capture templates.
-    All templates are persisted to the JSON file.
+    All templates are persisted to the database.
     On startup, if no templates exist, one is auto-created using the
     highest-priority available engine.
     """
-    _json_key = "templates"
+    _table_name = "capture_templates"
     _entity_name = "Capture template"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, CaptureTemplate.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, CaptureTemplate.from_dict)
         self._ensure_initial_template()
     # Backward-compatible aliases
-    get_all_templates = BaseJsonStore.get_all
-    get_template = BaseJsonStore.get
-    delete_template = BaseJsonStore.delete
+    get_all_templates = BaseSqliteStore.get_all
+    get_template = BaseSqliteStore.get
+    delete_template = BaseSqliteStore.delete
     def _ensure_initial_template(self) -> None:
         """Auto-create a template if none exist, using the best available engine."""
@@ -85,7 +86,7 @@ class TemplateStore(BaseJsonStore[CaptureTemplate]):
         )
         self._items[template_id] = template
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Created template: {name} ({template_id})")
         return template
@@ -114,7 +115,7 @@ class TemplateStore(BaseJsonStore[CaptureTemplate]):
         template.tags = tags
         template.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(template_id, template)
         logger.info(f"Updated template: {template_id}")
         return template

View File

@@ -1,10 +1,11 @@
-"""Value source storage using JSON files."""
+"""Value source storage using SQLite."""
 import uuid
 from datetime import datetime, timezone
 from typing import List, Optional
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.utils import resolve_ref
 from wled_controller.storage.value_source import (
     AdaptiveValueSource,
@@ -19,19 +20,19 @@ from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class ValueSourceStore(BaseJsonStore[ValueSource]):
+class ValueSourceStore(BaseSqliteStore[ValueSource]):
     """Persistent storage for value sources."""
-    _json_key = "value_sources"
+    _table_name = "value_sources"
     _entity_name = "Value source"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, ValueSource.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, ValueSource.from_dict)
     # Backward-compatible aliases
-    get_all_sources = BaseJsonStore.get_all
-    get_source = BaseJsonStore.get
-    delete_source = BaseJsonStore.delete
+    get_all_sources = BaseSqliteStore.get_all
+    get_source = BaseSqliteStore.get
+    delete_source = BaseSqliteStore.delete
     # ── CRUD ─────────────────────────────────────────────────────────
@@ -128,7 +129,7 @@ class ValueSourceStore(BaseJsonStore[ValueSource]):
         )
         self._items[sid] = source
-        self._save()
+        self._save_item(sid, source)
         logger.info(f"Created value source: {name} ({sid}, type={source_type})")
         return source
@@ -223,7 +224,7 @@ class ValueSourceStore(BaseJsonStore[ValueSource]):
         source.max_value = max_value
         source.updated_at = datetime.now(timezone.utc)
-        self._save()
+        self._save_item(source_id, source)
         logger.info(f"Updated value source: {source_id}")
         return source

View File

@@ -1,29 +1,30 @@
-"""Weather source storage using JSON files."""
+"""Weather source storage using SQLite."""
 import uuid
 from datetime import datetime, timezone
 from typing import List, Optional
-from wled_controller.storage.base_store import BaseJsonStore
+from wled_controller.storage.base_sqlite_store import BaseSqliteStore
+from wled_controller.storage.database import Database
 from wled_controller.storage.weather_source import WeatherSource
 from wled_controller.utils import get_logger
 logger = get_logger(__name__)
-class WeatherSourceStore(BaseJsonStore[WeatherSource]):
+class WeatherSourceStore(BaseSqliteStore[WeatherSource]):
     """Persistent storage for weather sources."""
-    _json_key = "weather_sources"
+    _table_name = "weather_sources"
     _entity_name = "Weather source"
-    def __init__(self, file_path: str):
-        super().__init__(file_path, WeatherSource.from_dict)
+    def __init__(self, db: Database):
+        super().__init__(db, WeatherSource.from_dict)
     # Backward-compatible aliases
-    get_all_sources = BaseJsonStore.get_all
-    get_source = BaseJsonStore.get
-    delete_source = BaseJsonStore.delete
+    get_all_sources = BaseSqliteStore.get_all
+    get_source = BaseSqliteStore.get
+    delete_source = BaseSqliteStore.delete
     def create_source(
         self,
@@ -67,7 +68,7 @@ class WeatherSourceStore(BaseJsonStore[WeatherSource]):
         )
         self._items[sid] = source
-        self._save()
+        self._save_item(sid, source)
         logger.info(f"Created weather source: {name} ({sid})")
         return source
@@ -115,6 +116,6 @@ class WeatherSourceStore(BaseJsonStore[WeatherSource]):
         )
         self._items[source_id] = updated
-        self._save()
+        self._save_item(source_id, updated)
         logger.info(f"Updated weather source: {updated.name} ({source_id})")
         return updated

View File

@@ -92,47 +92,11 @@
 <button type="button" class="hint-toggle" onclick="toggleHint(this)" title="?">?</button>
 </div>
 <small class="input-hint" style="display:none" data-i18n="settings.restore.hint">Upload a previously downloaded backup file to replace all configuration. The server will restart automatically.</small>
-<input type="file" id="settings-restore-input" accept=".json" style="display:none" onchange="handleRestoreFileSelected(this)">
+<input type="file" id="settings-restore-input" accept=".db" style="display:none" onchange="handleRestoreFileSelected(this)">
 <button class="btn btn-danger" onclick="document.getElementById('settings-restore-input').click()" style="width:100%" data-i18n="settings.restore.button">Restore from Backup</button>
 </div>
-<!-- Partial Export/Import section -->
-<div class="form-group">
-<div class="label-row">
-<label data-i18n="settings.partial.label">Partial Export / Import</label>
-<button type="button" class="hint-toggle" onclick="toggleHint(this)" title="?">?</button>
-</div>
-<small class="input-hint" style="display:none" data-i18n="settings.partial.hint">Export or import a single entity type. Import replaces or merges existing data and restarts the server.</small>
-<div style="display:flex;gap:0.5rem;margin-bottom:0.5rem;">
-<select id="settings-partial-store" style="flex:1">
-<option value="devices" data-i18n="settings.partial.store.devices">Devices</option>
-<option value="output_targets" data-i18n="settings.partial.store.output_targets">LED Targets</option>
-<option value="color_strip_sources" data-i18n="settings.partial.store.color_strip_sources">Color Strips</option>
-<option value="picture_sources" data-i18n="settings.partial.store.picture_sources">Picture Sources</option>
-<option value="audio_sources" data-i18n="settings.partial.store.audio_sources">Audio Sources</option>
-<option value="audio_templates" data-i18n="settings.partial.store.audio_templates">Audio Templates</option>
-<option value="capture_templates" data-i18n="settings.partial.store.capture_templates">Capture Templates</option>
-<option value="postprocessing_templates" data-i18n="settings.partial.store.postprocessing_templates">Post-processing Templates</option>
-<option value="color_strip_processing_templates" data-i18n="settings.partial.store.color_strip_processing_templates">CSS Processing Templates</option>
-<option value="pattern_templates" data-i18n="settings.partial.store.pattern_templates">Pattern Templates</option>
-<option value="value_sources" data-i18n="settings.partial.store.value_sources">Value Sources</option>
-<option value="sync_clocks" data-i18n="settings.partial.store.sync_clocks">Sync Clocks</option>
-<option value="automations" data-i18n="settings.partial.store.automations">Automations</option>
-<option value="scene_presets" data-i18n="settings.partial.store.scene_presets">Scene Presets</option>
-</select>
-<button class="btn btn-secondary" onclick="downloadPartialExport()" data-i18n="settings.partial.export_button">Export</button>
-</div>
-<div style="display:flex;align-items:center;gap:0.5rem;margin-bottom:0.5rem;">
-<input type="checkbox" id="settings-partial-merge">
-<label for="settings-partial-merge" style="margin:0;font-size:0.85rem;" data-i18n="settings.partial.merge_label">Merge (add/overwrite, keep existing)</label>
-</div>
-<input type="file" id="settings-partial-import-input" accept=".json" style="display:none" onchange="handlePartialImportFileSelected(this)">
-<button class="btn btn-secondary" onclick="document.getElementById('settings-partial-import-input').click()" style="width:100%" data-i18n="settings.partial.import_button">Import from File</button>
-</div>
 <!-- Auto-Backup section -->
 <div class="form-group">
 <div class="label-row">
@@ -164,7 +128,7 @@
 </div>
 </div>
-<button class="btn btn-primary" onclick="saveAutoBackupSettings()" style="width:100%" data-i18n="settings.auto_backup.save">Save Settings</button>
+<div style="display:flex; gap:0.5rem;">
+<button class="btn btn-primary" onclick="saveAutoBackupSettings()" style="flex:1" data-i18n="settings.auto_backup.save">Save Settings</button>
+<button class="btn btn-secondary" onclick="triggerBackupNow()" style="flex:1" data-i18n="settings.auto_backup.backup_now">Backup Now</button>
+</div>
 <div id="auto-backup-status" style="font-size:0.85rem; color:var(--text-muted); margin-top:0.5rem;"></div>
 </div>
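The new Backup Now button and the `.db` restore input both rest on SQLite snapshot files. The sqlite3 backup API can take a consistent point-in-time copy while the server keeps the database open, even in WAL mode; the backend handler behind `triggerBackupNow()` is not shown in this diff, so the function and file names below are illustrative only:

```python
import sqlite3
from datetime import datetime, timezone
from pathlib import Path


def snapshot_database(conn: sqlite3.Connection, backup_dir: str) -> Path:
    """Write a consistent point-in-time copy of the live database.

    Safe while other threads keep writing: the backup API re-copies pages
    that change mid-backup, and the resulting .db file is self-contained
    (no sidecar -wal/-shm files are needed to restore it).
    """
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_dir) / f"backup-{stamp}.db"
    dest.parent.mkdir(parents=True, exist_ok=True)
    with sqlite3.connect(dest) as target:
        conn.backup(target)
    return dest
```

This is also why restore can be a plain file replacement followed by a server restart: a snapshot produced this way is an ordinary standalone SQLite database.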