diff --git a/CODEBASE_REVIEW.md b/CODEBASE_REVIEW.md new file mode 100644 index 0000000..38e42b2 --- /dev/null +++ b/CODEBASE_REVIEW.md @@ -0,0 +1,54 @@ +# Codebase Review — 2026-02-26 + +Findings from full codebase review. Items ordered by priority within each category. + +## Stability (Critical) + +- [x] **Fatal loop exception leaks resources** — Added outer `try/except/finally` with `self._running = False` to all 10 processing loop methods across `live_stream.py`, `color_strip_stream.py`, `effect_stream.py`, `audio_stream.py`, `composite_stream.py`, `mapped_stream.py`. Also added per-iteration `try/except` where missing. +- [x] **`_is_running` flag cleanup** — Fixed via `finally: self._running = False` in all loop methods. *(Race condition via `threading.Event` deferred — current pattern sufficient with the finally block.)* +- [x] **`ColorStripStreamManager` thread safety** — **FALSE POSITIVE**: All access is from the async event loop; methods are synchronous with no `await` points, so no preemption is possible. +- [x] **Audio `stream.stop()` called under lock** — Moved `stream.stop()` outside lock scope in both `release()` and `release_all()` in `audio_capture.py`. +- [x] **WS accept-before-validate** — **FALSE POSITIVE**: All WS endpoints validate auth and resolve configs BEFORE calling `websocket.accept()`. +- [x] **Capture error no-backoff** — Added consecutive error counter with exponential backoff (`min(1.0, 0.1 * (errors - 5))`) in `ScreenCaptureLiveStream._capture_loop()`. +- [ ] **WGC session close not detected** — Deferred (Windows-specific edge case, low priority). +- [x] **`LiveStreamManager.acquire()` not thread-safe** — **FALSE POSITIVE**: Same as ColorStripStreamManager — all access from async event loop, no await in methods. + +## Performance (High Impact) + +- [x] **Per-pixel Python loop in `send_pixels()`** — Replaced per-pixel Python loop with `np.array().tobytes()` in `ddp_client.py`. Hot path already uses `send_pixels_numpy()`. 
+- [ ] **WGC 6MB frame allocation per callback** — Deferred (Windows-specific, requires WGC API changes). +- [x] **Gradient rendering O(LEDs×Stops) Python loop** — Vectorized with NumPy: `np.searchsorted` for stop lookup + vectorized interpolation in `_compute_gradient_colors()`. +- [x] **`PixelateFilter` nested Python loop** — Replaced with `cv2.resize` down (INTER_AREA) + up (INTER_NEAREST) — pure C++ backend. +- [x] **`DownscalerFilter` double allocation** — **FALSE POSITIVE**: Already uses single `cv2.resize()` call (vectorized C++). +- [x] **`SaturationFilter` ~25MB temp arrays** — **FALSE POSITIVE**: Already uses pre-allocated scratch buffer and vectorized in-place numpy. +- [x] **`FrameInterpolationFilter` copies full image** — **FALSE POSITIVE**: Already uses vectorized numpy integer blending with image pool. +- [x] **`datetime.utcnow()` per frame** — **LOW IMPACT**: ~1-2μs per call, negligible at 60fps. Deprecation tracked under Backend Quality. +- [x] **Unbounded diagnostic lists** — **FALSE POSITIVE**: Lists are cleared every 5 seconds (~300 entries max at 60fps). Trivial memory. + +## Frontend Quality + +- [x] **`lockBody()`/`unlockBody()` not re-entrant** — Added `_lockCount` reference counter and `_savedScrollY` in `ui.js`. First lock saves scroll, last unlock restores. +- [x] **XSS via unescaped engine config keys** — **FALSE POSITIVE**: Both capture template and audio template card renderers already use `escapeHtml()` on keys and values. +- [x] **LED preview WS `onclose` not nulled** — Added `ws.onclose = null` before `ws.close()` in `disconnectLedPreviewWS()` in `targets.js`. +- [x] **`fetchWithAuth` retry adds duplicate listeners** — Added `{ once: true }` to abort signal listener in `api.js`. +- [x] **Audio `requestAnimationFrame` loop continues after WS close** — **FALSE POSITIVE**: Loop already checks `testAudioModal.isOpen` before scheduling next frame, and `_cleanupTest()` cancels the animation frame. 
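
The consecutive-error backoff formula quoted under Stability (`min(1.0, 0.1 * (errors - 5))`) can be sketched as a standalone function. This is a minimal illustration of the formula as stated in the item, not the actual code from `ScreenCaptureLiveStream._capture_loop()`; note the ramp is linear with a 1.0s cap (despite the "exponential" label), with the first five consecutive errors incurring no delay:

```python
def backoff_delay(consecutive_errors: int) -> float:
    """Delay (seconds) to sleep after repeated capture errors.

    First 5 consecutive errors: no delay. Beyond that, 0.1s per
    additional error, capped at 1.0s -- i.e. min(1.0, 0.1 * (errors - 5)).
    A successful capture resets the counter to 0 elsewhere in the loop.
    """
    if consecutive_errors <= 5:
        return 0.0
    return min(1.0, 0.1 * (consecutive_errors - 5))
```

A loop using this would sleep for `backoff_delay(errors)` after each failed capture and reset `errors = 0` on success, so transient glitches cost nothing while a persistently failing source settles at one retry per second.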
+ +## Backend Quality + +- [ ] **No thread-safety in `JsonStore`** — Deferred (low risk — all stores are accessed from async event loop). +- [x] **Auth token prefix logged** — Removed token prefix from log message in `auth.py`. Now logs only "Invalid API key attempt". +- [ ] **Duplicate capture/test code** — Deferred (code duplication, not a bug — refactoring would reduce LOC but doesn't fix a defect). +- [x] **Update methods allow duplicate names** — Added name uniqueness checks to `update_template` in `template_store.py`, `postprocessing_template_store.py`, `audio_template_store.py`, `pattern_template_store.py`, and `update_profile` in `profile_store.py`. Also added missing check to `create_profile`. +- [ ] **Routes access `manager._private` attrs** — Deferred (stylistic, not a bug — would require adding public accessor methods). +- [x] **Non-atomic file writes** — Created `utils/file_ops.py` with `atomic_write_json()` helper (tempfile + `os.replace`). Updated all 10 store files. +- [ ] **444 f-string logger calls** — Deferred (performance impact negligible — Python evaluates f-strings very fast; lazy `%s` formatting only matters at very high call rates). +- [x] **`get_source()` silent bug** — Fixed: `color_strip_sources.py:_resolve_display_index()` called `picture_source_store.get_source()` which doesn't exist (should be `get_stream()`). Was silently returning `0` for display index. +- [ ] **`get_config()` race** — Deferred (low risk — config changes are infrequent user-initiated operations). +- [ ] **`datetime.utcnow()` deprecated** — Deferred (functional, deprecation warning only appears in Python 3.12+). +- [x] **Inconsistent DELETE status codes** — Changed `audio_sources.py` and `value_sources.py` DELETE endpoints from 200 to 204 (matching all other DELETE endpoints). + +## Architecture (Observations, no action needed) + +**Strengths**: Clean layered design, plugin registries, reference-counted stream sharing, consistent API patterns. 
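
The `atomic_write_json()` helper noted under Backend Quality (tempfile + `os.replace`) might look roughly like the sketch below. The exact signature and behavior of the real helper in `utils/file_ops.py` are not shown in this diff, so the details here (fsync, cleanup on failure) are assumptions about a typical implementation of the pattern:

```python
import json
import os
import tempfile

def atomic_write_json(path: str, data) -> None:
    """Write JSON atomically: dump to a temp file in the same directory,
    then os.replace() it over the target. os.replace is an atomic rename
    on both POSIX and Windows, so readers never observe a partial file.
    """
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            json.dump(data, f, indent=2)
            f.flush()
            os.fsync(f.fileno())  # ensure bytes hit disk before the rename
        os.replace(tmp_path, path)
    except BaseException:
        os.unlink(tmp_path)  # don't leave orphaned temp files behind
        raise
```

The temp file must live in the same directory as the target: `os.replace` across filesystems is not atomic (and may fail outright), whereas a same-directory rename is.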
+ +**Weaknesses**: No backpressure (slow consumers buffer frames), thread count grows linearly, config global singleton, reference counting races. diff --git a/server/src/wled_controller/api/auth.py b/server/src/wled_controller/api/auth.py index d6e0fd4..9bcade2 100644 --- a/server/src/wled_controller/api/auth.py +++ b/server/src/wled_controller/api/auth.py @@ -59,7 +59,7 @@ def verify_api_key( break if not authenticated_as: - logger.warning(f"Invalid API key attempt: {token[:8]}...") + logger.warning("Invalid API key attempt") raise HTTPException( status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid API key", diff --git a/server/src/wled_controller/api/routes/_test_helpers.py b/server/src/wled_controller/api/routes/_test_helpers.py index 8a55f60..2803e03 100644 --- a/server/src/wled_controller/api/routes/_test_helpers.py +++ b/server/src/wled_controller/api/routes/_test_helpers.py @@ -129,7 +129,7 @@ async def stream_capture_test( done_event.set() # Start capture in background thread - loop = asyncio.get_event_loop() + loop = asyncio.get_running_loop() capture_future = loop.run_in_executor(None, _capture_loop) start_time = time.perf_counter() @@ -142,6 +142,8 @@ async def stream_capture_test( # Check for init error if init_error: + stop_event.set() + await capture_future await websocket.send_json({"type": "error", "detail": init_error}) return diff --git a/server/src/wled_controller/api/routes/audio_sources.py b/server/src/wled_controller/api/routes/audio_sources.py index cfba785..0a37cf0 100644 --- a/server/src/wled_controller/api/routes/audio_sources.py +++ b/server/src/wled_controller/api/routes/audio_sources.py @@ -125,7 +125,7 @@ async def update_audio_source( raise HTTPException(status_code=400, detail=str(e)) -@router.delete("/api/v1/audio-sources/{source_id}", tags=["Audio Sources"]) +@router.delete("/api/v1/audio-sources/{source_id}", status_code=204, tags=["Audio Sources"]) async def delete_audio_source( source_id: str, _auth: AuthRequired, @@ 
-143,7 +143,6 @@ async def delete_audio_source( ) store.delete_source(source_id) - return {"status": "deleted", "id": source_id} except ValueError as e: raise HTTPException(status_code=400, detail=str(e)) diff --git a/server/src/wled_controller/api/routes/color_strip_sources.py b/server/src/wled_controller/api/routes/color_strip_sources.py index 5843d65..15964a9 100644 --- a/server/src/wled_controller/api/routes/color_strip_sources.py +++ b/server/src/wled_controller/api/routes/color_strip_sources.py @@ -103,7 +103,7 @@ def _resolve_display_index(picture_source_id: str, picture_source_store: Picture if not picture_source_id or depth > 5: return 0 try: - ps = picture_source_store.get_source(picture_source_id) + ps = picture_source_store.get_stream(picture_source_id) except Exception: return 0 if isinstance(ps, ScreenCapturePictureSource): diff --git a/server/src/wled_controller/api/routes/value_sources.py b/server/src/wled_controller/api/routes/value_sources.py index 6b54260..9702a30 100644 --- a/server/src/wled_controller/api/routes/value_sources.py +++ b/server/src/wled_controller/api/routes/value_sources.py @@ -152,7 +152,7 @@ async def update_value_source( raise HTTPException(status_code=400, detail=str(e)) -@router.delete("/api/v1/value-sources/{source_id}", tags=["Value Sources"]) +@router.delete("/api/v1/value-sources/{source_id}", status_code=204, tags=["Value Sources"]) async def delete_value_source( source_id: str, _auth: AuthRequired, @@ -171,7 +171,6 @@ async def delete_value_source( ) store.delete_source(source_id) - return {"status": "deleted", "id": source_id} except ValueError as e: raise HTTPException(status_code=400, detail=str(e)) diff --git a/server/src/wled_controller/core/audio/audio_capture.py b/server/src/wled_controller/core/audio/audio_capture.py index 92d347e..fb278b5 100644 --- a/server/src/wled_controller/core/audio/audio_capture.py +++ b/server/src/wled_controller/core/audio/audio_capture.py @@ -222,6 +222,7 @@ class 
AudioCaptureManager: return key = (engine_type, device_index, is_loopback) + stream_to_stop = None with self._lock: if key not in self._streams: logger.warning(f"Attempted to release unknown audio capture: {key}") @@ -230,23 +231,28 @@ class AudioCaptureManager: stream, ref_count = self._streams[key] ref_count -= 1 if ref_count <= 0: - stream.stop() + stream_to_stop = stream del self._streams[key] logger.info(f"Removed audio capture {key}") else: self._streams[key] = (stream, ref_count) logger.debug(f"Released audio capture {key} (ref_count={ref_count})") + # Stop outside the lock — stream.stop() joins a thread (up to 5s) + if stream_to_stop is not None: + stream_to_stop.stop() def release_all(self) -> None: """Stop and remove all capture streams. Called on shutdown.""" with self._lock: - for key, (stream, _) in list(self._streams.items()): - try: - stream.stop() - except Exception as e: - logger.error(f"Error stopping audio capture {key}: {e}") + streams_to_stop = list(self._streams.items()) self._streams.clear() - logger.info("Released all audio capture streams") + # Stop outside the lock — each stop() joins a thread + for key, (stream, _) in streams_to_stop: + try: + stream.stop() + except Exception as e: + logger.error(f"Error stopping audio capture {key}: {e}") + logger.info("Released all audio capture streams") @staticmethod def enumerate_devices() -> List[dict]: diff --git a/server/src/wled_controller/core/devices/ddp_client.py b/server/src/wled_controller/core/devices/ddp_client.py index 535bc7c..58e71bd 100644 --- a/server/src/wled_controller/core/devices/ddp_client.py +++ b/server/src/wled_controller/core/devices/ddp_client.py @@ -193,12 +193,13 @@ class DDPClient: try: # Send plain RGB — WLED handles per-bus color order conversion # internally when outputting to hardware. 
+ # Convert to numpy to avoid per-pixel Python loop bpp = 4 if self.rgbw else 3 # bytes per pixel - pixel_bytes = bytearray() - for r, g, b in pixels: - pixel_bytes.extend((int(r), int(g), int(b))) - if self.rgbw: - pixel_bytes.append(0) # White channel = 0 + pixel_array = np.array(pixels, dtype=np.uint8) + if self.rgbw: + white = np.zeros((pixel_array.shape[0], 1), dtype=np.uint8) + pixel_array = np.hstack((pixel_array, white)) + pixel_bytes = pixel_array.tobytes() total_bytes = len(pixel_bytes) # Align payload to full pixels (multiple of bpp) to avoid splitting diff --git a/server/src/wled_controller/core/filters/pixelate.py b/server/src/wled_controller/core/filters/pixelate.py index 3859530..5668c24 100644 --- a/server/src/wled_controller/core/filters/pixelate.py +++ b/server/src/wled_controller/core/filters/pixelate.py @@ -2,6 +2,7 @@ from typing import Any, Dict, List, Optional +import cv2 import numpy as np from wled_controller.core.filters.base import FilterOptionDef, PostprocessingFilter @@ -37,12 +38,12 @@ class PixelateFilter(PostprocessingFilter): h, w = image.shape[:2] - for y in range(0, h, block_size): - for x in range(0, w, block_size): - y_end = min(y + block_size, h) - x_end = min(x + block_size, w) - block = image[y:y_end, x:x_end] - mean_color = block.mean(axis=(0, 1)).astype(np.uint8) - image[y:y_end, x:x_end] = mean_color + # Resize down (area averaging) then up (nearest neighbor) — + # vectorized C++ instead of per-block Python loop + small_w = max(1, w // block_size) + small_h = max(1, h // block_size) + small = cv2.resize(image, (small_w, small_h), interpolation=cv2.INTER_AREA) + pixelated = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST) + np.copyto(image, pixelated) return None diff --git a/server/src/wled_controller/core/processing/audio_stream.py b/server/src/wled_controller/core/processing/audio_stream.py index 2ff0fb8..08f8d69 100644 --- a/server/src/wled_controller/core/processing/audio_stream.py +++ 
b/server/src/wled_controller/core/processing/audio_stream.py @@ -229,63 +229,71 @@ class AudioColorStripStream(ColorStripStream): "vu_meter": self._render_vu_meter, } - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() - frame_time = 1.0 / self._fps - n = self._led_count + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + frame_time = 1.0 / self._fps + try: + n = self._led_count - # Rebuild scratch buffers and pre-computed arrays when LED count changes - if n != _pool_n: - _pool_n = n - _buf_a = np.zeros((n, 3), dtype=np.uint8) - _buf_b = np.zeros((n, 3), dtype=np.uint8) - _band_x = np.arange(NUM_BANDS, dtype=np.float32) - half = (n + 1) // 2 - _led_x_mirror = np.linspace(0, NUM_BANDS - 1, half) - _led_x = np.linspace(0, NUM_BANDS - 1, n) - _full_amp = np.empty(n, dtype=np.float32) - _vu_gradient = np.linspace(0, 1, n, dtype=np.float32) - _indices_buf = np.empty(n, dtype=np.int32) - self._prev_spectrum = None # reset smoothing on resize + # Rebuild scratch buffers and pre-computed arrays when LED count changes + if n != _pool_n: + _pool_n = n + _buf_a = np.zeros((n, 3), dtype=np.uint8) + _buf_b = np.zeros((n, 3), dtype=np.uint8) + _band_x = np.arange(NUM_BANDS, dtype=np.float32) + half = (n + 1) // 2 + _led_x_mirror = np.linspace(0, NUM_BANDS - 1, half) + _led_x = np.linspace(0, NUM_BANDS - 1, n) + _full_amp = np.empty(n, dtype=np.float32) + _vu_gradient = np.linspace(0, 1, n, dtype=np.float32) + _indices_buf = np.empty(n, dtype=np.int32) + self._prev_spectrum = None # reset smoothing on resize - # Make pre-computed arrays available to render methods - self._band_x = _band_x - self._led_x = _led_x - self._led_x_mirror = _led_x_mirror - self._full_amp = _full_amp - self._vu_gradient = _vu_gradient - self._indices_buf = _indices_buf + # Make pre-computed arrays available to render methods + self._band_x = _band_x + self._led_x = _led_x + self._led_x_mirror = _led_x_mirror + 
self._full_amp = _full_amp + self._vu_gradient = _vu_gradient + self._indices_buf = _indices_buf - buf = _buf_a if _use_a else _buf_b - _use_a = not _use_a + buf = _buf_a if _use_a else _buf_b + _use_a = not _use_a - # Get latest audio analysis - analysis = None - if self._audio_stream is not None: - analysis = self._audio_stream.get_latest_analysis() + # Get latest audio analysis + analysis = None + if self._audio_stream is not None: + analysis = self._audio_stream.get_latest_analysis() - render_fn = renderers.get(self._visualization_mode, self._render_spectrum) - t_render = time.perf_counter() - render_fn(buf, n, analysis) - render_ms = (time.perf_counter() - t_render) * 1000 + render_fn = renderers.get(self._visualization_mode, self._render_spectrum) + t_render = time.perf_counter() + render_fn(buf, n, analysis) + render_ms = (time.perf_counter() - t_render) * 1000 - with self._colors_lock: - self._colors = buf + with self._colors_lock: + self._colors = buf - # Pull capture-side timing and combine with render timing - capture_timing = self._audio_stream.get_last_timing() if self._audio_stream else {} - read_ms = capture_timing.get("read_ms", 0) - fft_ms = capture_timing.get("fft_ms", 0) - self._last_timing = { - "audio_read_ms": read_ms, - "audio_fft_ms": fft_ms, - "audio_render_ms": render_ms, - "total_ms": read_ms + fft_ms + render_ms, - } + # Pull capture-side timing and combine with render timing + capture_timing = self._audio_stream.get_last_timing() if self._audio_stream else {} + read_ms = capture_timing.get("read_ms", 0) + fft_ms = capture_timing.get("fft_ms", 0) + self._last_timing = { + "audio_read_ms": read_ms, + "audio_fft_ms": fft_ms, + "audio_render_ms": render_ms, + "total_ms": read_ms + fft_ms + render_ms, + } + except Exception as e: + logger.error(f"AudioColorStripStream render error: {e}") - elapsed = time.perf_counter() - loop_start - time.sleep(max(frame_time - elapsed, 0.001)) + elapsed = time.perf_counter() - loop_start + 
time.sleep(max(frame_time - elapsed, 0.001)) + except Exception as e: + logger.error(f"Fatal AudioColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False # ── Channel selection ───────────────────────────────────────── diff --git a/server/src/wled_controller/core/processing/color_strip_stream.py b/server/src/wled_controller/core/processing/color_strip_stream.py index c8dc021..7bca596 100644 --- a/server/src/wled_controller/core/processing/color_strip_stream.py +++ b/server/src/wled_controller/core/processing/color_strip_stream.py @@ -334,145 +334,150 @@ class PictureColorStripStream(ColorStripStream): led_colors = frame_buf return led_colors - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() - fps = self._fps - frame_time = 1.0 / fps if fps > 0 else 1.0 + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + fps = self._fps + frame_time = 1.0 / fps if fps > 0 else 1.0 - try: - frame = self._live_stream.get_latest_frame() + try: + frame = self._live_stream.get_latest_frame() - if frame is None or frame is cached_frame: + if frame is None or frame is cached_frame: + if ( + frame is not None + and self._frame_interpolation + and self._interp_from is not None + and self._interp_to is not None + and _u16_a is not None + ): + # Interpolate between previous and current capture + t = min(1.0, (loop_start - self._interp_start) / self._interp_duration) + frame_buf = _frame_a if _use_a else _frame_b + _use_a = not _use_a + _blend_u16(self._interp_from, self._interp_to, int(t * 256), frame_buf) + led_colors = _apply_corrections(frame_buf, frame_buf) + with self._colors_lock: + self._latest_colors = led_colors + elapsed = time.perf_counter() - loop_start + time.sleep(max(frame_time - elapsed, 0.001)) + continue + + interval = ( + loop_start - self._last_capture_time + if self._last_capture_time > 0 + else frame_time + ) + self._last_capture_time = loop_start + 
cached_frame = frame + + t0 = time.perf_counter() + + calibration = self._calibration + border_pixels = extract_border_pixels(frame, calibration.border_width) + t1 = time.perf_counter() + + led_colors = self._pixel_mapper.map_border_to_leds(border_pixels) + t2 = time.perf_counter() + + # Ensure scratch pool is sized for this frame + target_count = self._led_count + _n = target_count if target_count > 0 else len(led_colors) + if _n > 0 and _n != _pool_n: + _pool_n = _n + _frame_a = np.empty((_n, 3), dtype=np.uint8) + _frame_b = np.empty((_n, 3), dtype=np.uint8) + _u16_a = np.empty((_n, 3), dtype=np.uint16) + _u16_b = np.empty((_n, 3), dtype=np.uint16) + _i32 = np.empty((_n, 3), dtype=np.int32) + _i32_gray = np.empty((_n, 1), dtype=np.int32) + self._previous_colors = None + + # Copy/pad into double-buffered frame (avoids per-frame allocations) + frame_buf = _frame_a if _use_a else _frame_b + _use_a = not _use_a + n_leds = len(led_colors) + if _pool_n > 0: + if n_leds < _pool_n: + frame_buf[:n_leds] = led_colors + frame_buf[n_leds:] = 0 + elif n_leds > _pool_n: + frame_buf[:] = led_colors[:_pool_n] + else: + frame_buf[:] = led_colors + led_colors = frame_buf + + # Temporal smoothing (pre-allocated uint16 scratch) + smoothing = self._smoothing if ( - frame is not None - and self._frame_interpolation - and self._interp_from is not None - and self._interp_to is not None + self._previous_colors is not None + and smoothing > 0 + and len(self._previous_colors) == len(led_colors) and _u16_a is not None ): - # Interpolate between previous and current capture - t = min(1.0, (loop_start - self._interp_start) / self._interp_duration) - frame_buf = _frame_a if _use_a else _frame_b - _use_a = not _use_a - _blend_u16(self._interp_from, self._interp_to, int(t * 256), frame_buf) - led_colors = _apply_corrections(frame_buf, frame_buf) - with self._colors_lock: - self._latest_colors = led_colors - elapsed = time.perf_counter() - loop_start - time.sleep(max(frame_time - elapsed, 0.001)) 
- continue + _blend_u16(led_colors, self._previous_colors, + int(smoothing * 256), led_colors) + t3 = time.perf_counter() - interval = ( - loop_start - self._last_capture_time - if self._last_capture_time > 0 - else frame_time - ) - self._last_capture_time = loop_start - cached_frame = frame + # Update interpolation buffers (smoothed colors, before corrections) + # Must be AFTER smoothing so idle-tick interpolation produces + # output consistent with new-frame ticks (both smoothed). + if self._frame_interpolation: + self._interp_from = self._interp_to + self._interp_to = led_colors.copy() + self._interp_start = loop_start + self._interp_duration = max(interval, 0.001) - t0 = time.perf_counter() + # Saturation (pre-allocated int32 scratch) + saturation = self._saturation + if saturation != 1.0: + _apply_saturation(led_colors, saturation, _i32, _i32_gray, led_colors) + t4 = time.perf_counter() - calibration = self._calibration - border_pixels = extract_border_pixels(frame, calibration.border_width) - t1 = time.perf_counter() + # Gamma (LUT lookup — O(1) per pixel) + if self._gamma != 1.0: + led_colors = self._gamma_lut[led_colors] + t5 = time.perf_counter() - led_colors = self._pixel_mapper.map_border_to_leds(border_pixels) - t2 = time.perf_counter() + # Brightness (integer math with pre-allocated int32 scratch) + brightness = self._brightness + if brightness != 1.0: + bright_int = int(brightness * 256) + np.copyto(_i32, led_colors, casting='unsafe') + _i32 *= bright_int + _i32 >>= 8 + np.clip(_i32, 0, 255, out=_i32) + np.copyto(frame_buf, _i32, casting='unsafe') + led_colors = frame_buf + t6 = time.perf_counter() - # Ensure scratch pool is sized for this frame - target_count = self._led_count - _n = target_count if target_count > 0 else len(led_colors) - if _n > 0 and _n != _pool_n: - _pool_n = _n - _frame_a = np.empty((_n, 3), dtype=np.uint8) - _frame_b = np.empty((_n, 3), dtype=np.uint8) - _u16_a = np.empty((_n, 3), dtype=np.uint16) - _u16_b = np.empty((_n, 3), 
dtype=np.uint16) - _i32 = np.empty((_n, 3), dtype=np.int32) - _i32_gray = np.empty((_n, 1), dtype=np.int32) - self._previous_colors = None + self._previous_colors = led_colors - # Copy/pad into double-buffered frame (avoids per-frame allocations) - frame_buf = _frame_a if _use_a else _frame_b - _use_a = not _use_a - n_leds = len(led_colors) - if _pool_n > 0: - if n_leds < _pool_n: - frame_buf[:n_leds] = led_colors - frame_buf[n_leds:] = 0 - elif n_leds > _pool_n: - frame_buf[:] = led_colors[:_pool_n] - else: - frame_buf[:] = led_colors - led_colors = frame_buf + with self._colors_lock: + self._latest_colors = led_colors - # Temporal smoothing (pre-allocated uint16 scratch) - smoothing = self._smoothing - if ( - self._previous_colors is not None - and smoothing > 0 - and len(self._previous_colors) == len(led_colors) - and _u16_a is not None - ): - _blend_u16(led_colors, self._previous_colors, - int(smoothing * 256), led_colors) - t3 = time.perf_counter() + self._last_timing = { + "extract_ms": (t1 - t0) * 1000, + "map_leds_ms": (t2 - t1) * 1000, + "smooth_ms": (t3 - t2) * 1000, + "saturation_ms": (t4 - t3) * 1000, + "gamma_ms": (t5 - t4) * 1000, + "brightness_ms": (t6 - t5) * 1000, + "total_ms": (t6 - t0) * 1000, + } - # Update interpolation buffers (smoothed colors, before corrections) - # Must be AFTER smoothing so idle-tick interpolation produces - # output consistent with new-frame ticks (both smoothed). 
- if self._frame_interpolation: - self._interp_from = self._interp_to - self._interp_to = led_colors.copy() - self._interp_start = loop_start - self._interp_duration = max(interval, 0.001) + except Exception as e: + logger.error(f"PictureColorStripStream processing error: {e}", exc_info=True) - # Saturation (pre-allocated int32 scratch) - saturation = self._saturation - if saturation != 1.0: - _apply_saturation(led_colors, saturation, _i32, _i32_gray, led_colors) - t4 = time.perf_counter() - - # Gamma (LUT lookup — O(1) per pixel) - if self._gamma != 1.0: - led_colors = self._gamma_lut[led_colors] - t5 = time.perf_counter() - - # Brightness (integer math with pre-allocated int32 scratch) - brightness = self._brightness - if brightness != 1.0: - bright_int = int(brightness * 256) - np.copyto(_i32, led_colors, casting='unsafe') - _i32 *= bright_int - _i32 >>= 8 - np.clip(_i32, 0, 255, out=_i32) - np.copyto(frame_buf, _i32, casting='unsafe') - led_colors = frame_buf - t6 = time.perf_counter() - - self._previous_colors = led_colors - - with self._colors_lock: - self._latest_colors = led_colors - - self._last_timing = { - "extract_ms": (t1 - t0) * 1000, - "map_leds_ms": (t2 - t1) * 1000, - "smooth_ms": (t3 - t2) * 1000, - "saturation_ms": (t4 - t3) * 1000, - "gamma_ms": (t5 - t4) * 1000, - "brightness_ms": (t6 - t5) * 1000, - "total_ms": (t6 - t0) * 1000, - } - - except Exception as e: - logger.error(f"PictureColorStripStream processing error: {e}", exc_info=True) - - elapsed = time.perf_counter() - loop_start - remaining = frame_time - elapsed - if remaining > 0: - time.sleep(remaining) + elapsed = time.perf_counter() - loop_start + remaining = frame_time - elapsed + if remaining > 0: + time.sleep(remaining) + except Exception as e: + logger.error(f"Fatal PictureColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False def _compute_gradient_colors(stops: list, led_count: int) -> np.ndarray: @@ -506,30 +511,42 @@ def 
_compute_gradient_colors(stops: list, led_count: int) -> np.ndarray: c = stop.get("color", [255, 255, 255]) return np.array(c if isinstance(c, list) and len(c) == 3 else [255, 255, 255], dtype=np.float32) + # Vectorized: compute all LED positions at once + positions = np.linspace(0, 1, led_count) if led_count > 1 else np.array([0.0]) result = np.zeros((led_count, 3), dtype=np.float32) - for i in range(led_count): - p = i / (led_count - 1) if led_count > 1 else 0.0 + # Extract stop positions and colors into arrays + n_stops = len(sorted_stops) + stop_positions = np.array([float(s.get("position", 0)) for s in sorted_stops], dtype=np.float32) - if p <= float(sorted_stops[0].get("position", 0)): - result[i] = _color(sorted_stops[0], "left") - continue + # Pre-compute left/right colors for each stop + left_colors = np.array([_color(s, "left") for s in sorted_stops], dtype=np.float32) + right_colors = np.array([_color(s, "right") for s in sorted_stops], dtype=np.float32) - last = sorted_stops[-1] - if p >= float(last.get("position", 1)): - result[i] = _color(last, "right") - continue + # LEDs before first stop + mask_before = positions <= stop_positions[0] + result[mask_before] = left_colors[0] - for j in range(len(sorted_stops) - 1): - a = sorted_stops[j] - b = sorted_stops[j + 1] - a_pos = float(a.get("position", 0)) - b_pos = float(b.get("position", 1)) - if a_pos <= p <= b_pos: - span = b_pos - a_pos - t = (p - a_pos) / span if span > 0 else 0.0 - result[i] = _color(a, "right") + t * (_color(b, "left") - _color(a, "right")) - break + # LEDs after last stop + mask_after = positions >= stop_positions[-1] + result[mask_after] = right_colors[-1] + + # LEDs between stops — vectorized per segment + mask_between = ~mask_before & ~mask_after + if np.any(mask_between): + between_pos = positions[mask_between] + # np.searchsorted finds the right stop index for each LED + idx = np.searchsorted(stop_positions, between_pos, side="right") - 1 + idx = np.clip(idx, 0, n_stops - 2) + 
+ a_pos = stop_positions[idx] + b_pos = stop_positions[idx + 1] + span = b_pos - a_pos + t = np.where(span > 0, (between_pos - a_pos) / span, 0.0) + + a_colors = right_colors[idx] # A's right color + b_colors = left_colors[idx + 1] # B's left color + result[mask_between] = a_colors + t[:, np.newaxis] * (b_colors - a_colors) return np.clip(result, 0, 255).astype(np.uint8) @@ -646,90 +663,98 @@ class StaticColorStripStream(ColorStripStream): _buf_a = _buf_b = None _use_a = True - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() - frame_time = 1.0 / self._fps - anim = self._animation - if anim and anim.get("enabled"): - speed = float(anim.get("speed", 1.0)) - atype = anim.get("type", "breathing") - t = loop_start - n = self._led_count + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + frame_time = 1.0 / self._fps + try: + anim = self._animation + if anim and anim.get("enabled"): + speed = float(anim.get("speed", 1.0)) + atype = anim.get("type", "breathing") + t = loop_start + n = self._led_count - if n != _pool_n: - _pool_n = n - _buf_a = np.empty((n, 3), dtype=np.uint8) - _buf_b = np.empty((n, 3), dtype=np.uint8) + if n != _pool_n: + _pool_n = n + _buf_a = np.empty((n, 3), dtype=np.uint8) + _buf_b = np.empty((n, 3), dtype=np.uint8) - buf = _buf_a if _use_a else _buf_b - _use_a = not _use_a - colors = None + buf = _buf_a if _use_a else _buf_b + _use_a = not _use_a + colors = None - if atype == "breathing": - factor = 0.5 * (1 + math.sin(2 * math.pi * speed * t * 0.5)) - r, g, b = self._source_color - buf[:] = (min(255, int(r * factor)), min(255, int(g * factor)), min(255, int(b * factor))) - colors = buf + if atype == "breathing": + factor = 0.5 * (1 + math.sin(2 * math.pi * speed * t * 0.5)) + r, g, b = self._source_color + buf[:] = (min(255, int(r * factor)), min(255, int(g * factor)), min(255, int(b * factor))) + colors = buf - elif atype == "strobe": - # Square wave: on for half 
the period, off for the other half. - # speed=1.0 → 2 flashes/sec (one full on/off cycle per 0.5s) - if math.sin(2 * math.pi * speed * t * 2.0) >= 0: - buf[:] = self._source_color - else: - buf[:] = 0 - colors = buf + elif atype == "strobe": + # Square wave: on for half the period, off for the other half. + # speed=1.0 → 2 flashes/sec (one full on/off cycle per 0.5s) + if math.sin(2 * math.pi * speed * t * 2.0) >= 0: + buf[:] = self._source_color + else: + buf[:] = 0 + colors = buf - elif atype == "sparkle": - # Random LEDs flash white while the rest stay the base color - buf[:] = self._source_color - density = min(0.5, 0.1 * speed) - mask = np.random.random(n) < density - buf[mask] = (255, 255, 255) - colors = buf + elif atype == "sparkle": + # Random LEDs flash white while the rest stay the base color + buf[:] = self._source_color + density = min(0.5, 0.1 * speed) + mask = np.random.random(n) < density + buf[mask] = (255, 255, 255) + colors = buf - elif atype == "pulse": - # Sharp attack, slow exponential decay — heartbeat-like - # speed=1.0 → ~1 pulse per second - phase = (speed * t * 1.0) % 1.0 - if phase < 0.1: - factor = phase / 0.1 - else: - factor = math.exp(-5.0 * (phase - 0.1)) - r, g, b = self._source_color - buf[:] = (min(255, int(r * factor)), min(255, int(g * factor)), min(255, int(b * factor))) - colors = buf + elif atype == "pulse": + # Sharp attack, slow exponential decay — heartbeat-like + # speed=1.0 → ~1 pulse per second + phase = (speed * t * 1.0) % 1.0 + if phase < 0.1: + factor = phase / 0.1 + else: + factor = math.exp(-5.0 * (phase - 0.1)) + r, g, b = self._source_color + buf[:] = (min(255, int(r * factor)), min(255, int(g * factor)), min(255, int(b * factor))) + colors = buf - elif atype == "candle": - # Random brightness fluctuations simulating a candle flame - base_factor = 0.75 - flicker = 0.25 * math.sin(2 * math.pi * speed * t * 3.7) - flicker += 0.15 * math.sin(2 * math.pi * speed * t * 7.3) - flicker += 0.10 * (np.random.random() - 
0.5) - factor = max(0.2, min(1.0, base_factor + flicker)) - r, g, b = self._source_color - buf[:] = (min(255, int(r * factor)), min(255, int(g * factor)), min(255, int(b * factor))) - colors = buf + elif atype == "candle": + # Random brightness fluctuations simulating a candle flame + base_factor = 0.75 + flicker = 0.25 * math.sin(2 * math.pi * speed * t * 3.7) + flicker += 0.15 * math.sin(2 * math.pi * speed * t * 7.3) + flicker += 0.10 * (np.random.random() - 0.5) + factor = max(0.2, min(1.0, base_factor + flicker)) + r, g, b = self._source_color + buf[:] = (min(255, int(r * factor)), min(255, int(g * factor)), min(255, int(b * factor))) + colors = buf - elif atype == "rainbow_fade": - # Shift hue continuously from the base color - r, g, b = self._source_color - h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0) - # speed=1.0 → one full hue rotation every ~10s - h_shift = (speed * t * 0.1) % 1.0 - new_h = (h + h_shift) % 1.0 - nr, ng, nb = colorsys.hsv_to_rgb(new_h, max(s, 0.5), max(v, 0.3)) - buf[:] = (int(nr * 255), int(ng * 255), int(nb * 255)) - colors = buf + elif atype == "rainbow_fade": + # Shift hue continuously from the base color + r, g, b = self._source_color + h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0) + # speed=1.0 → one full hue rotation every ~10s + h_shift = (speed * t * 0.1) % 1.0 + new_h = (h + h_shift) % 1.0 + nr, ng, nb = colorsys.hsv_to_rgb(new_h, max(s, 0.5), max(v, 0.3)) + buf[:] = (int(nr * 255), int(ng * 255), int(nb * 255)) + colors = buf - if colors is not None: - with self._colors_lock: - self._colors = colors + if colors is not None: + with self._colors_lock: + self._colors = colors + except Exception as e: + logger.error(f"StaticColorStripStream animation error: {e}") - elapsed = time.perf_counter() - loop_start - sleep_target = frame_time if anim and anim.get("enabled") else 0.25 - time.sleep(max(sleep_target - elapsed, 0.001)) + elapsed = time.perf_counter() - loop_start + sleep_target = frame_time 
if anim and anim.get("enabled") else 0.25 + time.sleep(max(sleep_target - elapsed, 0.001)) + except Exception as e: + logger.error(f"Fatal StaticColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False class ColorCycleColorStripStream(ColorStripStream): @@ -834,39 +859,47 @@ class ColorCycleColorStripStream(ColorStripStream): _buf_a = _buf_b = None _use_a = True - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() - frame_time = 1.0 / self._fps - color_list = self._color_list - speed = self._cycle_speed - n = self._led_count - num = len(color_list) - if num >= 2: - if n != _pool_n: - _pool_n = n - _buf_a = np.empty((n, 3), dtype=np.uint8) - _buf_b = np.empty((n, 3), dtype=np.uint8) + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + frame_time = 1.0 / self._fps + try: + color_list = self._color_list + speed = self._cycle_speed + n = self._led_count + num = len(color_list) + if num >= 2: + if n != _pool_n: + _pool_n = n + _buf_a = np.empty((n, 3), dtype=np.uint8) + _buf_b = np.empty((n, 3), dtype=np.uint8) - buf = _buf_a if _use_a else _buf_b - _use_a = not _use_a + buf = _buf_a if _use_a else _buf_b + _use_a = not _use_a - # 0.05 factor → one full cycle every 20s at speed=1.0 - cycle_pos = (speed * loop_start * 0.05) % 1.0 - seg = cycle_pos * num - idx = int(seg) % num - t_i = seg - int(seg) - c1 = color_list[idx] - c2 = color_list[(idx + 1) % num] - buf[:] = ( - min(255, int(c1[0] + (c2[0] - c1[0]) * t_i)), - min(255, int(c1[1] + (c2[1] - c1[1]) * t_i)), - min(255, int(c1[2] + (c2[2] - c1[2]) * t_i)), - ) - with self._colors_lock: - self._colors = buf - elapsed = time.perf_counter() - loop_start - time.sleep(max(frame_time - elapsed, 0.001)) + # 0.05 factor → one full cycle every 20s at speed=1.0 + cycle_pos = (speed * loop_start * 0.05) % 1.0 + seg = cycle_pos * num + idx = int(seg) % num + t_i = seg - int(seg) + c1 = color_list[idx] + c2 = 
color_list[(idx + 1) % num] + buf[:] = ( + min(255, int(c1[0] + (c2[0] - c1[0]) * t_i)), + min(255, int(c1[1] + (c2[1] - c1[1]) * t_i)), + min(255, int(c1[2] + (c2[2] - c1[2]) * t_i)), + ) + with self._colors_lock: + self._colors = buf + except Exception as e: + logger.error(f"ColorCycleColorStripStream animation error: {e}") + elapsed = time.perf_counter() - loop_start + time.sleep(max(frame_time - elapsed, 0.001)) + except Exception as e: + logger.error(f"Fatal ColorCycleColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False class GradientColorStripStream(ColorStripStream): @@ -986,130 +1019,138 @@ class GradientColorStripStream(ColorStripStream): _wave_factors = None # float32 scratch for wave sin result _wave_u16 = None # uint16 scratch for wave int factors - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() - frame_time = 1.0 / self._fps - anim = self._animation - if anim and anim.get("enabled"): - speed = float(anim.get("speed", 1.0)) - atype = anim.get("type", "breathing") - t = loop_start - n = self._led_count - stops = self._stops - colors = None + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + frame_time = 1.0 / self._fps + try: + anim = self._animation + if anim and anim.get("enabled"): + speed = float(anim.get("speed", 1.0)) + atype = anim.get("type", "breathing") + t = loop_start + n = self._led_count + stops = self._stops + colors = None - # Recompute base gradient only when stops or led_count change - if _cached_base is None or _cached_n != n or _cached_stops is not stops: - _cached_base = _compute_gradient_colors(stops, n) - _cached_n = n - _cached_stops = stops - base = _cached_base + # Recompute base gradient only when stops or led_count change + if _cached_base is None or _cached_n != n or _cached_stops is not stops: + _cached_base = _compute_gradient_colors(stops, n) + _cached_n = n + _cached_stops = stops + base = _cached_base - 
# Re-allocate pool only when LED count changes - if n != _pool_n: - _pool_n = n - _buf_a = np.empty((n, 3), dtype=np.uint8) - _buf_b = np.empty((n, 3), dtype=np.uint8) - _scratch_u16 = np.empty((n, 3), dtype=np.uint16) - _wave_i = np.arange(n, dtype=np.float32) - _wave_factors = np.empty(n, dtype=np.float32) - _wave_u16 = np.empty(n, dtype=np.uint16) + # Re-allocate pool only when LED count changes + if n != _pool_n: + _pool_n = n + _buf_a = np.empty((n, 3), dtype=np.uint8) + _buf_b = np.empty((n, 3), dtype=np.uint8) + _scratch_u16 = np.empty((n, 3), dtype=np.uint16) + _wave_i = np.arange(n, dtype=np.float32) + _wave_factors = np.empty(n, dtype=np.float32) + _wave_u16 = np.empty(n, dtype=np.uint16) - buf = _buf_a if _use_a else _buf_b - _use_a = not _use_a + buf = _buf_a if _use_a else _buf_b + _use_a = not _use_a - if atype == "breathing": - int_f = max(0, min(256, int(0.5 * (1 + math.sin(2 * math.pi * speed * t * 0.5)) * 256))) - np.copyto(_scratch_u16, base) - _scratch_u16 *= int_f - _scratch_u16 >>= 8 - np.copyto(buf, _scratch_u16, casting='unsafe') - colors = buf + if atype == "breathing": + int_f = max(0, min(256, int(0.5 * (1 + math.sin(2 * math.pi * speed * t * 0.5)) * 256))) + np.copyto(_scratch_u16, base) + _scratch_u16 *= int_f + _scratch_u16 >>= 8 + np.copyto(buf, _scratch_u16, casting='unsafe') + colors = buf - elif atype == "gradient_shift": - shift = int(speed * t * 10) % max(n, 1) - if shift > 0: - buf[:n - shift] = base[shift:] - buf[n - shift:] = base[:shift] - else: - np.copyto(buf, base) - colors = buf + elif atype == "gradient_shift": + shift = int(speed * t * 10) % max(n, 1) + if shift > 0: + buf[:n - shift] = base[shift:] + buf[n - shift:] = base[:shift] + else: + np.copyto(buf, base) + colors = buf - elif atype == "wave": - if n > 1: - np.sin( - 2 * math.pi * _wave_i / n - 2 * math.pi * speed * t * 0.25, - out=_wave_factors, - ) - _wave_factors *= 0.5 - _wave_factors += 0.5 - np.multiply(_wave_factors, 256, out=_wave_factors) - 
np.clip(_wave_factors, 0, 256, out=_wave_factors) - np.copyto(_wave_u16, _wave_factors, casting='unsafe') - np.copyto(_scratch_u16, base) - _scratch_u16 *= _wave_u16[:, None] - _scratch_u16 >>= 8 - np.copyto(buf, _scratch_u16, casting='unsafe') - colors = buf - else: - np.copyto(buf, base) - colors = buf + elif atype == "wave": + if n > 1: + np.sin( + 2 * math.pi * _wave_i / n - 2 * math.pi * speed * t * 0.25, + out=_wave_factors, + ) + _wave_factors *= 0.5 + _wave_factors += 0.5 + np.multiply(_wave_factors, 256, out=_wave_factors) + np.clip(_wave_factors, 0, 256, out=_wave_factors) + np.copyto(_wave_u16, _wave_factors, casting='unsafe') + np.copyto(_scratch_u16, base) + _scratch_u16 *= _wave_u16[:, None] + _scratch_u16 >>= 8 + np.copyto(buf, _scratch_u16, casting='unsafe') + colors = buf + else: + np.copyto(buf, base) + colors = buf - elif atype == "strobe": - if math.sin(2 * math.pi * speed * t * 2.0) >= 0: - np.copyto(buf, base) - else: - buf[:] = 0 - colors = buf + elif atype == "strobe": + if math.sin(2 * math.pi * speed * t * 2.0) >= 0: + np.copyto(buf, base) + else: + buf[:] = 0 + colors = buf - elif atype == "sparkle": - np.copyto(buf, base) - density = min(0.5, 0.1 * speed) - mask = np.random.random(n) < density - buf[mask] = (255, 255, 255) - colors = buf + elif atype == "sparkle": + np.copyto(buf, base) + density = min(0.5, 0.1 * speed) + mask = np.random.random(n) < density + buf[mask] = (255, 255, 255) + colors = buf - elif atype == "pulse": - phase = (speed * t * 1.0) % 1.0 - if phase < 0.1: - factor = phase / 0.1 - else: - factor = math.exp(-5.0 * (phase - 0.1)) - int_f = max(0, min(256, int(factor * 256))) - np.copyto(_scratch_u16, base) - _scratch_u16 *= int_f - _scratch_u16 >>= 8 - np.copyto(buf, _scratch_u16, casting='unsafe') - colors = buf + elif atype == "pulse": + phase = (speed * t * 1.0) % 1.0 + if phase < 0.1: + factor = phase / 0.1 + else: + factor = math.exp(-5.0 * (phase - 0.1)) + int_f = max(0, min(256, int(factor * 256))) + 
np.copyto(_scratch_u16, base) + _scratch_u16 *= int_f + _scratch_u16 >>= 8 + np.copyto(buf, _scratch_u16, casting='unsafe') + colors = buf - elif atype == "candle": - base_factor = 0.75 - flicker = 0.25 * math.sin(2 * math.pi * speed * t * 3.7) - flicker += 0.15 * math.sin(2 * math.pi * speed * t * 7.3) - flicker += 0.10 * (np.random.random() - 0.5) - factor = max(0.2, min(1.0, base_factor + flicker)) - int_f = int(factor * 256) - np.copyto(_scratch_u16, base) - _scratch_u16 *= int_f - _scratch_u16 >>= 8 - np.copyto(buf, _scratch_u16, casting='unsafe') - colors = buf + elif atype == "candle": + base_factor = 0.75 + flicker = 0.25 * math.sin(2 * math.pi * speed * t * 3.7) + flicker += 0.15 * math.sin(2 * math.pi * speed * t * 7.3) + flicker += 0.10 * (np.random.random() - 0.5) + factor = max(0.2, min(1.0, base_factor + flicker)) + int_f = int(factor * 256) + np.copyto(_scratch_u16, base) + _scratch_u16 *= int_f + _scratch_u16 >>= 8 + np.copyto(buf, _scratch_u16, casting='unsafe') + colors = buf - elif atype == "rainbow_fade": - h_shift = (speed * t * 0.1) % 1.0 - for i in range(n): - r, g, b = base[i] - h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0) - new_h = (h + h_shift) % 1.0 - nr, ng, nb = colorsys.hsv_to_rgb(new_h, max(s, 0.5), max(v, 0.3)) - buf[i] = (int(nr * 255), int(ng * 255), int(nb * 255)) - colors = buf + elif atype == "rainbow_fade": + h_shift = (speed * t * 0.1) % 1.0 + for i in range(n): + r, g, b = base[i] + h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0) + new_h = (h + h_shift) % 1.0 + nr, ng, nb = colorsys.hsv_to_rgb(new_h, max(s, 0.5), max(v, 0.3)) + buf[i] = (int(nr * 255), int(ng * 255), int(nb * 255)) + colors = buf - if colors is not None: - with self._colors_lock: - self._colors = colors + if colors is not None: + with self._colors_lock: + self._colors = colors + except Exception as e: + logger.error(f"GradientColorStripStream animation error: {e}") - elapsed = time.perf_counter() - loop_start - sleep_target = 
frame_time if anim and anim.get("enabled") else 0.25 - time.sleep(max(sleep_target - elapsed, 0.001)) + elapsed = time.perf_counter() - loop_start + sleep_target = frame_time if anim and anim.get("enabled") else 0.25 + time.sleep(max(sleep_target - elapsed, 0.001)) + except Exception as e: + logger.error(f"Fatal GradientColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False diff --git a/server/src/wled_controller/core/processing/composite_stream.py b/server/src/wled_controller/core/processing/composite_stream.py index 95e8d42..9205925 100644 --- a/server/src/wled_controller/core/processing/composite_stream.py +++ b/server/src/wled_controller/core/processing/composite_stream.py @@ -253,61 +253,66 @@ class CompositeColorStripStream(ColorStripStream): # ── Processing loop ───────────────────────────────────────── def _processing_loop(self) -> None: - while self._running: - loop_start = time.perf_counter() - frame_time = 1.0 / self._fps + try: + while self._running: + loop_start = time.perf_counter() + frame_time = 1.0 / self._fps - try: - target_n = self._led_count - if target_n <= 0: - time.sleep(frame_time) - continue - - self._ensure_pool(target_n) - - result_buf = self._result_a if self._use_a else self._result_b - self._use_a = not self._use_a - has_result = False - - for i, layer in enumerate(self._layers): - if not layer.get("enabled", True): - continue - if i not in self._sub_streams: + try: + target_n = self._led_count + if target_n <= 0: + time.sleep(frame_time) continue - _src_id, _consumer_id, stream = self._sub_streams[i] - colors = stream.get_latest_colors() - if colors is None: - continue + self._ensure_pool(target_n) - # Resize to target LED count if needed - if len(colors) != target_n: - colors = self._resize_to_target(colors, target_n) + result_buf = self._result_a if self._use_a else self._result_b + self._use_a = not self._use_a + has_result = False - opacity = layer.get("opacity", 1.0) - blend_mode = 
layer.get("blend_mode", _BLEND_NORMAL) - alpha = int(opacity * 256) - alpha = max(0, min(256, alpha)) + for i, layer in enumerate(self._layers): + if not layer.get("enabled", True): + continue + if i not in self._sub_streams: + continue - if not has_result: - # First layer: copy directly (or blend with black if opacity < 1) - if alpha >= 256 and blend_mode == _BLEND_NORMAL: - result_buf[:] = colors + _src_id, _consumer_id, stream = self._sub_streams[i] + colors = stream.get_latest_colors() + if colors is None: + continue + + # Resize to target LED count if needed + if len(colors) != target_n: + colors = self._resize_to_target(colors, target_n) + + opacity = layer.get("opacity", 1.0) + blend_mode = layer.get("blend_mode", _BLEND_NORMAL) + alpha = int(opacity * 256) + alpha = max(0, min(256, alpha)) + + if not has_result: + # First layer: copy directly (or blend with black if opacity < 1) + if alpha >= 256 and blend_mode == _BLEND_NORMAL: + result_buf[:] = colors + else: + result_buf[:] = 0 + blend_fn = getattr(self, self._BLEND_DISPATCH.get(blend_mode, "_blend_normal")) + blend_fn(result_buf, colors, alpha, result_buf) + has_result = True else: - result_buf[:] = 0 blend_fn = getattr(self, self._BLEND_DISPATCH.get(blend_mode, "_blend_normal")) blend_fn(result_buf, colors, alpha, result_buf) - has_result = True - else: - blend_fn = getattr(self, self._BLEND_DISPATCH.get(blend_mode, "_blend_normal")) - blend_fn(result_buf, colors, alpha, result_buf) - if has_result: - with self._colors_lock: - self._latest_colors = result_buf + if has_result: + with self._colors_lock: + self._latest_colors = result_buf - except Exception as e: - logger.error(f"CompositeColorStripStream processing error: {e}", exc_info=True) + except Exception as e: + logger.error(f"CompositeColorStripStream processing error: {e}", exc_info=True) - elapsed = time.perf_counter() - loop_start - time.sleep(max(frame_time - elapsed, 0.001)) + elapsed = time.perf_counter() - loop_start + 
time.sleep(max(frame_time - elapsed, 0.001)) + except Exception as e: + logger.error(f"Fatal CompositeColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False diff --git a/server/src/wled_controller/core/processing/effect_stream.py b/server/src/wled_controller/core/processing/effect_stream.py index d4e4316..859c73e 100644 --- a/server/src/wled_controller/core/processing/effect_stream.py +++ b/server/src/wled_controller/core/processing/effect_stream.py @@ -284,38 +284,45 @@ class EffectColorStripStream(ColorStripStream): "aurora": self._render_aurora, } - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() - frame_time = 1.0 / self._fps + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + frame_time = 1.0 / self._fps + try: + n = self._led_count + if n != _pool_n: + _pool_n = n + _buf_a = np.empty((n, 3), dtype=np.uint8) + _buf_b = np.empty((n, 3), dtype=np.uint8) + # Scratch arrays for render methods + self._s_f32_a = np.empty(n, dtype=np.float32) + self._s_f32_b = np.empty(n, dtype=np.float32) + self._s_f32_c = np.empty(n, dtype=np.float32) + self._s_i32 = np.empty(n, dtype=np.int32) + self._s_f32_rgb = np.empty((n, 3), dtype=np.float32) + self._s_arange = np.arange(n, dtype=np.float32) + self._s_layer1 = np.empty(n, dtype=np.float32) + self._s_layer2 = np.empty(n, dtype=np.float32) + self._plasma_key = (0, 0.0) - n = self._led_count - if n != _pool_n: - _pool_n = n - _buf_a = np.empty((n, 3), dtype=np.uint8) - _buf_b = np.empty((n, 3), dtype=np.uint8) - # Scratch arrays for render methods - self._s_f32_a = np.empty(n, dtype=np.float32) - self._s_f32_b = np.empty(n, dtype=np.float32) - self._s_f32_c = np.empty(n, dtype=np.float32) - self._s_i32 = np.empty(n, dtype=np.int32) - self._s_f32_rgb = np.empty((n, 3), dtype=np.float32) - self._s_arange = np.arange(n, dtype=np.float32) - self._s_layer1 = np.empty(n, dtype=np.float32) - self._s_layer2 = np.empty(n, 
dtype=np.float32) - self._plasma_key = (0, 0.0) + buf = _buf_a if _use_a else _buf_b + _use_a = not _use_a - buf = _buf_a if _use_a else _buf_b - _use_a = not _use_a + render_fn = renderers.get(self._effect_type, self._render_fire) + render_fn(buf, n, loop_start) - render_fn = renderers.get(self._effect_type, self._render_fire) - render_fn(buf, n, loop_start) + with self._colors_lock: + self._colors = buf + except Exception as e: + logger.error(f"EffectColorStripStream render error: {e}") - with self._colors_lock: - self._colors = buf - - elapsed = time.perf_counter() - loop_start - time.sleep(max(frame_time - elapsed, 0.001)) + elapsed = time.perf_counter() - loop_start + time.sleep(max(frame_time - elapsed, 0.001)) + except Exception as e: + logger.error(f"Fatal EffectColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False # ── Fire ───────────────────────────────────────────────────────── diff --git a/server/src/wled_controller/core/processing/live_stream.py b/server/src/wled_controller/core/processing/live_stream.py index 4b8be27..28b789e 100644 --- a/server/src/wled_controller/core/processing/live_stream.py +++ b/server/src/wled_controller/core/processing/live_stream.py @@ -129,25 +129,41 @@ class ScreenCaptureLiveStream(LiveStream): def _capture_loop(self) -> None: frame_time = 1.0 / self._fps if self._fps > 0 else 1.0 - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() - try: - frame = self._capture_stream.capture_frame() - if frame is not None: - with self._frame_lock: - self._latest_frame = frame - else: - # Small sleep when no frame available to avoid CPU spinning - time.sleep(0.001) - except Exception as e: - logger.error(f"Capture error (display={self._capture_stream.display_index}): {e}") + consecutive_errors = 0 + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + try: + frame = self._capture_stream.capture_frame() + if frame is not None: + 
with self._frame_lock: + self._latest_frame = frame + consecutive_errors = 0 + else: + # Small sleep when no frame available to avoid CPU spinning + time.sleep(0.001) + except Exception as e: + consecutive_errors += 1 + logger.error(f"Capture error (display={self._capture_stream.display_index}): {e}") + # Backoff on repeated errors to avoid CPU spinning + if consecutive_errors > 5: + backoff = min(1.0, 0.1 * (consecutive_errors - 5)) + time.sleep(backoff) + continue - # Throttle to target FPS - elapsed = time.perf_counter() - loop_start - remaining = frame_time - elapsed - if remaining > 0: - time.sleep(remaining) + # Throttle to target FPS + elapsed = time.perf_counter() - loop_start + remaining = frame_time - elapsed + if remaining > 0: + time.sleep(remaining) + except Exception as e: + logger.error( + f"Fatal capture loop error (display={self._capture_stream.display_index}): {e}", + exc_info=True, + ) + finally: + self._running = False class ProcessedLiveStream(LiveStream): @@ -226,79 +242,84 @@ class ProcessedLiveStream(LiveStream): fps = self.target_fps frame_time = 1.0 / fps if fps > 0 else 1.0 - with high_resolution_timer(): - while self._running: - loop_start = time.perf_counter() + try: + with high_resolution_timer(): + while self._running: + loop_start = time.perf_counter() + try: + source_frame = self._source.get_latest_frame() + if source_frame is None or source_frame is cached_source_frame: + # Idle tick — run filter chain when any filter requests idle processing + if self._has_idle_filters and cached_source_frame is not None: + src = cached_source_frame.image + h, w, c = src.shape + if _idle_src_buf is None or _idle_src_buf.shape != (h, w, c): + _idle_src_buf = np.empty((h, w, c), dtype=np.uint8) + np.copyto(_idle_src_buf, src) + idle_image = _idle_src_buf - source_frame = self._source.get_latest_frame() - if source_frame is None or source_frame is cached_source_frame: - # Idle tick — run filter chain when any filter requests idle processing - if 
self._has_idle_filters and cached_source_frame is not None: - src = cached_source_frame.image + for f in self._filters: + result = f.process_image(idle_image, self._image_pool) + if result is not None: + if idle_image is not _idle_src_buf: + self._image_pool.release(idle_image) + idle_image = result + + # Only publish a new frame when the filter chain produced actual + # interpolated output (idle_image advanced past the input buffer). + if idle_image is not _idle_src_buf: + processed = ScreenCapture( + image=idle_image, + width=idle_image.shape[1], + height=idle_image.shape[0], + display_index=cached_source_frame.display_index, + ) + with self._frame_lock: + self._latest_frame = processed + + elapsed = time.perf_counter() - loop_start + remaining = frame_time - elapsed + time.sleep(max(remaining, 0.001)) + continue + + cached_source_frame = source_frame + + # Reuse ring buffer slot instead of allocating a new copy each frame + src = source_frame.image h, w, c = src.shape - if _idle_src_buf is None or _idle_src_buf.shape != (h, w, c): - _idle_src_buf = np.empty((h, w, c), dtype=np.uint8) - np.copyto(_idle_src_buf, src) - idle_image = _idle_src_buf + buf = _ring[_ring_idx] + if buf is None or buf.shape != (h, w, c): + buf = np.empty((h, w, c), dtype=np.uint8) + _ring[_ring_idx] = buf + _ring_idx = (_ring_idx + 1) % 3 + + np.copyto(buf, src) + image = buf for f in self._filters: - result = f.process_image(idle_image, self._image_pool) + result = f.process_image(image, self._image_pool) if result is not None: - if idle_image is not _idle_src_buf: - self._image_pool.release(idle_image) - idle_image = result + # Release intermediate filter output back to pool + # (don't release the ring buffer itself) + if image is not buf: + self._image_pool.release(image) + image = result - # Only publish a new frame when the filter chain produced actual - # interpolated output (idle_image advanced past the input buffer). 
- # If every filter passed through, idle_image is still _idle_src_buf — - # leave _latest_frame unchanged so consumers that rely on object - # identity for deduplication correctly detect no new content. - if idle_image is not _idle_src_buf: - processed = ScreenCapture( - image=idle_image, - width=idle_image.shape[1], - height=idle_image.shape[0], - display_index=cached_source_frame.display_index, - ) - with self._frame_lock: - self._latest_frame = processed - - elapsed = time.perf_counter() - loop_start - remaining = frame_time - elapsed - time.sleep(max(remaining, 0.001)) - continue - - cached_source_frame = source_frame - - # Reuse ring buffer slot instead of allocating a new copy each frame - src = source_frame.image - h, w, c = src.shape - buf = _ring[_ring_idx] - if buf is None or buf.shape != (h, w, c): - buf = np.empty((h, w, c), dtype=np.uint8) - _ring[_ring_idx] = buf - _ring_idx = (_ring_idx + 1) % 3 - - np.copyto(buf, src) - image = buf - - for f in self._filters: - result = f.process_image(image, self._image_pool) - if result is not None: - # Release intermediate filter output back to pool - # (don't release the ring buffer itself) - if image is not buf: - self._image_pool.release(image) - image = result - - processed = ScreenCapture( - image=image, - width=image.shape[1], - height=image.shape[0], - display_index=source_frame.display_index, - ) - with self._frame_lock: - self._latest_frame = processed + processed = ScreenCapture( + image=image, + width=image.shape[1], + height=image.shape[0], + display_index=source_frame.display_index, + ) + with self._frame_lock: + self._latest_frame = processed + except Exception as e: + logger.error(f"Filter processing error: {e}") + time.sleep(0.01) + except Exception as e: + logger.error(f"Fatal processing loop error: {e}", exc_info=True) + finally: + self._running = False class StaticImageLiveStream(LiveStream): diff --git a/server/src/wled_controller/core/processing/mapped_stream.py 
b/server/src/wled_controller/core/processing/mapped_stream.py index 13d73da..4c4935f 100644 --- a/server/src/wled_controller/core/processing/mapped_stream.py +++ b/server/src/wled_controller/core/processing/mapped_stream.py @@ -152,61 +152,66 @@ class MappedColorStripStream(ColorStripStream): # ── Processing loop ───────────────────────────────────────── def _processing_loop(self) -> None: - while self._running: - loop_start = time.perf_counter() - frame_time = 1.0 / self._fps + try: + while self._running: + loop_start = time.perf_counter() + frame_time = 1.0 / self._fps - try: - target_n = self._led_count - if target_n <= 0: - time.sleep(frame_time) - continue - - result = np.zeros((target_n, 3), dtype=np.uint8) - - for i, zone in enumerate(self._zones): - if i not in self._sub_streams: + try: + target_n = self._led_count + if target_n <= 0: + time.sleep(frame_time) continue - _src_id, _consumer_id, stream = self._sub_streams[i] - colors = stream.get_latest_colors() - if colors is None: - continue + result = np.zeros((target_n, 3), dtype=np.uint8) - start = zone.get("start", 0) - end = zone.get("end", 0) - if end <= 0: - end = target_n - start = max(0, min(start, target_n)) - end = max(start, min(end, target_n)) - zone_len = end - start + for i, zone in enumerate(self._zones): + if i not in self._sub_streams: + continue - if zone_len <= 0: - continue + _src_id, _consumer_id, stream = self._sub_streams[i] + colors = stream.get_latest_colors() + if colors is None: + continue - # Resize sub-stream output to zone length if needed - if len(colors) != zone_len: - src_x = np.linspace(0, 1, len(colors)) - dst_x = np.linspace(0, 1, zone_len) - resized = np.empty((zone_len, 3), dtype=np.uint8) - for ch in range(3): - np.copyto( - resized[:, ch], - np.interp(dst_x, src_x, colors[:, ch]), - casting="unsafe", - ) - colors = resized + start = zone.get("start", 0) + end = zone.get("end", 0) + if end <= 0: + end = target_n + start = max(0, min(start, target_n)) + end = max(start, 
min(end, target_n)) + zone_len = end - start - if zone.get("reverse", False): - colors = colors[::-1] + if zone_len <= 0: + continue - result[start:end] = colors + # Resize sub-stream output to zone length if needed + if len(colors) != zone_len: + src_x = np.linspace(0, 1, len(colors)) + dst_x = np.linspace(0, 1, zone_len) + resized = np.empty((zone_len, 3), dtype=np.uint8) + for ch in range(3): + np.copyto( + resized[:, ch], + np.interp(dst_x, src_x, colors[:, ch]), + casting="unsafe", + ) + colors = resized - with self._colors_lock: - self._latest_colors = result + if zone.get("reverse", False): + colors = colors[::-1] - except Exception as e: - logger.error(f"MappedColorStripStream processing error: {e}", exc_info=True) + result[start:end] = colors - elapsed = time.perf_counter() - loop_start - time.sleep(max(frame_time - elapsed, 0.001)) + with self._colors_lock: + self._latest_colors = result + + except Exception as e: + logger.error(f"MappedColorStripStream processing error: {e}", exc_info=True) + + elapsed = time.perf_counter() - loop_start + time.sleep(max(frame_time - elapsed, 0.001)) + except Exception as e: + logger.error(f"Fatal MappedColorStripStream loop error: {e}", exc_info=True) + finally: + self._running = False diff --git a/server/src/wled_controller/static/js/core/api.js b/server/src/wled_controller/static/js/core/api.js index 00df93b..83cae9f 100644 --- a/server/src/wled_controller/static/js/core/api.js +++ b/server/src/wled_controller/static/js/core/api.js @@ -36,7 +36,7 @@ export async function fetchWithAuth(url, options = {}) { for (let attempt = 0; attempt < maxAttempts; attempt++) { const controller = new AbortController(); if (fetchOpts.signal) { - fetchOpts.signal.addEventListener('abort', () => controller.abort()); + fetchOpts.signal.addEventListener('abort', () => controller.abort(), { once: true }); } const timer = setTimeout(() => controller.abort(), timeout); try { diff --git a/server/src/wled_controller/static/js/core/ui.js 
b/server/src/wled_controller/static/js/core/ui.js
index d6dda58..81431a7 100644
--- a/server/src/wled_controller/static/js/core/ui.js
+++ b/server/src/wled_controller/static/js/core/ui.js
@@ -56,17 +56,26 @@ export function setupBackdropClose(modal, closeFn) {
     modal._backdropCloseSetup = true;
 }
 
+let _lockCount = 0;
+let _savedScrollY = 0;
+
 export function lockBody() {
-    const scrollY = window.scrollY;
-    document.body.style.top = `-${scrollY}px`;
-    document.body.classList.add('modal-open');
+    if (_lockCount === 0) {
+        _savedScrollY = window.scrollY;
+        document.body.style.top = `-${_savedScrollY}px`;
+        document.body.classList.add('modal-open');
+    }
+    _lockCount++;
 }
 
 export function unlockBody() {
-    const scrollY = parseInt(document.body.style.top || '0', 10) * -1;
-    document.body.classList.remove('modal-open');
-    document.body.style.top = '';
-    window.scrollTo(0, scrollY);
+    if (_lockCount <= 0) return;
+    _lockCount--;
+    if (_lockCount === 0) {
+        document.body.classList.remove('modal-open');
+        document.body.style.top = '';
+        window.scrollTo(0, _savedScrollY);
+    }
 }
 
 export function openLightbox(imageSrc, statsHtml) {
diff --git a/server/src/wled_controller/static/js/features/targets.js b/server/src/wled_controller/static/js/features/targets.js
index e52f000..ade8aab 100644
--- a/server/src/wled_controller/static/js/features/targets.js
+++ b/server/src/wled_controller/static/js/features/targets.js
@@ -1102,6 +1102,7 @@ function connectLedPreviewWS(targetId) {
 function disconnectLedPreviewWS(targetId) {
     const ws = ledPreviewWebSockets[targetId];
     if (ws) {
+        ws.onclose = null;
         ws.close();
         delete ledPreviewWebSockets[targetId];
     }
diff --git a/server/src/wled_controller/storage/audio_source_store.py b/server/src/wled_controller/storage/audio_source_store.py
index 433cf38..a97ce13 100644
--- a/server/src/wled_controller/storage/audio_source_store.py
+++ b/server/src/wled_controller/storage/audio_source_store.py
@@ -11,7 +11,7 @@ from wled_controller.storage.audio_source import (
     MonoAudioSource,
     MultichannelAudioSource,
 )
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -57,21 +57,14 @@ class AudioSourceStore:
     def _save(self) -> None:
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            sources_dict = {
-                sid: source.to_dict()
-                for sid, source in self._sources.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "audio_sources": sources_dict,
+                "audio_sources": {
+                    sid: source.to_dict()
+                    for sid, source in self._sources.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save audio sources to {self.file_path}: {e}")
             raise
 
diff --git a/server/src/wled_controller/storage/audio_template_store.py b/server/src/wled_controller/storage/audio_template_store.py
index c10845d..c2ab52f 100644
--- a/server/src/wled_controller/storage/audio_template_store.py
+++ b/server/src/wled_controller/storage/audio_template_store.py
@@ -8,7 +8,7 @@ from typing import Dict, List, Optional
 
 from wled_controller.core.audio.factory import AudioEngineRegistry
 from wled_controller.storage.audio_template import AudioCaptureTemplate
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -93,21 +93,14 @@ class AudioTemplateStore:
     def _save(self) -> None:
         """Save all templates to file."""
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            templates_dict = {
-                template_id: template.to_dict()
-                for template_id, template in self._templates.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "templates": templates_dict,
+                "templates": {
+                    template_id: template.to_dict()
+                    for template_id, template in self._templates.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save audio templates to {self.file_path}: {e}")
             raise
@@ -168,6 +161,9 @@
         template = self._templates[template_id]
 
         if name is not None:
+            for tid, t in self._templates.items():
+                if tid != template_id and t.name == name:
+                    raise ValueError(f"Audio template with name '{name}' already exists")
             template.name = name
         if engine_type is not None:
             template.engine_type = engine_type
diff --git a/server/src/wled_controller/storage/color_strip_store.py b/server/src/wled_controller/storage/color_strip_store.py
index c4dde61..eefe0e7 100644
--- a/server/src/wled_controller/storage/color_strip_store.py
+++ b/server/src/wled_controller/storage/color_strip_store.py
@@ -19,7 +19,7 @@ from wled_controller.storage.color_strip_source import (
     PictureColorStripSource,
     StaticColorStripSource,
 )
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -62,21 +62,14 @@ class ColorStripStore:
     def _save(self) -> None:
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            sources_dict = {
-                sid: source.to_dict()
-                for sid, source in self._sources.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "color_strip_sources": sources_dict,
+                "color_strip_sources": {
+                    sid: source.to_dict()
+                    for sid, source in self._sources.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save color strip sources to {self.file_path}: {e}")
             raise
 
diff --git a/server/src/wled_controller/storage/pattern_template_store.py b/server/src/wled_controller/storage/pattern_template_store.py
index 58ab993..5e44232 100644
--- a/server/src/wled_controller/storage/pattern_template_store.py
+++ b/server/src/wled_controller/storage/pattern_template_store.py
@@ -8,7 +8,7 @@ from typing import Dict, List, Optional
 
 from wled_controller.storage.key_colors_picture_target import KeyColorRectangle
 from wled_controller.storage.pattern_template import PatternTemplate
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -88,21 +88,14 @@ class PatternTemplateStore:
     def _save(self) -> None:
         """Save all templates to file."""
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            templates_dict = {
-                template_id: template.to_dict()
-                for template_id, template in self._templates.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "pattern_templates": templates_dict,
+                "pattern_templates": {
+                    template_id: template.to_dict()
+                    for template_id, template in self._templates.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save pattern templates to {self.file_path}: {e}")
             raise
@@ -180,6 +173,9 @@
         template = self._templates[template_id]
 
         if name is not None:
+            for tid, t in self._templates.items():
+                if tid != template_id and t.name == name:
+                    raise ValueError(f"Pattern template with name '{name}' already exists")
             template.name = name
         if rectangles is not None:
             template.rectangles = rectangles
diff --git a/server/src/wled_controller/storage/picture_source_store.py b/server/src/wled_controller/storage/picture_source_store.py
index a3fbc32..a4c03a6 100644
--- a/server/src/wled_controller/storage/picture_source_store.py
+++ b/server/src/wled_controller/storage/picture_source_store.py
@@ -12,7 +12,7 @@ from wled_controller.storage.picture_source import (
     ProcessedPictureSource,
     StaticImagePictureSource,
 )
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -68,21 +68,14 @@ class PictureSourceStore:
     def _save(self) -> None:
         """Save all streams to file."""
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            streams_dict = {
-                stream_id: stream.to_dict()
-                for stream_id, stream in self._streams.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "picture_sources": streams_dict,
+                "picture_sources": {
+                    stream_id: stream.to_dict()
+                    for stream_id, stream in self._streams.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save picture sources to {self.file_path}: {e}")
             raise
 
diff --git a/server/src/wled_controller/storage/picture_target_store.py b/server/src/wled_controller/storage/picture_target_store.py
index 740c1f4..80038d0 100644
--- a/server/src/wled_controller/storage/picture_target_store.py
+++ b/server/src/wled_controller/storage/picture_target_store.py
@@ -12,7 +12,7 @@ from wled_controller.storage.key_colors_picture_target import (
     KeyColorsSettings,
     KeyColorsPictureTarget,
 )
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -63,21 +63,14 @@ class PictureTargetStore:
     def _save(self) -> None:
         """Save all targets to file."""
        try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            targets_dict = {
-                target_id: target.to_dict()
-                for target_id, target in self._targets.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "picture_targets": targets_dict,
+                "picture_targets": {
+                    target_id: target.to_dict()
+                    for target_id, target in self._targets.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save picture targets to {self.file_path}: {e}")
             raise
 
diff --git a/server/src/wled_controller/storage/postprocessing_template_store.py b/server/src/wled_controller/storage/postprocessing_template_store.py
index 0299eb9..ff440c1 100644
--- a/server/src/wled_controller/storage/postprocessing_template_store.py
+++ b/server/src/wled_controller/storage/postprocessing_template_store.py
@@ -10,7 +10,7 @@ from wled_controller.core.filters.filter_instance import FilterInstance
 from wled_controller.core.filters.registry import FilterRegistry
 from wled_controller.storage.picture_source import ProcessedPictureSource
 from wled_controller.storage.postprocessing_template import PostprocessingTemplate
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -92,21 +92,14 @@ class PostprocessingTemplateStore:
     def _save(self) -> None:
         """Save all templates to file."""
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            templates_dict = {
-                template_id: template.to_dict()
-                for template_id, template in self._templates.items()
-            }
-
             data = {
                 "version": "2.0.0",
-                "postprocessing_templates": templates_dict,
+                "postprocessing_templates": {
+                    template_id: template.to_dict()
+                    for template_id, template in self._templates.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save postprocessing templates to {self.file_path}: {e}")
             raise
@@ -189,6 +182,9 @@
         template = self._templates[template_id]
 
         if name is not None:
+            for tid, t in self._templates.items():
+                if tid != template_id and t.name == name:
+                    raise ValueError(f"Postprocessing template with name '{name}' already exists")
             template.name = name
         if filters is not None:
             # Validate filter IDs
diff --git a/server/src/wled_controller/storage/profile_store.py b/server/src/wled_controller/storage/profile_store.py
index f167bdf..5596924 100644
--- a/server/src/wled_controller/storage/profile_store.py
+++ b/server/src/wled_controller/storage/profile_store.py
@@ -7,7 +7,7 @@ from pathlib import Path
 from typing import Dict, List, Optional
 
 from wled_controller.storage.profile import Condition, Profile
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -49,18 +49,13 @@ class ProfileStore:
     def _save(self) -> None:
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
             data = {
                 "version": "1.0.0",
                 "profiles": {
                     pid: p.to_dict() for pid, p in self._profiles.items()
                 },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save profiles to {self.file_path}: {e}")
             raise
 
@@ -81,6 +76,10 @@
         conditions: Optional[List[Condition]] = None,
         target_ids: Optional[List[str]] = None,
     ) -> Profile:
+        for p in self._profiles.values():
+            if p.name == name:
+                raise ValueError(f"Profile with name '{name}' already exists")
+
         profile_id = f"prof_{uuid.uuid4().hex[:8]}"
         now = datetime.utcnow()
 
@@ -116,6 +115,9 @@
         profile = self._profiles[profile_id]
 
         if name is not None:
+            for pid, p in self._profiles.items():
+                if pid != profile_id and p.name == name:
+                    raise ValueError(f"Profile with name '{name}' already exists")
             profile.name = name
         if enabled is not None:
             profile.enabled = enabled
diff --git a/server/src/wled_controller/storage/template_store.py b/server/src/wled_controller/storage/template_store.py
index 054ea0f..2b64f6b 100644
--- a/server/src/wled_controller/storage/template_store.py
+++ b/server/src/wled_controller/storage/template_store.py
@@ -8,7 +8,7 @@ from typing import Dict, List, Optional
 
 from wled_controller.core.capture_engines.factory import EngineRegistry
 from wled_controller.storage.template import CaptureTemplate
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -95,23 +95,14 @@ class TemplateStore:
     def _save(self) -> None:
         """Save all templates to file."""
         try:
-            # Ensure directory exists
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            templates_dict = {
-                template_id: template.to_dict()
-                for template_id, template in self._templates.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "templates": templates_dict,
+                "templates": {
+                    template_id: template.to_dict()
+                    for template_id, template in self._templates.items()
+                },
             }
-
-            # Write to file
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save templates to {self.file_path}: {e}")
             raise
@@ -218,6 +209,9 @@
         # Update fields
         if name is not None:
+            for tid, t in self._templates.items():
+                if tid != template_id and t.name == name:
+                    raise ValueError(f"Template with name '{name}' already exists")
             template.name = name
         if engine_type is not None:
             template.engine_type = engine_type
diff --git a/server/src/wled_controller/storage/value_source_store.py b/server/src/wled_controller/storage/value_source_store.py
index 302957c..b3dcd44 100644
--- a/server/src/wled_controller/storage/value_source_store.py
+++ b/server/src/wled_controller/storage/value_source_store.py
@@ -13,7 +13,7 @@ from wled_controller.storage.value_source import (
     StaticValueSource,
     ValueSource,
 )
-from wled_controller.utils import get_logger
+from wled_controller.utils import atomic_write_json, get_logger
 
 logger = get_logger(__name__)
 
@@ -59,21 +59,14 @@ class ValueSourceStore:
     def _save(self) -> None:
         try:
-            self.file_path.parent.mkdir(parents=True, exist_ok=True)
-
-            sources_dict = {
-                sid: source.to_dict()
-                for sid, source in self._sources.items()
-            }
-
             data = {
                 "version": "1.0.0",
-                "value_sources": sources_dict,
+                "value_sources": {
+                    sid: source.to_dict()
+                    for sid, source in self._sources.items()
+                },
             }
-
-            with open(self.file_path, "w", encoding="utf-8") as f:
-                json.dump(data, f, indent=2, ensure_ascii=False)
-
+            atomic_write_json(self.file_path, data)
         except Exception as e:
             logger.error(f"Failed to save value sources to {self.file_path}: {e}")
             raise
 
diff --git a/server/src/wled_controller/utils/__init__.py b/server/src/wled_controller/utils/__init__.py
index 4b50806..7569261 100644
--- a/server/src/wled_controller/utils/__init__.py
+++ b/server/src/wled_controller/utils/__init__.py
@@ -1,7 +1,8 @@
 """Utility functions and helpers."""
 
+from .file_ops import atomic_write_json
 from .logger import setup_logging, get_logger
 from .monitor_names import get_monitor_names, get_monitor_name, get_monitor_refresh_rates
 from .timer import high_resolution_timer
 
-__all__ = ["setup_logging", "get_logger", "get_monitor_names", "get_monitor_name", "get_monitor_refresh_rates", "high_resolution_timer"]
+__all__ = ["atomic_write_json", "setup_logging", "get_logger", "get_monitor_names", "get_monitor_name", "get_monitor_refresh_rates", "high_resolution_timer"]
diff --git a/server/src/wled_controller/utils/file_ops.py b/server/src/wled_controller/utils/file_ops.py
new file mode 100644
index 0000000..0965820
--- /dev/null
+++ b/server/src/wled_controller/utils/file_ops.py
@@ -0,0 +1,34 @@
+"""Atomic file write utilities."""
+
+import json
+import os
+import tempfile
+from pathlib import Path
+
+
+def atomic_write_json(file_path: Path, data: dict, indent: int = 2) -> None:
+    """Write JSON data to file atomically via temp file + rename.
+
+    Prevents data corruption if the process crashes or loses power
+    mid-write. The rename operation is atomic on most filesystems.
+    """
+    file_path = Path(file_path)
+    file_path.parent.mkdir(parents=True, exist_ok=True)
+
+    # Write to a temp file in the same directory (same filesystem for atomic rename)
+    fd, tmp_path = tempfile.mkstemp(
+        dir=file_path.parent,
+        prefix=f".{file_path.stem}_",
+        suffix=".tmp",
+    )
+    try:
+        with os.fdopen(fd, "w", encoding="utf-8") as f:
+            json.dump(data, f, indent=indent, ensure_ascii=False)
+        os.replace(tmp_path, file_path)
+    except BaseException:
+        # Clean up temp file on any error
+        try:
+            os.unlink(tmp_path)
+        except OSError:
+            pass
+        raise