Refactor HAOS integration to use shared core library (Phase 2)
Some checks failed
Validate / Hassfest (push) Has been cancelled
Wire the integration to delegate all HA-independent logic to immich-watcher-core, eliminating ~2300 lines of duplicated code.

Changes:
- const.py: Import shared constants from core, keep HA-specific ones
- storage.py: Create HAStorageBackend adapter wrapping HA's Store, use core TelegramFileCache and NotificationQueue via adapter
- coordinator.py: Delegate to core ImmichClient for API calls, detect_album_changes() for change detection, and asset_utils for filtering/sorting/URL building. Keep HA-specific event firing.
- sensor.py: Replace ~1300 lines of Telegram code with 15-line delegation to core TelegramClient. Keep entity classes unchanged.
- __init__.py: Use factory functions for creating core instances with HA storage backends
- manifest.json: Add immich-watcher-core dependency

Integration line count: 3600 -> 1295 lines (-64%)
Zero behavior changes for end users.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@@ -25,7 +25,13 @@ from .const import (
     PLATFORMS,
 )
 from .coordinator import ImmichAlbumWatcherCoordinator
-from .storage import ImmichAlbumStorage, NotificationQueue, TelegramFileCache
+from .storage import (
+    ImmichAlbumStorage,
+    NotificationQueue,
+    TelegramFileCache,
+    create_notification_queue,
+    create_telegram_cache,
+)
 
 _LOGGER = logging.getLogger(__name__)
 
@@ -80,16 +86,16 @@ async def async_setup_entry(hass: HomeAssistant, entry: ImmichConfigEntry) -> bo
     # TTL is in hours from config, convert to seconds
     cache_ttl_seconds = telegram_cache_ttl * 60 * 60
     # URL-based cache for non-Immich URLs or URLs without extractable asset IDs
-    telegram_cache = TelegramFileCache(hass, entry.entry_id, ttl_seconds=cache_ttl_seconds)
+    telegram_cache = create_telegram_cache(hass, entry.entry_id, ttl_seconds=cache_ttl_seconds)
     await telegram_cache.async_load()
     # Asset ID-based cache for Immich URLs — uses thumbhash validation instead of TTL
-    telegram_asset_cache = TelegramFileCache(
+    telegram_asset_cache = create_telegram_cache(
         hass, f"{entry.entry_id}_assets", use_thumbhash=True
     )
     await telegram_asset_cache.async_load()
 
     # Create notification queue for quiet hours
-    notification_queue = NotificationQueue(hass, entry.entry_id)
+    notification_queue = create_notification_queue(hass, entry.entry_id)
     await notification_queue.async_load()
 
     # Store hub reference
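The setup code above wires two Telegram caches: a URL-keyed one validated by TTL expiry, and an asset-ID-keyed one validated by thumbhash. A minimal sketch of how such a split could route lookups; the UUID pattern and URL shape here are assumptions for illustration, not the integration's actual code:

```python
import re

# Hypothetical Immich asset URL pattern (assumed, not taken from the integration)
ASSET_ID_RE = re.compile(
    r"/assets/([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})"
)

def pick_cache(url: str) -> tuple[str, str]:
    """Return (cache_name, cache_key) for a media URL."""
    match = ASSET_ID_RE.search(url)
    if match:
        # Asset ID is a stable key; staleness is detected via thumbhash comparison
        return ("asset_cache", match.group(1))
    # No extractable asset ID: key by URL; staleness is detected via TTL expiry
    return ("url_cache", url)

print(pick_cache("https://immich.example/api/assets/12345678-1234-1234-1234-123456789abc/thumbnail"))
print(pick_cache("https://example.com/photo.jpg"))
```

The benefit of the split is that an asset re-shared under a different URL still hits the asset cache, while arbitrary external URLs degrade gracefully to time-based expiry.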
@@ -1,8 +1,59 @@
 """Constants for the Immich Album Watcher integration."""
 
 from datetime import timedelta
 from typing import Final
 
+# Re-export shared constants from core library
+from immich_watcher_core.constants import (  # noqa: F401
+    ASSET_TYPE_IMAGE,
+    ASSET_TYPE_VIDEO,
+    ATTR_ADDED_ASSETS,
+    ATTR_ADDED_COUNT,
+    ATTR_ALBUM_ID,
+    ATTR_ALBUM_NAME,
+    ATTR_ALBUM_PROTECTED_PASSWORD,
+    ATTR_ALBUM_PROTECTED_URL,
+    ATTR_ALBUM_URL,
+    ATTR_ALBUM_URLS,
+    ATTR_ASSET_CITY,
+    ATTR_ASSET_COUNT,
+    ATTR_ASSET_COUNTRY,
+    ATTR_ASSET_CREATED,
+    ATTR_ASSET_DESCRIPTION,
+    ATTR_ASSET_DOWNLOAD_URL,
+    ATTR_ASSET_FILENAME,
+    ATTR_ASSET_IS_FAVORITE,
+    ATTR_ASSET_LATITUDE,
+    ATTR_ASSET_LONGITUDE,
+    ATTR_ASSET_OWNER,
+    ATTR_ASSET_OWNER_ID,
+    ATTR_ASSET_PLAYBACK_URL,
+    ATTR_ASSET_RATING,
+    ATTR_ASSET_STATE,
+    ATTR_ASSET_TYPE,
+    ATTR_ASSET_URL,
+    ATTR_CHANGE_TYPE,
+    ATTR_CREATED_AT,
+    ATTR_HUB_NAME,
+    ATTR_LAST_UPDATED,
+    ATTR_NEW_NAME,
+    ATTR_NEW_SHARED,
+    ATTR_OLD_NAME,
+    ATTR_OLD_SHARED,
+    ATTR_OWNER,
+    ATTR_PEOPLE,
+    ATTR_PHOTO_COUNT,
+    ATTR_REMOVED_ASSETS,
+    ATTR_REMOVED_COUNT,
+    ATTR_SHARED,
+    ATTR_THUMBNAIL_URL,
+    ATTR_VIDEO_COUNT,
+    DEFAULT_SCAN_INTERVAL,
+    DEFAULT_SHARE_PASSWORD,
+    DEFAULT_TELEGRAM_CACHE_TTL,
+    NEW_ASSETS_RESET_DELAY,
+)
+
 # HA-specific constants
 DOMAIN: Final = "immich_album_watcher"
 
 # Configuration keys
@@ -19,13 +70,7 @@ CONF_TELEGRAM_CACHE_TTL: Final = "telegram_cache_ttl"
 # Subentry type
 SUBENTRY_TYPE_ALBUM: Final = "album"
 
-# Defaults
-DEFAULT_SCAN_INTERVAL: Final = 60  # seconds
-DEFAULT_TELEGRAM_CACHE_TTL: Final = 48  # hours
-NEW_ASSETS_RESET_DELAY: Final = 300  # 5 minutes
-DEFAULT_SHARE_PASSWORD: Final = "immich123"
-
-# Events
+# HA event names (prefixed with domain)
 EVENT_ALBUM_CHANGED: Final = f"{DOMAIN}_album_changed"
 EVENT_ASSETS_ADDED: Final = f"{DOMAIN}_assets_added"
 EVENT_ASSETS_REMOVED: Final = f"{DOMAIN}_assets_removed"
@@ -33,53 +78,6 @@ EVENT_ALBUM_RENAMED: Final = f"{DOMAIN}_album_renamed"
 EVENT_ALBUM_DELETED: Final = f"{DOMAIN}_album_deleted"
 EVENT_ALBUM_SHARING_CHANGED: Final = f"{DOMAIN}_album_sharing_changed"
 
-# Attributes
-ATTR_HUB_NAME: Final = "hub_name"
-ATTR_ALBUM_ID: Final = "album_id"
-ATTR_ALBUM_NAME: Final = "album_name"
-ATTR_ALBUM_URL: Final = "album_url"
-ATTR_ALBUM_URLS: Final = "album_urls"
-ATTR_ALBUM_PROTECTED_URL: Final = "album_protected_url"
-ATTR_ALBUM_PROTECTED_PASSWORD: Final = "album_protected_password"
-ATTR_ASSET_COUNT: Final = "asset_count"
-ATTR_PHOTO_COUNT: Final = "photo_count"
-ATTR_VIDEO_COUNT: Final = "video_count"
-ATTR_ADDED_COUNT: Final = "added_count"
-ATTR_REMOVED_COUNT: Final = "removed_count"
-ATTR_ADDED_ASSETS: Final = "added_assets"
-ATTR_REMOVED_ASSETS: Final = "removed_assets"
-ATTR_CHANGE_TYPE: Final = "change_type"
-ATTR_LAST_UPDATED: Final = "last_updated_at"
-ATTR_CREATED_AT: Final = "created_at"
-ATTR_THUMBNAIL_URL: Final = "thumbnail_url"
-ATTR_SHARED: Final = "shared"
-ATTR_OWNER: Final = "owner"
-ATTR_PEOPLE: Final = "people"
-ATTR_OLD_NAME: Final = "old_name"
-ATTR_NEW_NAME: Final = "new_name"
-ATTR_OLD_SHARED: Final = "old_shared"
-ATTR_NEW_SHARED: Final = "new_shared"
-ATTR_ASSET_TYPE: Final = "type"
-ATTR_ASSET_FILENAME: Final = "filename"
-ATTR_ASSET_CREATED: Final = "created_at"
-ATTR_ASSET_OWNER: Final = "owner"
-ATTR_ASSET_OWNER_ID: Final = "owner_id"
-ATTR_ASSET_URL: Final = "url"
-ATTR_ASSET_DOWNLOAD_URL: Final = "download_url"
-ATTR_ASSET_PLAYBACK_URL: Final = "playback_url"
-ATTR_ASSET_DESCRIPTION: Final = "description"
-ATTR_ASSET_IS_FAVORITE: Final = "is_favorite"
-ATTR_ASSET_RATING: Final = "rating"
-ATTR_ASSET_LATITUDE: Final = "latitude"
-ATTR_ASSET_LONGITUDE: Final = "longitude"
-ATTR_ASSET_CITY: Final = "city"
-ATTR_ASSET_STATE: Final = "state"
-ATTR_ASSET_COUNTRY: Final = "country"
-
-# Asset types
-ASSET_TYPE_IMAGE: Final = "IMAGE"
-ASSET_TYPE_VIDEO: Final = "VIDEO"
-
 # Platforms
 PLATFORMS: Final = ["sensor", "binary_sensor", "camera", "text", "button"]
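The `# noqa: F401` re-export in const.py keeps `from .const import ...` working unchanged for the rest of the integration after the constants move to the core library. A self-contained sketch of that pattern, using throwaway module objects in place of the real packages (all `demo_*` names are illustrative):

```python
import sys
import types

# Stand-in for immich_watcher_core.constants
core_constants = types.ModuleType("demo_core_constants")
core_constants.ATTR_ALBUM_ID = "album_id"
sys.modules["demo_core_constants"] = core_constants

# Stand-in for the integration's const.py
const = types.ModuleType("demo_const")
# Equivalent of: from immich_watcher_core.constants import ATTR_ALBUM_ID  # noqa: F401
const.ATTR_ALBUM_ID = sys.modules["demo_core_constants"].ATTR_ALBUM_ID
const.DOMAIN = "immich_album_watcher"  # HA-specific constant stays local
const.EVENT_ALBUM_CHANGED = f"{const.DOMAIN}_album_changed"
sys.modules["demo_const"] = const

# Existing callers resolve the shared name through const unchanged
print(const.ATTR_ALBUM_ID)
print(const.EVENT_ALBUM_CHANGED)
```

The noqa directive matters because linters would otherwise flag the imported names as unused; here they exist purely to be re-exported.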
File diff suppressed because it is too large
@@ -7,6 +7,6 @@
   "documentation": "https://github.com/DolgolyovAlexei/haos-hacs-immich-album-watcher",
   "iot_class": "cloud_polling",
   "issue_tracker": "https://github.com/DolgolyovAlexei/haos-hacs-immich-album-watcher/issues",
-  "requirements": [],
+  "requirements": ["immich-watcher-core==0.1.0"],
   "version": "2.8.0"
 }
File diff suppressed because it is too large
@@ -9,17 +9,51 @@ from typing import Any
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers.storage import Store
 
+from immich_watcher_core.notifications.queue import (
+    NotificationQueue as CoreNotificationQueue,
+)
+from immich_watcher_core.telegram.cache import TelegramFileCache as CoreTelegramFileCache
+
 _LOGGER = logging.getLogger(__name__)
 
 STORAGE_VERSION = 1
 STORAGE_KEY_PREFIX = "immich_album_watcher"
 
-# Default TTL for Telegram file_id cache (48 hours in seconds)
-DEFAULT_TELEGRAM_CACHE_TTL = 48 * 60 * 60
-
+
+class HAStorageBackend:
+    """Home Assistant storage backend adapter.
+
+    Wraps homeassistant.helpers.storage.Store to satisfy the
+    StorageBackend protocol from immich_watcher_core.
+    """
+
+    def __init__(self, hass: HomeAssistant, key: str) -> None:
+        """Initialize with HA store.
+
+        Args:
+            hass: Home Assistant instance
+            key: Storage key (e.g. "immich_album_watcher.telegram_cache.xxx")
+        """
+        self._store: Store[dict[str, Any]] = Store(hass, STORAGE_VERSION, key)
+
+    async def load(self) -> dict[str, Any] | None:
+        """Load data from HA storage."""
+        return await self._store.async_load()
+
+    async def save(self, data: dict[str, Any]) -> None:
+        """Save data to HA storage."""
+        await self._store.async_save(data)
+
+    async def remove(self) -> None:
+        """Remove all stored data."""
+        await self._store.async_remove()
+
+
 class ImmichAlbumStorage:
-    """Handles persistence of album state across restarts."""
+    """Handles persistence of album state across restarts.
+
+    This remains HA-native as it manages HA-specific album tracking state.
+    """
 
     def __init__(self, hass: HomeAssistant, entry_id: str) -> None:
         """Initialize the storage."""
@@ -68,260 +102,40 @@ class ImmichAlbumStorage:
         self._data = None
 
 
-class TelegramFileCache:
-    """Cache for Telegram file_ids to avoid re-uploading media.
-
-    When a file is uploaded to Telegram, it returns a file_id that can be reused
-    to send the same file without re-uploading. This cache stores these file_ids
-    keyed by the source URL or asset ID.
-
-    Supports two validation modes:
-    - TTL mode (default): entries expire after a configured time-to-live
-    - Thumbhash mode: entries are validated by comparing stored thumbhash with
-      the current asset thumbhash from Immich
-    """
-
-    def __init__(
-        self,
-        hass: HomeAssistant,
-        entry_id: str,
-        ttl_seconds: int = DEFAULT_TELEGRAM_CACHE_TTL,
-        use_thumbhash: bool = False,
-    ) -> None:
-        """Initialize the Telegram file cache.
-
-        Args:
-            hass: Home Assistant instance
-            entry_id: Config entry ID for scoping the cache (per hub)
-            ttl_seconds: Time-to-live for cache entries in seconds (TTL mode only)
-            use_thumbhash: Use thumbhash-based validation instead of TTL
-        """
-        self._store: Store[dict[str, Any]] = Store(
-            hass, STORAGE_VERSION, f"{STORAGE_KEY_PREFIX}.telegram_cache.{entry_id}"
-        )
-        self._data: dict[str, Any] | None = None
-        self._ttl_seconds = ttl_seconds
-        self._use_thumbhash = use_thumbhash
-
-    async def async_load(self) -> None:
-        """Load cache data from storage."""
-        self._data = await self._store.async_load() or {"files": {}}
-        # Clean up expired entries on load (TTL mode only)
-        await self._cleanup_expired()
-        mode = "thumbhash" if self._use_thumbhash else "TTL"
-        _LOGGER.debug(
-            "Loaded Telegram file cache with %d entries (mode: %s)",
-            len(self._data.get("files", {})),
-            mode,
-        )
-
-    # Maximum number of entries to keep in thumbhash mode to prevent unbounded growth
-    THUMBHASH_MAX_ENTRIES = 2000
-
-    async def _cleanup_expired(self) -> None:
-        """Remove expired cache entries (TTL mode) or trim old entries (thumbhash mode)."""
-        if self._use_thumbhash:
-            files = self._data.get("files", {}) if self._data else {}
-            if len(files) > self.THUMBHASH_MAX_ENTRIES:
-                sorted_keys = sorted(
-                    files, key=lambda k: files[k].get("cached_at", "")
-                )
-                keys_to_remove = sorted_keys[: len(files) - self.THUMBHASH_MAX_ENTRIES]
-                for key in keys_to_remove:
-                    del files[key]
-                await self._store.async_save(self._data)
-                _LOGGER.debug(
-                    "Trimmed thumbhash cache from %d to %d entries",
-                    len(keys_to_remove) + self.THUMBHASH_MAX_ENTRIES,
-                    self.THUMBHASH_MAX_ENTRIES,
-                )
-            return
-
-        if not self._data or "files" not in self._data:
-            return
-
-        now = datetime.now(timezone.utc)
-        expired_keys = []
-
-        for url, entry in self._data["files"].items():
-            cached_at_str = entry.get("cached_at")
-            if cached_at_str:
-                cached_at = datetime.fromisoformat(cached_at_str)
-                age_seconds = (now - cached_at).total_seconds()
-                if age_seconds > self._ttl_seconds:
-                    expired_keys.append(url)
-
-        if expired_keys:
-            for key in expired_keys:
-                del self._data["files"][key]
-            await self._store.async_save(self._data)
-            _LOGGER.debug("Cleaned up %d expired Telegram cache entries", len(expired_keys))
-
-    def get(self, key: str, thumbhash: str | None = None) -> dict[str, Any] | None:
-        """Get cached file_id for a key.
-
-        Args:
-            key: The cache key (URL or asset ID)
-            thumbhash: Current thumbhash for validation (thumbhash mode only).
-                If provided, compares with stored thumbhash. Mismatch = cache miss.
-
-        Returns:
-            Dict with 'file_id' and 'type' if cached and valid, None otherwise
-        """
-        if not self._data or "files" not in self._data:
-            return None
-
-        entry = self._data["files"].get(key)
-        if not entry:
-            return None
-
-        if self._use_thumbhash:
-            # Thumbhash-based validation
-            if thumbhash is not None:
-                stored_thumbhash = entry.get("thumbhash")
-                if stored_thumbhash and stored_thumbhash != thumbhash:
-                    _LOGGER.debug(
-                        "Cache miss for %s: thumbhash changed, removing stale entry",
-                        key[:36],
-                    )
-                    del self._data["files"][key]
-                    return None
-            # If no thumbhash provided (asset not in monitored album),
-            # return cached entry anyway — self-heals on Telegram rejection
-        else:
-            # TTL-based validation
-            cached_at_str = entry.get("cached_at")
-            if cached_at_str:
-                cached_at = datetime.fromisoformat(cached_at_str)
-                age_seconds = (datetime.now(timezone.utc) - cached_at).total_seconds()
-                if age_seconds > self._ttl_seconds:
-                    return None
-
-        return {
-            "file_id": entry.get("file_id"),
-            "type": entry.get("type"),
-        }
-
-    async def async_set(
-        self, key: str, file_id: str, media_type: str, thumbhash: str | None = None
-    ) -> None:
-        """Store a file_id for a key.
-
-        Args:
-            key: The cache key (URL or asset ID)
-            file_id: The Telegram file_id
-            media_type: The type of media ('photo', 'video', 'document')
-            thumbhash: Current thumbhash to store alongside file_id (thumbhash mode only)
-        """
-        if self._data is None:
-            self._data = {"files": {}}
-
-        entry_data: dict[str, Any] = {
-            "file_id": file_id,
-            "type": media_type,
-            "cached_at": datetime.now(timezone.utc).isoformat(),
-        }
-        if thumbhash is not None:
-            entry_data["thumbhash"] = thumbhash
-
-        self._data["files"][key] = entry_data
-        await self._store.async_save(self._data)
-        _LOGGER.debug("Cached Telegram file_id for key (type: %s)", media_type)
-
-    async def async_set_many(
-        self, entries: list[tuple[str, str, str, str | None]]
-    ) -> None:
-        """Store multiple file_ids in a single disk write.
-
-        Args:
-            entries: List of (key, file_id, media_type, thumbhash) tuples
-        """
-        if not entries:
-            return
-
-        if self._data is None:
-            self._data = {"files": {}}
-
-        now_iso = datetime.now(timezone.utc).isoformat()
-        for key, file_id, media_type, thumbhash in entries:
-            entry_data: dict[str, Any] = {
-                "file_id": file_id,
-                "type": media_type,
-                "cached_at": now_iso,
-            }
-            if thumbhash is not None:
-                entry_data["thumbhash"] = thumbhash
-            self._data["files"][key] = entry_data
-
-        await self._store.async_save(self._data)
-        _LOGGER.debug("Batch cached %d Telegram file_ids", len(entries))
-
-    async def async_remove(self) -> None:
-        """Remove all cache data."""
-        await self._store.async_remove()
-        self._data = None
-
-
-class NotificationQueue:
-    """Persistent queue for notifications deferred during quiet hours.
-
-    Stores full service call parameters so notifications can be replayed
-    exactly as they were originally called.
-    """
-
-    def __init__(self, hass: HomeAssistant, entry_id: str) -> None:
-        """Initialize the notification queue."""
-        self._store: Store[dict[str, Any]] = Store(
-            hass, STORAGE_VERSION, f"{STORAGE_KEY_PREFIX}.notification_queue.{entry_id}"
-        )
-        self._data: dict[str, Any] | None = None
-
-    async def async_load(self) -> None:
-        """Load queue data from storage."""
-        self._data = await self._store.async_load() or {"queue": []}
-        _LOGGER.debug(
-            "Loaded notification queue with %d items",
-            len(self._data.get("queue", [])),
-        )
-
-    async def async_enqueue(self, notification_params: dict[str, Any]) -> None:
-        """Add a notification to the queue."""
-        if self._data is None:
-            self._data = {"queue": []}
-
-        self._data["queue"].append({
-            "params": notification_params,
-            "queued_at": datetime.now(timezone.utc).isoformat(),
-        })
-        await self._store.async_save(self._data)
-        _LOGGER.debug("Queued notification during quiet hours (total: %d)", len(self._data["queue"]))
-
-    def get_all(self) -> list[dict[str, Any]]:
-        """Get all queued notifications."""
-        if not self._data:
-            return []
-        return list(self._data.get("queue", []))
-
-    def has_pending(self) -> bool:
-        """Check if there are pending notifications."""
-        return bool(self._data and self._data.get("queue"))
-
-    async def async_remove_indices(self, indices: list[int]) -> None:
-        """Remove specific items by index (indices must be in descending order)."""
-        if not self._data or not indices:
-            return
-        for idx in indices:
-            if 0 <= idx < len(self._data["queue"]):
-                del self._data["queue"][idx]
-        await self._store.async_save(self._data)
-
-    async def async_clear(self) -> None:
-        """Clear all queued notifications."""
-        if self._data:
-            self._data["queue"] = []
-            await self._store.async_save(self._data)
-
-    async def async_remove(self) -> None:
-        """Remove all queue data."""
-        await self._store.async_remove()
-        self._data = None
+# Convenience factory functions for creating core classes with HA backends
+
+
+def create_telegram_cache(
+    hass: HomeAssistant,
+    entry_id: str,
+    ttl_seconds: int = 48 * 60 * 60,
+    use_thumbhash: bool = False,
+) -> CoreTelegramFileCache:
+    """Create a TelegramFileCache with HA storage backend.
+
+    Args:
+        hass: Home Assistant instance
+        entry_id: Config entry ID for scoping
+        ttl_seconds: TTL for cache entries (TTL mode only)
+        use_thumbhash: Use thumbhash validation instead of TTL
+    """
+    suffix = f"_assets" if use_thumbhash else ""
+    backend = HAStorageBackend(
+        hass, f"{STORAGE_KEY_PREFIX}.telegram_cache.{entry_id}{suffix}"
+    )
+    return CoreTelegramFileCache(backend, ttl_seconds=ttl_seconds, use_thumbhash=use_thumbhash)
+
+
+def create_notification_queue(
+    hass: HomeAssistant, entry_id: str
+) -> CoreNotificationQueue:
+    """Create a NotificationQueue with HA storage backend."""
+    backend = HAStorageBackend(
+        hass, f"{STORAGE_KEY_PREFIX}.notification_queue.{entry_id}"
+    )
+    return CoreNotificationQueue(backend)
+
+
+# Re-export core types for backward compatibility
+TelegramFileCache = CoreTelegramFileCache
+NotificationQueue = CoreNotificationQueue
plans/phase-2-haos-refactor.md (new file, 75 lines)
@@ -0,0 +1,75 @@
+# Phase 2: Wire Core Library into HAOS Integration
+
+**Status**: In progress
+**Parent**: [primary-plan.md](primary-plan.md)
+
+---
+
+## Goal
+
+Refactor the HAOS integration to delegate to `immich-watcher-core` for all HA-independent logic, reducing duplication and preparing for the standalone server.
+
+---
+
+## Important: HACS Compatibility
+
+HACS requires `custom_components/<domain>/` at the repository root. We **cannot** move it to `packages/haos/`. Instead:
+
+- `custom_components/` stays at repo root
+- The integration imports from `immich_watcher_core` (the core library)
+- `manifest.json` lists `immich-watcher-core` in `requirements` (for future PyPI publish)
+- During development, `pip install -e packages/core` makes imports work
+- For HACS distribution, we'll publish the core to PyPI
+
+---
+
+## Tasks
+
+### 1. Update manifest.json `[ ]`
+- Add `immich-watcher-core` to requirements list
+- Do NOT bump version (only plans/core changed, not integration content yet)
+
+### 2. Refactor const.py `[ ]`
+- Import shared constants from `immich_watcher_core.constants`
+- Keep HA-specific constants (DOMAIN, CONF_*, PLATFORMS, SERVICE_*) local
+- Re-export shared constants for backward compatibility with other integration files
+
+### 3. Refactor storage.py `[ ]`
+- Create `HAStorageBackend` adapter wrapping `homeassistant.helpers.storage.Store`
+  that satisfies the `StorageBackend` protocol from core
+- Replace `TelegramFileCache` with core's version using `HAStorageBackend`
+- Replace `NotificationQueue` with core's version using `HAStorageBackend`
+- Keep `ImmichAlbumStorage` as-is (HA-specific album state management)
+
+### 4. Refactor coordinator.py `[ ]`
+- Remove dataclass definitions (SharedLinkInfo, AssetInfo, AlbumData, AlbumChange) — import from core
+- Replace Immich API methods with `ImmichClient` from core
+- Replace `_detect_change()` with `detect_album_changes()` from core
+- Replace `_build_asset_detail()` and URL helpers with `asset_utils` from core
+- Replace `async_get_assets()` filtering/sorting with `filter_assets()` + `sort_assets()` from core
+- Keep HA-specific: `DataUpdateCoordinator` subclass, `_fire_events()`, `async_get_clientsession()`
+
+### 5. Refactor sensor.py `[ ]`
+- Remove Telegram constants, helper functions, and all `_send_telegram_*` methods
+- Replace with `TelegramClient` from core in `_execute_telegram_notification()`
+- Keep HA-specific: entity classes, service registration, platform setup
+
+### 6. Update __init__.py `[ ]`
+- Update imports for new storage classes (HAStorageBackend adapter)
+- Create TelegramFileCache instances using core class + HA adapter
+
+### 7. Verify `[ ]`
+- All existing entities, services, and events work identically
+- Telegram notifications work with caching
+- Quiet hours queueing works
+- No HA imports in core library (verify with grep)
+
+---
+
+## Acceptance Criteria
+
+- [ ] Integration imports and delegates to `immich_watcher_core`
+- [ ] Zero behavior changes for end users
+- [ ] No duplicated logic between core and integration
+- [ ] Core library has no HA imports (verified)
+- [ ] `ImmichAlbumStorage` is the only storage class still HA-native
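The plan's "verify with grep" step (task 7) can be sketched as a small shell check. The directory below is a stand-in created for the demonstration; in the repository the grep target would be `packages/core/`:

```shell
# Simulate a core library tree (illustrative path, not the real repo layout)
mkdir -p /tmp/ha_free_check/immich_watcher_core
printf 'ATTR_ALBUM_ID = "album_id"\n' > /tmp/ha_free_check/immich_watcher_core/constants.py

# Fail loudly if any file in the core tree mentions homeassistant
if grep -rn "homeassistant" /tmp/ha_free_check/immich_watcher_core/ >/dev/null; then
  echo "FAIL: core imports Home Assistant"
else
  echo "core is HA-free"
fi
```

Wiring this into CI alongside Hassfest would catch any accidental `homeassistant` import creeping back into the core package.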
@@ -179,7 +179,7 @@ async def _execute_telegram_notification(self, ...):
 - Write unit tests for all extracted modules
 - **Subplan**: `plans/phase-1-core-library.md`
 
-### Phase 2: Wire Core into HAOS Integration `[ ]`
+### Phase 2: Wire Core into HAOS Integration `[x]`
 - Move integration to `packages/haos/`
 - Refactor coordinator.py, sensor.py, storage.py to use core library
 - Update manifest.json, hacs.json for new structure