Refactor storage: remove Redis duplication, use proper data tiers
- Recipes: now content-addressed only (cache + IPFS); Redis storage removed
- Runs: completed runs stored in PostgreSQL; Redis holds only the task_id mapping
- Add list_runs_by_actor() to database.py for paginated run queries
- Add list_by_type() to cache_manager for filtering by node_type
- Fix upload endpoint to return size and filename fields
- Fix recipe run endpoint with proper DAG input binding
- Fix get_run_service() dependency to pass the database module

Storage architecture:
- Redis: ephemeral only (sessions, task mappings with TTL)
- PostgreSQL: permanent records (completed runs, metadata)
- Cache: content-addressed files (recipes, media, outputs)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
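The three-tier split described above can be sketched with plain dictionaries. This is a toy model, not code from the repository: `TieredStore` and its method names are illustrative, standing in for the real Redis, PostgreSQL, and cache backends.

```python
import hashlib
import time
from typing import Dict, Optional, Tuple

class TieredStore:
    """Toy model of the three storage tiers (names are illustrative)."""

    def __init__(self) -> None:
        # Redis tier: ephemeral values with an expiry timestamp (TTL semantics).
        self.redis: Dict[str, Tuple[str, float]] = {}
        # PostgreSQL tier: permanent records for completed runs.
        self.postgres: Dict[str, dict] = {}
        # Cache tier: files keyed by the hash of their content.
        self.cache: Dict[str, bytes] = {}

    def map_task(self, task_id: str, run_id: str, ttl: float = 3600.0) -> None:
        # Ephemeral task_id -> run_id mapping that expires after `ttl` seconds.
        self.redis[task_id] = (run_id, time.monotonic() + ttl)

    def get_task(self, task_id: str) -> Optional[str]:
        entry = self.redis.get(task_id)
        if entry and time.monotonic() < entry[1]:
            return entry[0]
        return None  # missing or expired

    def complete_run(self, run_id: str, record: dict) -> None:
        # Completed runs are written once and kept permanently.
        self.postgres[run_id] = record

    def put_file(self, data: bytes) -> str:
        # Content addressing: identical bytes always map to the same key,
        # so duplicates are stored exactly once.
        h = hashlib.sha256(data).hexdigest()
        self.cache[h] = data
        return h
```

The key property is that each datum has exactly one authoritative home: a lookup that misses in Redis is not an error (the mapping simply expired), while a run record must always be recoverable from PostgreSQL.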
@@ -519,6 +519,22 @@ class L1CacheManager:
         return files
 
+    def list_by_type(self, node_type: str) -> List[str]:
+        """
+        List content hashes of all cached files of a specific type.
+
+        Args:
+            node_type: Type to filter by (e.g., "recipe", "upload", "effect")
+
+        Returns:
+            List of content hashes
+        """
+        hashes = []
+        for entry in self.cache.list_entries():
+            if entry.node_type == node_type and entry.content_hash:
+                hashes.append(entry.content_hash)
+        return hashes
+
     # ============ Activity Tracking ============
 
     def record_activity(self, dag: DAG, run_id: str = None) -> Activity:
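The filtering logic added in this hunk is easy to exercise in isolation. Below is a self-contained sketch: `CacheEntry` and `MiniCache` are stand-ins for the real cache entries and `L1CacheManager` (which are not shown in the diff), while the loop body mirrors the new `list_by_type` exactly, including the skip of entries with no content hash.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CacheEntry:
    # Stand-in for a real cache entry: only the two fields list_by_type reads.
    node_type: str
    content_hash: Optional[str]

class MiniCache:
    # Stand-in for the underlying cache with a list_entries() iterator.
    def __init__(self, entries: List[CacheEntry]) -> None:
        self._entries = entries

    def list_entries(self) -> List[CacheEntry]:
        return list(self._entries)

def list_by_type(cache: MiniCache, node_type: str) -> List[str]:
    # Same logic as the diff: keep hashes of entries whose node_type matches,
    # skipping entries that have no content hash yet.
    hashes = []
    for entry in cache.list_entries():
        if entry.node_type == node_type and entry.content_hash:
            hashes.append(entry.content_hash)
    return hashes

cache = MiniCache([
    CacheEntry("recipe", "abc123"),
    CacheEntry("upload", "def456"),
    CacheEntry("recipe", None),      # no hash yet: excluded
    CacheEntry("recipe", "ghi789"),
])
print(list_by_type(cache, "recipe"))  # ['abc123', 'ghi789']
```

Note the truthiness check on `content_hash` filters out both `None` and empty strings, so partially written entries never leak into the result.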