celery/celery_app.py
gilesb f7890dd1ad Add 3-phase execution with IPFS cache and hash-based task claiming
New files:
- claiming.py - Redis Lua scripts for atomic task claiming
- tasks/analyze.py - Analysis Celery task
- tasks/execute.py - Step execution with IPFS-backed cache
- tasks/orchestrate.py - Plan orchestration (run_plan, run_recipe)
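claiming.py itself isn't shown in this view; below is a minimal sketch of the hash-based claiming idea, with an in-memory dict standing in for Redis `SET NX` semantics (the function names and key scheme here are illustrative assumptions, not the module's actual API — the real version uses a Lua script so the check-and-set is atomic across workers):

```python
import hashlib
import json

# In-memory stand-in for Redis; the real implementation runs a Lua
# script so the check-and-set below is a single atomic operation.
_claims = {}

def task_hash(name, params):
    """Content hash identifying a unit of work (illustrative scheme)."""
    payload = json.dumps({'name': name, 'params': params}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def try_claim(name, params, worker_id):
    """Claim a task iff no other worker already holds it (SET NX semantics)."""
    key = f'claim:{task_hash(name, params)}'
    if key in _claims:           # another worker already claimed this hash
        return False
    _claims[key] = worker_id     # atomic in Redis via SETNX / Lua
    return True
```

Two workers racing on the same (name, params) pair compute the same hash, so only the first claim succeeds and the duplicate work is skipped.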

New API endpoints (/api/v2/):
- POST /api/v2/plan - Generate execution plan
- POST /api/v2/execute - Execute a plan
- POST /api/v2/run-recipe - Full 3-phase pipeline
- GET /api/v2/run/{run_id} - Get run status

Features:
- Hash-based task claiming prevents duplicate work
- Parallel execution within dependency levels
- IPFS-backed cache for durability
- Integration with artdag planning module
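The orchestration code isn't shown, but "parallel execution within dependency levels" is a standard pattern: group the DAG into levels of mutually independent steps, then dispatch each level concurrently (in Celery, typically one group per level). A hedged Kahn-style sketch, under the assumption that the plan exposes a prerequisites map:

```python
from collections import defaultdict

def dependency_levels(deps):
    """Group DAG nodes into levels; nodes within one level have no edges
    between them, so each level can run in parallel (e.g. as a Celery
    group), with levels executed in order.
    `deps` maps node -> set of prerequisite nodes (assumed shape)."""
    indegree = {n: len(ps) for n, ps in deps.items()}
    children = defaultdict(list)
    for node, prereqs in deps.items():
        for p in prereqs:
            children[p].append(node)
    levels = []
    ready = sorted(n for n, d in indegree.items() if d == 0)
    while ready:
        levels.append(ready)
        nxt = []
        for n in ready:
            for child in children[n]:
                indegree[child] -= 1
                if indegree[child] == 0:
                    nxt.append(child)
        ready = sorted(nxt)
    return levels
```

For a plan where `c` needs `a` and `b`, and `d` needs `c`, this yields the levels `[['a', 'b'], ['c'], ['d']]`: `a` and `b` run in parallel, then `c`, then `d`.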

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-10 11:44:00 +00:00


"""
Art DAG Celery Application
Distributed rendering for the Art DAG system.
Uses the foundational artdag language from GitHub.
"""
import os

from celery import Celery

REDIS_URL = os.environ.get('REDIS_URL', 'redis://localhost:6379/5')

app = Celery(
    'art_celery',
    broker=REDIS_URL,
    backend=REDIS_URL,
    include=['tasks', 'tasks.analyze', 'tasks.execute', 'tasks.orchestrate'],
)

app.conf.update(
    result_expires=3600,
    task_serializer='json',
    accept_content=['json', 'pickle'],  # pickle needed for internal Celery messages
    result_serializer='json',
    event_serializer='json',
    timezone='UTC',
    enable_utc=True,
    task_track_started=True,
    task_acks_late=True,
    worker_prefetch_multiplier=1,
)

if __name__ == '__main__':
    app.start()
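The IPFS-backed cache lives in tasks/execute.py, not in this file; the core idea is content addressing: a step's result is stored under the hash of its inputs, so re-executing an identical step becomes a cache hit. A minimal sketch under assumed semantics (a dict stands in for IPFS plus the hash-to-CID index, and the key scheme is hypothetical):

```python
import hashlib
import json

_store = {}  # stands in for IPFS plus a hash -> CID index in Redis

def cache_key(step_name, inputs):
    """Deterministic content hash of a step and its inputs (assumed scheme)."""
    payload = json.dumps({'step': step_name, 'inputs': inputs}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def run_step(step_name, inputs, fn):
    """Return (result, hit): the cached result if the same (step, inputs)
    pair ran before, otherwise compute with fn, store, and return it."""
    key = cache_key(step_name, inputs)
    if key in _store:
        return _store[key], True    # cache hit: skip recomputation
    result = fn(inputs)
    _store[key] = result            # real system: add bytes to IPFS, index the CID
    return result, False
```

Because IPFS is itself content-addressed and replicated, cached results survive worker restarts, which is the durability claim in the commit message.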