New files:
- claiming.py - Redis Lua scripts for atomic task claiming
- tasks/analyze.py - Analysis Celery task
- tasks/execute.py - Step execution with IPFS-backed cache
- tasks/orchestrate.py - Plan orchestration (run_plan, run_recipe)
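The atomic claiming in claiming.py could look roughly like the sketch below. This is an illustrative assumption, not the actual script: the key scheme (`claim:<hash>`), the TTL, and the `try_claim` helper are hypothetical, but the core idea — a Lua `SET ... NX EX` so only the first worker wins the claim — matches the "atomic task claiming" described above.

```python
# Hypothetical sketch of hash-based atomic claiming via a Redis Lua
# script. Key naming ("claim:<hash>") and the 300 s TTL are assumptions.

CLAIM_SCRIPT = """
-- KEYS[1] = claim key, ARGV[1] = worker id, ARGV[2] = TTL in seconds
-- SET NX succeeds only for the first claimant, so later workers skip the task.
if redis.call('SET', KEYS[1], ARGV[1], 'NX', 'EX', ARGV[2]) then
    return 1
end
return 0
"""

def try_claim(client, task_hash: str, worker_id: str, ttl: int = 300) -> bool:
    """Return True if this worker won the claim for task_hash.

    `client` is any redis-py-compatible connection (register_script
    is a real redis-py API; everything else here is illustrative).
    """
    script = client.register_script(CLAIM_SCRIPT)
    return script(keys=[f"claim:{task_hash}"], args=[worker_id, ttl]) == 1
```

Because the claim is keyed by the task's content hash, two workers racing on the same step resolve deterministically: one gets `1`, the other gets `0` and moves on.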
New API endpoints (/api/v2/):
- POST /api/v2/plan - Generate execution plan
- POST /api/v2/execute - Execute a plan
- POST /api/v2/run-recipe - Full 3-phase pipeline
- GET /api/v2/run/{run_id} - Get run status
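A minimal client sketch for the endpoints above, using only the standard library. The payload shapes and the `ArtDagClient` class are assumptions for illustration; only the four routes themselves come from the list above.

```python
# Hypothetical client for the /api/v2/ endpoints. Route paths are from
# the endpoint list; payload field names and this class are assumptions.
import json
from urllib import request as urlrequest

class ArtDagClient:
    def __init__(self, base_url: str, opener=None):
        self.base_url = base_url.rstrip("/")
        # opener is injectable for testing; defaults to urllib's urlopen.
        self._opener = opener or urlrequest.urlopen

    def _call(self, method: str, path: str, body=None):
        data = json.dumps(body).encode() if body is not None else None
        req = urlrequest.Request(
            self.base_url + path, data=data, method=method,
            headers={"Content-Type": "application/json"},
        )
        with self._opener(req) as resp:
            return json.load(resp)

    def plan(self, recipe):
        return self._call("POST", "/api/v2/plan", recipe)

    def execute(self, plan):
        return self._call("POST", "/api/v2/execute", plan)

    def run_recipe(self, recipe):
        return self._call("POST", "/api/v2/run-recipe", recipe)

    def run_status(self, run_id: str):
        return self._call("GET", f"/api/v2/run/{run_id}")
```

A typical flow would be `run_recipe(...)` to kick off the full 3-phase pipeline, then polling `run_status(run_id)` until the run completes.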
Features:
- Hash-based task claiming prevents duplicate work
- Parallel execution within dependency levels
- IPFS-backed cache for durability
- Integration with artdag planning module
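"Parallel execution within dependency levels" can be sketched with the standard library's `graphlib`: group the plan's steps into levels where every step depends only on earlier levels, then dispatch each level concurrently (e.g. as a Celery `group`). The plan format here (step id mapped to its dependency ids) is an assumption, not the actual orchestrate.py schema.

```python
# Sketch: split a dependency graph into levels of independently runnable
# steps. The dict-of-dependencies plan format is an illustrative assumption.
from graphlib import TopologicalSorter

def dependency_levels(deps: dict[str, list[str]]) -> list[list[str]]:
    """Group steps into levels; each level depends only on earlier levels."""
    ts = TopologicalSorter(deps)
    ts.prepare()  # raises CycleError if the plan has a dependency cycle
    levels = []
    while ts.is_active():
        ready = sorted(ts.get_ready())  # every step whose deps are done
        levels.append(ready)
        ts.done(*ready)
    return levels

# dependency_levels({"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]})
# → [["a"], ["b", "c"], ["d"]]  ("b" and "c" can run in parallel)
```

The orchestrator would then run `execute_step` for all steps in a level concurrently and only advance to the next level once the whole level has finished.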
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
# art-celery/tasks - Celery tasks for 3-phase execution
#
# Tasks for the Art DAG distributed execution system:
# 1. analyze_input - Extract features from input media
# 2. execute_step - Execute a single step from the plan
# 3. run_plan - Orchestrate execution of a full plan

from .analyze import analyze_input, analyze_inputs
from .execute import execute_step
from .orchestrate import run_plan, run_recipe

__all__ = [
    "analyze_input",
    "analyze_inputs",
    "execute_step",
    "run_plan",
    "run_recipe",
]