Add dag_elements to the get_run endpoint render call to match
what the detail.html template expects.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Pass recipe_name through create_run to display friendly names
- Update templates to show name instead of hash
- Fall back to truncated hash if no name available
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add format detection that correctly handles ; comments
- Import artdag.sexp parser/compiler with YAML fallback
- Add execute_step_sexp and run_plan_sexp Celery tasks
- Update recipe upload to handle both S-expr and YAML formats
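A minimal sketch of the detection, assuming a helper named detect_recipe_format (the real name and heuristics may differ): S-expression comments start with `;`, so they must be skipped before inspecting the first significant character.

```python
def detect_recipe_format(text: str) -> str:
    """Guess whether a recipe is S-expression or YAML.

    Lines beginning with ';' are S-expression comments, so skip
    them (and blank lines) before looking at the first real char.
    """
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith(";"):
            continue  # blank line or S-expression comment
        return "sexp" if stripped.startswith("(") else "yaml"
    return "yaml"  # empty input: fall back to the YAML parser
```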
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Extract username from actor_id format (@user@server)
- Set total_steps and executed from recipe nodes
- Use recipe name for display instead of hash
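The username extraction can be sketched like this (the helper name is hypothetical):

```python
def username_from_actor_id(actor_id: str) -> str:
    """Extract 'user' from an actor id shaped like '@user@server'.

    Falls back to the raw value when it doesn't match that shape.
    """
    if actor_id.startswith("@") and actor_id.count("@") >= 2:
        return actor_id[1:].split("@", 1)[0]
    return actor_id
```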
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Check multiple locations for nodes (nodes, dag.nodes, pipeline, steps)
and compute step_count for display in recipe list.
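One way to probe the candidate locations in order (find_nodes and step_count are illustrative names, not necessarily the ones in the codebase):

```python
def find_nodes(recipe: dict):
    """Return the first node collection found among the known layouts."""
    for path in ("nodes", ("dag", "nodes"), "pipeline", "steps"):
        value = recipe
        for key in (path if isinstance(path, tuple) else (path,)):
            value = value.get(key) if isinstance(value, dict) else None
            if value is None:
                break
        if value:
            return value
    return []

def step_count(recipe: dict) -> int:
    """Number of steps to show in the recipe list."""
    return len(find_nodes(recipe))
```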
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Build artifacts list from run output_hash and detect media type
for display in the artifacts tab.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
API clients like Python requests send Accept: */*, which didn't
match wants_json(). Switch to checking wants_html() instead so
API clients get JSON by default.
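A sketch of the idea: wants_html() only returns true on an explicit HTML preference, so */* falls through to JSON (the real helper likely also considers q-values and other parts of the Accept header):

```python
def wants_html(accept_header: str) -> bool:
    """True only when the client explicitly asks for HTML.

    'Accept: */*' (the default for tools like python-requests)
    is NOT treated as an HTML request, so API clients fall
    through to the JSON response path.
    """
    return "text/html" in accept_header
```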
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
When a recipe run completes, save the output to the user's media
with description and source tracking.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Render HTML template for run detail (not just JSON)
- Get recipe name from pending_runs instead of hardcoding "dag"
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
When a DAG task completes, look up actor_id from pending_runs
(where it was saved when the run started) and include it in
run_cache. Also clean up pending_runs entry after completion.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Inputs stored in old Redis format are JSON strings - this helper
ensures they're always returned as lists regardless of source.
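A sketch of such a helper (ensure_inputs_list is a hypothetical name):

```python
import json

def ensure_inputs_list(raw):
    """Normalize an 'inputs' value to a list.

    Old Redis entries stored inputs as a JSON-encoded string;
    newer entries store a real list. Accept both, plus None.
    """
    if raw is None:
        return []
    if isinstance(raw, str):
        try:
            parsed = json.loads(raw)
        except json.JSONDecodeError:
            return [raw]  # plain string, e.g. a bare content hash
        return parsed if isinstance(parsed, list) else [parsed]
    return list(raw)
```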
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
JSONB columns may return strings in some cases - explicitly parse
the inputs field to ensure it's always a list.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Store pending runs in PostgreSQL for durability across restarts
- Add recovery method for orphaned runs
- Increase Celery result_expires to 7 days
- Add task_reject_on_worker_lost for automatic re-queuing
- Add logging to recipe list to debug filter issues
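The Celery settings above might look like this (the broker/backend URLs are placeholders, and pairing task_reject_on_worker_lost with task_acks_late is an assumption about how the app is configured):

```python
from datetime import timedelta
from celery import Celery

app = Celery(
    "artdag",
    broker="redis://localhost:6379/0",   # placeholder URL
    backend="redis://localhost:6379/1",  # placeholder URL
)

app.conf.update(
    result_expires=timedelta(days=7),   # keep results a week, not the 1-day default
    task_reject_on_worker_lost=True,    # re-queue tasks when a worker dies
    task_acks_late=True,                # ack after completion, so lost tasks re-run
)
```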
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Check multiple locations for nodes: dag.nodes, recipe.nodes, pipeline, steps
- Add dagre layout libraries for cytoscape DAG visualization
- Fix inputs parsing when stored as JSON string in Redis
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Normalize Celery status names (started -> running)
- Store full run metadata in Redis for pending runs (recipe, inputs, actor_id)
- Filter pending runs by actor_id so users only see their own
- Parse both old and new Redis task data formats for compatibility
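The status normalization can be a small lookup table; everything beyond started -> running is an assumed mapping:

```python
# Map Celery's state names onto the UI's vocabulary; unknown
# states are lowercased and passed through unchanged.
_STATUS_MAP = {
    "PENDING": "pending",
    "STARTED": "running",
    "RETRY": "running",
    "SUCCESS": "completed",
    "FAILURE": "failed",
}

def normalize_status(celery_state: str) -> str:
    return _STATUS_MAP.get(celery_state.upper(), celery_state.lower())
```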
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add /download/client endpoint to serve the CLI client tarball
- Add "Client" link to navigation in base template
- Create build-client.sh script to clone and package the client
- Update Dockerfile to run build-client.sh during container build
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
SOURCE nodes with config.asset now get content_hash from registry.
EFFECT nodes with config.effect now get effect_hash from registry.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Update cache detail button from "Publish to IPFS" to "Share to L2"
- Add Share to L2 button on recipe detail page
- Add Share to L2 button on run detail page
- Create /recipes/{id}/publish endpoint
- Create /runs/{id}/publish endpoint (publishes run output)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The template expects recipe.steps and recipe.yaml but the recipe
data has dag.nodes. Convert nodes (list or dict) to steps format
and add YAML dump for source display.
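The conversion might look like this (nodes_to_steps is a hypothetical name; PyYAML is used for the source dump):

```python
import yaml

def nodes_to_steps(recipe: dict) -> dict:
    """Adapt recipe data to what the detail template expects.

    dag.nodes may be a dict keyed by node id or a plain list;
    the template wants recipe.steps (a list) and recipe.yaml
    (source text for display).
    """
    nodes = recipe.get("dag", {}).get("nodes", [])
    if isinstance(nodes, dict):
        steps = [{"name": name, **node} for name, node in nodes.items()]
    else:
        steps = list(nodes)
    return {
        **recipe,
        "steps": steps,
        "yaml": yaml.safe_dump(recipe, sort_keys=False),
    }
```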
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Remove /run/{id} and /recipe/{id} redirect routes
- Update templates to use /runs/ and /recipes/ paths
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The /run/{id} and /recipe/{id} redirects were calling route handlers
directly without passing the required service dependencies.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Make recipe and inputs optional in the RunStatus model
- Convert DAG nodes from list format to dict format when running recipes
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Update requirements.txt to use art-common@11aa056 with the l2_server field
- Import UserContext from artdag_common in all routers
- Remove the duplicate UserContext from auth_service.py
- Set l2_server from settings on the user context in dependencies.py
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Fixes AttributeError when running recipes - the UserContext was
missing the l2_server field that run_service expects.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Fix template to use recipe.recipe_id (the service returns recipe_id, not id)
- Add recipe count to home page stats
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The CLI client sends multipart file uploads but the server expected JSON.
Change the /recipes/upload endpoint to accept UploadFile and return
the additional fields (name, version, variable_inputs, fixed_inputs)
that the client expects.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Detect actual MIME type from file content and store it instead of
generic "media" type. This enables proper media categorization
and filtering in the UI.
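Content-based detection can be as simple as checking leading magic bytes; a real implementation might use a library like python-magic, and the signature table here is illustrative only:

```python
# A few common magic-byte signatures (illustrative, not exhaustive).
_SIGNATURES = [
    (b"\x89PNG\r\n\x1a\n", "image/png"),
    (b"\xff\xd8\xff", "image/jpeg"),
    (b"GIF87a", "image/gif"),
    (b"GIF89a", "image/gif"),
    (b"%PDF-", "application/pdf"),
]

def sniff_media_type(data: bytes,
                     default: str = "application/octet-stream") -> str:
    """Guess a MIME type from leading bytes instead of storing
    the generic 'media' type."""
    for magic, mime in _SIGNATURES:
        if data.startswith(magic):
            return mime
    return default
```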
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Use content_hash instead of hash
- Use type instead of media_type
- Show filename instead of size_bytes
- Detect media type from both type field and filename extension
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The new version defaults actor_id to @username when it is not in
the token, and supports both artdag_session and auth_token cookies.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
list_cache_items doesn't accept an actor_id parameter.
Use get_user_items, which properly filters by actor_id and item_type.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Process all node_results after DAG execution
- Store each intermediate/effect output in cache_manager
- Upload all node outputs to IPFS (not just final output)
- Track node_hashes and node_ipfs_cids mappings
- Save run result to database with run_id
- Include nodes with content_hash + ipfs_cid in provenance
- Return node_hashes and node_ipfs_cids in task result
All DAG nodes are now content-addressable via /cache/{content_hash}
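The content-addressing step can be sketched as plain SHA-256 hashing; the exact hash scheme used by cache_manager is an assumption, as are the helper names:

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Content address for a node output: hex SHA-256 of the bytes.

    Every node output gets a stable address, servable at
    /cache/{content_hash}.
    """
    return hashlib.sha256(data).hexdigest()

def index_node_outputs(node_results: dict) -> dict:
    """Map node name -> content hash for all intermediate/effect outputs."""
    return {name: content_hash(blob) for name, blob in node_results.items()}
```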
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Recipes: Now content-addressed only (cache + IPFS), removed Redis storage
- Runs: Completed runs stored in PostgreSQL, Redis only for task_id mapping
- Add list_runs_by_actor() to database.py for paginated run queries
- Add list_by_type() to cache_manager for filtering by node_type
- Fix upload endpoint to return size and filename fields
- Fix recipe run endpoint with proper DAG input binding
- Fix get_run_service() dependency to pass database module
Storage architecture:
- Redis: Ephemeral only (sessions, task mappings with TTL)
- PostgreSQL: Permanent records (completed runs, metadata)
- Cache: Content-addressed files (recipes, media, outputs)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>