Add documentation routes and update README
- Update README with comprehensive documentation covering IPFS-primary mode, 3-phase execution, storage providers, and all API endpoints
- Add `/docs` routes to serve markdown documentation as styled HTML
- Include common library documentation in the web interface

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
# Art DAG L1 Server

L1 rendering server for the Art DAG system. Manages distributed rendering jobs via Celery workers with content-addressable caching and optional IPFS integration.

## Features

- **3-Phase Execution**: Analyze → Plan → Execute pipeline for recipe-based rendering
- **Content-Addressable Caching**: SHA3-256 hashed content with deduplication
- **IPFS Integration**: Optional IPFS-primary mode for distributed storage
- **Storage Providers**: S3, IPFS, and local storage backends
- **DAG Visualization**: Interactive graph visualization of execution plans
- **SPA-Style Navigation**: Smooth URL-based navigation without full page reloads
- **L2 Federation**: Publish outputs to ActivityPub registry

## Dependencies

- **artdag** (GitHub): Core DAG execution engine
- **artdag-effects** (rose-ash): Effect implementations
- **artdag-common**: Shared templates and middleware
- **Redis**: Message broker, result backend, and run persistence
- **PostgreSQL**: Metadata storage
- **IPFS** (optional): Distributed content storage

## Quick Start

```bash
# Install Redis
sudo apt install redis-server

# Install dependencies
pip install -r requirements.txt

# Start Redis
redis-server

# Start a worker (-E enables task events for monitoring)
celery -A celery_app worker --loglevel=info -E

# Start the L1 server
python server.py
```

## Docker Swarm Deployment

```bash
docker stack deploy -c docker-compose.yml artdag
```

The stack includes:

- **redis**: Message broker (Redis 7)
- **postgres**: Metadata database (PostgreSQL 16)
- **ipfs**: IPFS node (Kubo)
- **l1-server**: FastAPI web server
- **l1-worker**: Celery workers (2 replicas)
- **flower**: Celery task monitoring

## Configuration

### Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `HOST` | `0.0.0.0` | Server bind address |
| `PORT` | `8000` | Server port |
| `REDIS_URL` | `redis://localhost:6379/5` | Redis connection URL |
| `DATABASE_URL` | `postgresql://artdag:artdag@localhost:5432/artdag` | PostgreSQL connection URL |
| `CACHE_DIR` | `~/.artdag/cache` | Local cache directory |
| `IPFS_API` | `/dns/localhost/tcp/5001` | IPFS API multiaddr |
| `IPFS_GATEWAY_URL` | `https://ipfs.io/ipfs` | Public IPFS gateway |
| `IPFS_PRIMARY` | `false` | Enable IPFS-primary mode |
| `L1_PUBLIC_URL` | `http://localhost:8100` | Public URL for redirects |
| `L2_SERVER` | - | L2 ActivityPub server URL |
| `L2_DOMAIN` | - | L2 domain for federation |
| `ARTDAG_CLUSTER_KEY` | - | Cluster key for trust domains |

### IPFS-Primary Mode

When `IPFS_PRIMARY=true`, all content is stored on IPFS:

- Input files are added to IPFS on upload
- Analysis results are stored as JSON on IPFS
- Execution plans are stored on IPFS
- Step outputs are pinned to IPFS
- The local cache becomes a read-through cache

This enables distributed execution across multiple L1 nodes sharing the same IPFS network.
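
In read-through mode, content is served from the local cache when present and fetched from the IPFS node only on a miss. A minimal sketch, assuming a flat CID-keyed cache directory (the real `cache_manager.py` layout may differ); `/api/v0/cat` is Kubo's standard RPC endpoint:

```python
import urllib.request
from pathlib import Path

IPFS_API = "http://localhost:5001"  # local Kubo RPC endpoint

def get_content(cid: str, cache_dir: Path) -> bytes:
    """Return the bytes for a CID, consulting the local cache first."""
    cached = cache_dir / cid
    if cached.exists():              # cache hit: no network round-trip
        return cached.read_bytes()
    # Cache miss: fetch from the IPFS node (Kubo RPC calls are POST-only).
    req = urllib.request.Request(f"{IPFS_API}/api/v0/cat?arg={cid}", method="POST")
    with urllib.request.urlopen(req, timeout=30) as resp:
        data = resp.read()
    cache_dir.mkdir(parents=True, exist_ok=True)
    cached.write_bytes(data)         # populate the cache for next time
    return data
```

Because every node keyed by the same CID returns identical bytes, any number of L1 nodes can safely share this pattern without coordination.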

## Web UI

| Path | Description |
|------|-------------|
| `/` | Home page with server info |
| `/runs` | View and manage rendering runs |
| `/run/{id}` | Run detail with tabs: Plan, Analysis, Artifacts |
| `/run/{id}/plan` | Interactive DAG visualization |
| `/run/{id}/analysis` | Audio/video analysis data |
| `/run/{id}/artifacts` | Cached step outputs |
| `/recipes` | Browse and run available recipes |
| `/recipe/{id}` | Recipe detail page |
| `/recipe/{id}/dag` | Recipe DAG visualization |
| `/media` | Browse cached media files |
| `/storage` | Manage storage providers |
| `/auth` | Receive auth token from L2 |
| `/auth/revoke` | Revoke a specific token |
| `/auth/revoke-user` | Revoke all tokens for a user (called by L2 on logout) |
| `/logout` | Log out |
| `/download/client` | Download the CLI client |

## Authentication

L1 servers authenticate users via L2 (the ActivityPub registry). No shared secrets are required.

### Configuration

```bash
export L1_PUBLIC_URL=https://celery-artdag.rose-ash.com
```

### How it works

1. The user clicks "Attach" on L2's Renderers page
2. L2 creates a **scoped token** bound to this specific L1
3. The user is redirected to L1's `/auth?auth_token=...`
4. L1 calls L2's `/auth/verify` to validate the token
5. L2 checks that the token is valid, not revoked, and scoped to this L1
6. L1 sets a local cookie and records the token

### Token revocation

When a user logs out of L2, L2 calls `/auth/revoke-user` on every attached L1. L1 maintains a Redis-based token tracking and revocation system:

- Tokens are registered per user at authentication time (`artdag:user_tokens:{username}`)
- `/auth/revoke-user` revokes all tokens for a username
- Revoked token hashes are stored in Redis with a 30-day expiry
- Every authenticated request checks the revocation list
- Revoked tokens are immediately rejected
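
The flow above can be sketched with the redis-py client interface (`r` stands in for a `redis.Redis` instance). The key names come from this README, but the function names and hashing scheme are illustrative assumptions, not the actual server code:

```python
import hashlib

REVOKE_TTL = 30 * 24 * 3600  # 30-day expiry, per the README

def token_hash(token: str) -> str:
    # Store a hash, not the raw token (assumed scheme).
    return hashlib.sha256(token.encode()).hexdigest()

def register_token(r, username: str, token: str) -> None:
    # Track the token so /auth/revoke-user can find it later.
    r.sadd(f"artdag:user_tokens:{username}", token_hash(token))

def revoke_user(r, username: str) -> None:
    key = f"artdag:user_tokens:{username}"
    for h in r.smembers(key):
        # Each revoked hash expires on its own after 30 days.
        r.setex(f"artdag:revoked:{h}", REVOKE_TTL, "1")
    r.delete(key)

def is_revoked(r, token: str) -> bool:
    # Checked on every authenticated request.
    return r.exists(f"artdag:revoked:{token_hash(token)}") > 0
```

Using `SETEX` per revoked hash means the revocation list garbage-collects itself once tokens would have expired anyway.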

### Security

- **Scoped tokens**: Tokens are bound to a specific L1; a stolen token cannot be used on other L1 servers.
- **L2 verification**: L1 verifies every token with L2, which checks its revocation table.
- **No shared secrets**: L1 does not need L2's JWT secret.

## API Reference

Interactive docs: http://localhost:8100/docs

### Runs

| Method | Path | Description |
|--------|------|-------------|
| GET | `/` | Server info |
| POST | `/runs` | Start a rendering run |
| GET | `/runs` | List all runs (paginated) |
| GET | `/runs/{run_id}` | Get run status |
| DELETE | `/runs/{run_id}` | Delete a run |
| GET | `/assets` | List known assets |
| GET | `/api/run/{run_id}` | Get run as JSON |
| GET | `/api/run/{run_id}/plan` | Get execution plan JSON |
| GET | `/api/run/{run_id}/analysis` | Get analysis data JSON |

### Recipes

Recipes are YAML files that define reusable DAG pipelines. They can have:

- **Fixed inputs**: Assets with pre-defined content hashes
- **Variable inputs**: Placeholders filled at run time

| Method | Path | Description |
|--------|------|-------------|
| POST | `/recipes/upload` | Upload recipe YAML |
| GET | `/recipes` | List recipes (paginated) |
| GET | `/recipes/{recipe_id}` | Get recipe details |
| DELETE | `/recipes/{recipe_id}` | Delete recipe |
| POST | `/recipes/{recipe_id}/run` | Execute recipe |

### Cache

| Method | Path | Description |
|--------|------|-------------|
| GET | `/cache` | List cached content hashes |
| GET | `/cache/{hash}` | Get cached content (with preview) |
| GET | `/cache/{hash}/raw` | Download raw content |
| GET | `/cache/{hash}/mp4` | Get MP4 video |
| GET | `/cache/{hash}/meta` | Get content metadata |
| PATCH | `/cache/{hash}/meta` | Update metadata |
| POST | `/cache/{hash}/publish` | Publish to L2 |
| DELETE | `/cache/{hash}` | Delete from cache |
| POST | `/cache/import?path=` | Import a local file |
| POST | `/cache/upload` | Upload a file |
| GET | `/media` | Browse the media gallery |

### IPFS

| Method | Path | Description |
|--------|------|-------------|
| GET | `/ipfs/{cid}` | Redirect to the IPFS gateway |
| GET | `/ipfs/{cid}/raw` | Fetch raw content from IPFS |

### Storage Providers

| Method | Path | Description |
|--------|------|-------------|
| GET | `/storage` | List storage providers |
| POST | `/storage` | Add provider (form) |
| POST | `/storage/add` | Add provider (JSON) |
| GET | `/storage/{id}` | Get provider details |
| PATCH | `/storage/{id}` | Update provider |
| DELETE | `/storage/{id}` | Delete provider |
| POST | `/storage/{id}/test` | Test connection |
| GET | `/storage/type/{type}` | Get form for a provider type |

### 3-Phase API

| Method | Path | Description |
|--------|------|-------------|
| POST | `/api/plan` | Generate execution plan |
| POST | `/api/execute` | Execute a plan |
| POST | `/api/run-recipe` | Full pipeline (analyze + plan + execute) |

### Authentication

| Method | Path | Description |
|--------|------|-------------|
| GET | `/auth` | Receive auth token from L2 |
| GET | `/logout` | Log out |
| POST | `/auth/revoke` | Revoke a specific token |
| POST | `/auth/revoke-user` | Revoke all tokens for a user |

## 3-Phase Execution

Recipes are executed in three phases:

### Phase 1: Analyze

Extract features from the input files:

- **Audio/Video**: Tempo, beat times, energy levels
- Results are cached by content hash

### Phase 2: Plan

Generate an execution plan:

- Parse the recipe YAML
- Resolve dependencies between steps
- Compute cache IDs for each step
- Skip already-cached steps

### Phase 3: Execute

Run the plan level by level:

- Steps at each level run in parallel
- Results are cached with content-addressable hashes
- Progress tracked in Redis
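
The execute phase can be sketched as below, assuming a plan already grouped into levels of step dicts with precomputed cache ids. This is an illustrative sketch: the real server distributes steps as Celery tasks across workers, not threads in one process:

```python
from concurrent.futures import ThreadPoolExecutor

def execute_plan(levels, run_step, cache):
    """levels: list of lists of step dicts, each with 'id' and a precomputed 'cache_id'."""
    results = {}
    with ThreadPoolExecutor() as pool:
        for level in levels:                          # levels run sequentially
            for step in level:
                if step["cache_id"] in cache:         # skip already-cached steps
                    results[step["id"]] = cache[step["cache_id"]]
            pending = [s for s in level if s["id"] not in results]
            # Steps within a level have no mutual dependencies, so run in parallel.
            futures = [(s, pool.submit(run_step, s, results)) for s in pending]
            for step, fut in futures:
                out = fut.result()
                cache[step["cache_id"]] = out         # content-addressable cache write
                results[step["id"]] = out
    return results
```

Because cache ids are derived from content, a re-run of the same recipe with the same inputs finds every step already cached and does no work.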

## Recipe Format

Recipes define reusable DAG pipelines. Example recipe:

```yaml
name: beat-sync
version: "1.0"
description: "Synchronize video to audio beats"

inputs:
  video:
    type: video
    description: "Source video"
  audio:
    type: audio
    description: "Audio track"

steps:
  - id: analyze_audio
    type: ANALYZE
    inputs: [audio]
    config:
      features: [beats, energy]

  - id: sync_video
    type: BEAT_SYNC
    inputs: [video, analyze_audio]
    config:
      mode: stretch

output: sync_video
```
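
The plan phase's dependency resolution over such a recipe can be sketched as follows. The recipe dict mirrors the YAML shape used here (e.g. from `yaml.safe_load`), but `resolve_levels` is an illustrative helper, not the server's API:

```python
def resolve_levels(recipe: dict) -> list:
    """Group step ids into levels; each level depends only on earlier levels."""
    steps = {s["id"]: s for s in recipe.get("steps", [])}
    inputs = set(recipe.get("inputs", {}))   # recipe inputs are available up front
    placed, levels = set(inputs), []
    while len(placed) - len(inputs) < len(steps):
        # A step is ready once all of its inputs have been placed.
        level = [sid for sid, s in steps.items()
                 if sid not in placed
                 and all(dep in placed for dep in s.get("inputs", []))]
        if not level:
            raise ValueError("cycle or unknown input in recipe DAG")
        levels.append(level)
        placed.update(level)
    return levels
```

For the `beat-sync` recipe this yields `analyze_audio` in the first level and `sync_video` in the second, since `sync_video` consumes `analyze_audio`'s output.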

### Start a run

```bash
curl -X POST http://localhost:8100/runs \
  -H "Content-Type: application/json" \
  -d '{"recipe": "dog", "inputs": ["33268b6e..."], "output_name": "my-output"}'
```

### Check run status

```bash
curl http://localhost:8100/runs/{run_id}
```

### Delete a run

```bash
curl -X DELETE http://localhost:8100/runs/{run_id} \
  -H "Authorization: Bearer <token>"
```

Note: Failed runs can always be deleted. Completed runs can only be deleted if their outputs haven't been published to L2.

### Delete cached content

```bash
curl -X DELETE http://localhost:8100/cache/{hash} \
  -H "Authorization: Bearer <token>"
```

Note: Items that are inputs/outputs of runs, or published to L2, cannot be deleted.

## Storage

### Local Cache

- Location: `~/.artdag/cache/` (or `CACHE_DIR`)
- Content-addressed by SHA3-256 hash
- Subdirectories: `plans/`, `analysis/`
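
A content-addressed write can be sketched as below. The SHA3-256 hashing matches this README; the flat file layout is an assumption (the real cache also maintains `plans/` and `analysis/` subdirectories):

```python
import hashlib
from pathlib import Path

def cache_put(cache_dir: Path, data: bytes) -> str:
    """Store bytes under their SHA3-256 hash; return the hash (the cache id)."""
    digest = hashlib.sha3_256(data).hexdigest()
    path = cache_dir / digest
    if not path.exists():                 # identical content is deduplicated
        cache_dir.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)
    return digest
```

Writing the same bytes twice is a no-op, which is what makes cross-run deduplication free.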

### Redis

- Database 5 (configurable via `REDIS_URL`)
- Keys:
  - `artdag:run:*` - Run state
  - `artdag:recipe:*` - Recipe definitions
  - `artdag:revoked:*` - Token revocation
  - `artdag:user_tokens:*` - User token tracking

### PostgreSQL

- Content metadata
- Storage provider configurations
- Provenance records

## CLI Usage

```bash
# Quick render (effect mode)
python render.py dog cat --sync

# Submit async (don't wait)
python render.py dog cat

# Run a recipe
curl -X POST http://localhost:8100/recipes/beat-sync/run \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <token>" \
  -d '{"inputs": {"video": "abc123...", "audio": "def456..."}}'
```

## Architecture

```
L1 Server (FastAPI)
│
├── Web UI (Jinja2 + HTMX + Tailwind)
│
├── POST /runs → Celery tasks
│   │
│   └── celery_app.py
│       ├── tasks/analyze.py (Phase 1)
│       ├── tasks/execute.py (Phase 3 steps)
│       └── tasks/orchestrate.py (Full pipeline)
│
├── cache_manager.py
│   │
│   ├── Local filesystem (CACHE_DIR)
│   ├── IPFS (ipfs_client.py)
│   └── S3/Storage providers
│
└── database.py (PostgreSQL metadata)
```

## Provenance