
SX Isomorphic Architecture Roadmap

Context

SX has a working server-client pipeline: server evaluates pages with IO (DB, fragments), serializes as SX wire format, client parses and renders to DOM. The language and primitives are already isomorphic — same spec, same semantics, both sides. What's missing is the plumbing that makes the boundary between server and client a sliding window rather than a fixed wall.

The key insight: s-expressions can partially unfold on the server after IO, then finish unfolding on the client. The system should be clever enough to know which downstream components have data fetches, resolve those server-side, and send the rest as pure SX for client rendering. Eventually, the client can also do IO (mapping server DB queries to REST calls), handle routing (SPA), and even work offline with cached data.

Current State (what's solid)

  • Primitive parity: 100%. ~80 pure primitives, same names/semantics, JS and Python.
  • eval/parse/render: Complete both sides. sx-ref.js has eval, parse, render-to-html, render-to-dom, aser.
  • Engine: engine.sx (morph, swaps, triggers, history), orchestration.sx (fetch, events), boot.sx (hydration) — all transpiled.
  • Wire format: Server _aser → SX source → client parses → renders to DOM. Boundary is clean.
  • Component caching: Hash-based localStorage for component definitions and style dictionaries.
  • CSS on-demand: CSSX resolves keywords to CSS rules, injects only used rules.
  • Boundary enforcement: boundary.sx + SX_BOUNDARY_STRICT=1 validates all primitives/IO/helpers at registration.

Architecture Phases


Phase 1: Component Distribution & Dependency Analysis

What it enables: Per-page component bundles instead of sending every definition to every page. Smaller payloads, faster boot, better cache hit rates.

The problem: client_components_tag() in shared/sx/jinja_bridge.py serializes ALL entries in _COMPONENT_ENV. The sx_page() template sends everything or nothing based on a single global hash. No mechanism determines which components a page actually needs.

Approach:

  1. Transitive closure analyzer — new module shared/sx/deps.py

    • Walk Component.body AST, collect all Symbol refs starting with ~
    • Recursively follow into their bodies
    • Handle control forms (if/when/cond/case) — include ALL branches
    • Handle macros — expand during walk using limited eval
    • Function: transitive_deps(name: str, env: dict) -> set[str]
    • Cache result on Component object (invalidate on hot-reload)
  2. Runtime component scanning — after _aser serializes page content, scan the SX string for (~name patterns (parallel to existing scan_classes_from_sx for CSS). Then compute transitive closure to get sub-components.

  3. Per-page component block in sx_page() — replace all-or-nothing with page-specific bundle. Hash changes per page, localStorage cache keyed by route pattern.

  4. SX partial responses — components_for_request() already diffs against SX-Components header. Enhance with transitive closure so only truly needed missing components are sent.

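A minimal sketch of the two analysis pieces above (the transitive walker and the runtime scan). It assumes component bodies parse to nested Python lists with component calls represented as `~`-prefixed symbol strings; all names here are illustrative, not the real `shared/sx/deps.py` API:

```python
import re

def direct_refs(body):
    """Collect ~name symbols anywhere in a nested SX body. All branches
    of if/when/cond/case are included because the walk is unconditional."""
    refs, stack = set(), [body]
    while stack:
        node = stack.pop()
        if isinstance(node, (list, tuple)):
            stack.extend(node)
        elif isinstance(node, str) and node.startswith("~"):
            refs.add(node[1:])
    return refs

def transitive_deps(name, env):
    """All components reachable from `name`, excluding `name` itself."""
    seen, todo = set(), {name}
    while todo:
        for dep in direct_refs(env[todo.pop()]):
            if dep not in seen and dep in env:
                seen.add(dep)
                todo.add(dep)
    return seen

# Runtime scan over serialized SX (parallel to scan_classes_from_sx):
COMPONENT_CALL = re.compile(r"\(~([A-Za-z][\w-]*)")

def scan_components(sx_source):
    """Component names invoked anywhere in a serialized SX string."""
    return set(COMPONENT_CALL.findall(sx_source))
```

The macro-expansion step is omitted here; the real walker would expand macros with a limited eval before collecting refs.
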
Files:

  • New: shared/sx/deps.py — dependency analysis
  • shared/sx/jinja_bridge.py — per-page bundle generation, cache deps on Component
  • shared/sx/helpers.py — modify sx_page() and sx_response() for page-specific bundles
  • shared/sx/types.py — add deps: set[str] to Component
  • shared/sx/ref/boot.sx — per-page component caching alongside global cache

Verification:

  • Page using 5/50 components → data-components block contains only those 5 + transitive deps
  • No "Unknown component" errors after bundle reduction
  • Payload size reduction measurable

Phase 2: Smart Server/Client Boundary

What it enables: Formalized partial evaluation model. Server evaluates IO, serializes pure subtrees. The system automatically knows "this component needs server data" vs "this component is pure and can render anywhere."

Current mechanism: _aser in async_eval.py already does partial evaluation — IO primitives are awaited and substituted, HTML tags and component calls serialize as SX. The _expand_components context var controls expansion. But this is a global toggle, not per-component.

Approach:

  1. Automatic IO detection — extend Phase 1 AST walker to check for references to IO_PRIMITIVES names (frag, query, service, current-user, etc.)

    • has_io_deps(name: str, env: dict) -> bool
    • Computed at registration time, cached on Component
  2. Component metadata — enrich Component with analysis results:

    ComponentMeta:
      deps: set[str]        # transitive component deps (Phase 1)
      io_refs: set[str]     # IO primitive names referenced
      is_pure: bool         # True if io_refs empty (transitively)
    
  3. Selective expansion — refine _aser (line ~1335): instead of checking a global _expand_components flag, check the component's is_pure metadata:

    • IO-dependent → expand server-side (IO must resolve)
    • Pure → serialize for client (let client render)
    • Explicit override: :server true on defcomp forces server expansion
  4. Data manifest for pages — PageDef produces a declaration of what IO the page needs, enabling Phase 3 (client can prefetch data) and Phase 5 (streaming).

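The metadata and the selective-expansion decision above can be sketched as follows. The `IO_PRIMITIVES` subset, the env layout, and the helper names are assumptions for illustration, not the real `async_eval.py` code:

```python
from dataclasses import dataclass, field

# Illustrative subset; the real IO primitive list lives server-side.
IO_PRIMITIVES = {"frag", "query", "service", "current-user"}

def all_symbols(body):
    """Every string atom in a nested SX body."""
    out, stack = set(), [body]
    while stack:
        node = stack.pop()
        if isinstance(node, (list, tuple)):
            stack.extend(node)
        elif isinstance(node, str):
            out.add(node)
    return out

@dataclass
class ComponentMeta:
    deps: set = field(default_factory=set)     # transitive deps (Phase 1)
    io_refs: set = field(default_factory=set)  # IO primitives referenced
    is_pure: bool = True                       # no IO, transitively

def analyze(name, env, deps_of):
    """Build metadata once at registration. `deps_of` stands in for the
    Phase 1 transitive-closure helper."""
    deps = deps_of(name, env)
    io = set()
    for comp in {name} | deps:
        io |= all_symbols(env[comp]) & IO_PRIMITIVES
    return ComponentMeta(deps=deps, io_refs=io, is_pure=not io)

def should_expand_server_side(meta, force_server=False):
    """Step 3's decision: IO-dependent (or :server true) expands on the
    server; pure components serialize for the client."""
    return force_server or not meta.is_pure
```
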
Files:

  • shared/sx/deps.py — add IO analysis
  • shared/sx/types.py — add metadata fields to Component
  • shared/sx/async_eval.py — refine _aser component expansion logic
  • shared/sx/jinja_bridge.py — compute IO metadata at registration
  • shared/sx/pages.py — data manifest on PageDef

Verification:

  • Components calling (query ...) classified IO-dependent; pure components classified pure
  • Existing pages produce identical output (regression)

Phase 3: Client-Side Routing (SPA Mode)

What it enables: After initial page load, client resolves routes locally using cached components + data. Only hits server for fresh data or unknown routes. Like Next.js client-side navigation.

Current mechanism: All routing is server-side via defpage → Quart routes. Client navigates via sx-boost links doing sx-get + morphing. Every navigation = server roundtrip.

Approach:

  1. Client-side page registry — serialize defpage routing info to client as <script type="text/sx-pages">:

    {"docs-page": {"path": "/docs/:slug", "auth": "public",
                    "content": "(case slug ...)", "data": null}}
    

    Pure pages (no :data) can be evaluated entirely client-side.

  2. Client route matcher — new spec file shared/sx/ref/router.sx:

    • Convert /docs/<slug> patterns to matchers
    • On boost-link click: match URL → if found and pure, evaluate locally
    • If IO needed: fetch data from server, evaluate content locally
    • No match: fall through to standard fetch (existing behavior)
  3. Data endpoint — GET /internal/page-data/<page-name>?<params> returns JSON with evaluated :data expression. Reuses execute_page() logic but stops after :data step.

  4. Layout caching — layouts depend on auth/fragments, so cache current layout and reuse across navigations. SX-Layout-Hash header tracks staleness.

  5. Integration with orchestration.sx — intercept bind-boost-link to try client-side resolution first.

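The route-matching step can be sketched in Python (standing in for the eventual router.sx spec). It assumes the `<name>` parameter syntax from the Quart-style paths above and ignores typed converters like `<int:id>`:

```python
import re

def compile_route(pattern):
    """Turn a defpage path like /docs/<slug> into an anchored regex with
    named groups; each parameter matches one path segment."""
    regex = re.sub(r"<(\w+)>", r"(?P<\1>[^/]+)", pattern)
    return re.compile(f"^{regex}$")

def match_route(registry, url):
    """First matching (page-name, params) pair from the client-side page
    registry, else None (fall through to a standard server fetch)."""
    for name, info in registry.items():
        m = compile_route(info["path"]).match(url)
        if m:
            return name, m.groupdict()
    return None
```

In the real router the registry would come from the `text/sx-pages` script block, and a match on a pure page would trigger local evaluation instead of a fetch.
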
Files:

  • shared/sx/pages.py — serialize_for_client(), data-only execution path
  • shared/sx/helpers.py — include page registry in sx_page()
  • New: shared/sx/ref/router.sx — client-side route matching
  • shared/sx/ref/boot.sx — process <script type="text/sx-pages">
  • shared/sx/ref/orchestration.sx — client-side route intercept
  • Service blueprints — /internal/page-data/ endpoint

Depends on: Phase 1 (client knows which components each page needs), Phase 2 (which pages are pure vs IO)

Verification:

  • Pure page navigation: zero server requests
  • IO page navigation: exactly one data request (not full page fetch)
  • Browser back/forward works with client-resolved routes
  • Disabling client registry → identical behavior to current

Phase 4: Client Async & IO Bridge

What it enables: Client evaluates IO primitives by mapping them to server REST calls. Same SX code, different transport. (query "market" "products" :ids "1,2,3") on server → DB; on client → fetch("/internal/data/products?ids=1,2,3").

Approach:

  1. Async client evaluator — two possible mechanisms:

    • Promise-based: evalExpr returns value or Promise; rendering awaits
    • Continuation-based: use existing shift/reset to suspend on IO, resume when data arrives (architecturally cleaner, leverages existing spec)
  2. IO primitive bridge — register async IO primitives in client PRIMITIVES:

    • query → fetch to /internal/data/
    • service → fetch to target service internal endpoint
    • frag → fetch fragment HTML
    • current-user → cached from initial page load
  3. Client data cache — keyed by (service, query, params-hash), configurable TTL, server can invalidate via SX-Invalidate header.

  4. Optimistic updates — extend existing apply-optimistic/revert-optimistic in engine.sx from DOM-level to data-level.

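The client data cache in step 3 can be sketched as below, again in Python as a stand-in for the eventual io-bridge.sx spec. The class and method names are illustrative:

```python
import hashlib
import json
import time

class DataCache:
    """Cache keyed by (query, params-hash) with TTL expiry. Uses a
    monotonic clock so wall-clock adjustments cannot revive stale data."""

    def __init__(self, ttl=30.0):
        self.ttl = ttl
        self._store = {}

    def _key(self, query, params):
        blob = json.dumps(params, sort_keys=True)
        return (query, hashlib.sha256(blob.encode()).hexdigest())

    def get(self, query, params):
        entry = self._store.get(self._key(query, params))
        if entry is not None and time.monotonic() - entry[0] < self.ttl:
            return entry[1]
        return None

    def put(self, query, params, value):
        self._store[self._key(query, params)] = (time.monotonic(), value)

    def invalidate(self, query=None):
        """Server-driven invalidation (e.g. on an SX-Invalidate header):
        drop everything, or only entries for one query name."""
        if query is None:
            self._store.clear()
        else:
            self._store = {k: v for k, v in self._store.items()
                           if k[0] != query}
```
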
Files:

  • shared/sx/ref/eval.sx — async dispatch path (or new async-eval.sx)
  • New: shared/sx/ref/io-bridge.sx — client IO implementations
  • shared/sx/ref/boot.sx — register IO bridge at init
  • shared/sx/ref/bootstrap_js.py — emit async-aware code
  • /internal/data/ endpoints — ensure client-accessible (CORS, auth)

Depends on: Phase 2 (IO affinity), Phase 3 (routing for when to trigger IO)

Verification:

  • Client (query ...) returns identical data to server-side
  • Data cache prevents redundant fetches
  • Same component source → identical output on either side

Phase 5: Streaming & Suspense

What it enables: Server streams partially-evaluated SX as IO resolves. Client renders available subtrees immediately, fills in suspended parts. Like React Suspense but built on delimited continuations.

Approach:

  1. Continuation-based suspension — when _aser encounters IO during slot evaluation, emit a placeholder with a suspension ID, schedule async resolution:

    yield SxExpr(f'(~suspense :id "{placeholder_id}" :fallback (div "Loading..."))')
    schedule_fill(placeholder_id, io_coroutine)
    
  2. Chunked transfer — Quart async generator responses:

    • First chunk: HTML shell + synchronous content + placeholders
    • Subsequent chunks: <script> tags replacing placeholders with resolved content
  3. Client suspension rendering — ~suspense component renders fallback, listens for resolution via inline script or SSE (existing SSE infrastructure in orchestration.sx).

  4. Priority-based IO — above-fold content resolves first. All IO starts concurrently (asyncio.create_task), results flushed in priority order.

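The chunked-transfer shape can be sketched as an async generator of the kind Quart accepts as a response body. `sxResolve` is a hypothetical client hook for step 3, not an existing API; the flush order here simply follows the (priority-sorted) order of `pending`:

```python
import asyncio

async def stream_page(shell_html, pending):
    """Yield the shell immediately, then one replacement <script> per
    resolved placeholder. `pending` maps placeholder ids to coroutines;
    all IO starts concurrently via create_task."""
    tasks = [(pid, asyncio.create_task(coro)) for pid, coro in pending.items()]
    yield shell_html  # first chunk: shell + synchronous content + placeholders
    for pid, task in tasks:
        resolved = await task
        yield f"<script>sxResolve({pid!r}, {resolved!r})</script>"
```
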
Files:

  • shared/sx/async_eval.py — streaming _aser variant
  • shared/sx/helpers.py — chunked response builder
  • New: shared/sx/ref/suspense.sx — client suspension rendering
  • shared/sx/ref/boot.sx — handle resolution scripts

Depends on: Phase 4 (client async for filling suspended subtrees), Phase 2 (IO analysis for priority)


Phase 6: Full Isomorphism

What it enables: Same SX code runs on either side. Runtime chooses optimal split. Offline-first with cached data + client eval.

Approach:

  1. Runtime boundary optimizer — given component tree + IO dependency graph, decide per-component: server-expand, client-render, or stream. Planning step cached at registration, recomputed on component change.

  2. Affinity annotations — optional developer hints:

    (defcomp ~product-grid (&key products)
      :affinity :client  ;; interactive, prefer client
      ...)
    (defcomp ~auth-menu (&key user)
      :affinity :server  ;; auth-sensitive, always server
      ...)
    

    Default: auto (runtime decides from IO analysis).

  3. Offline data layer — Service Worker intercepts /internal/data/ requests, serves from IndexedDB when offline, syncs when back online.

  4. Isomorphic testing — evaluate same component on Python and JS, compare output. Extends existing test_sx_ref.py cross-evaluator comparison.

  5. Universal page descriptor — defpage is portable: server executes via execute_page(), client executes via route match → fetch data → eval content → render DOM. Same descriptor, different execution environment.

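The core of the boundary optimizer reduces to a per-component decision combining the Phase 2 purity analysis with the optional :affinity hint. The three-way outcome matches the model above; the tie-breaking policy in this sketch is an assumption:

```python
def choose_boundary(is_pure, affinity="auto"):
    """Decide placement for one component: 'server', 'client', or
    'stream'. Explicit hints win; otherwise purity decides."""
    if affinity in ("client", "server"):
        return affinity        # developer hint overrides analysis
    if is_pure:
        return "client"        # pure: renders anywhere, prefer client
    return "stream"            # IO-dependent: resolve and stream from server
```
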
Depends on: All previous phases.


Cross-Cutting Concerns

Error Reporting (all phases)

  • Phase 1: "Unknown component" includes which page expected it and what bundle was sent
  • Phase 2: Server logs which components expanded server-side vs sent to client
  • Phase 3: Client route failures include unmatched path and available routes
  • Phase 4: Client IO errors include query name, params, server response
  • Source location tracking in parser → propagate through eval → include in error messages

Backward Compatibility (all phases)

  • Pages without annotations behave as today
  • SX-Request / SX-Components / SX-Css header protocol continues
  • Existing .sx files require no changes
  • _expand_components continues as override
  • Each phase is opt-in: disable → identical to previous behavior

Spec Integrity

All new behavior specified in .sx files under shared/sx/ref/ before implementation. Bootstrappers transpile from spec. This ensures JS and Python stay in sync.

Critical Files

| File | Role | Phases |
| --- | --- | --- |
| shared/sx/async_eval.py | Core evaluator, _aser, server/client boundary | 2, 5 |
| shared/sx/helpers.py | sx_page(), sx_response(), output pipeline | 1, 3 |
| shared/sx/jinja_bridge.py | _COMPONENT_ENV, component registry | 1, 2 |
| shared/sx/pages.py | defpage, execute_page(), page lifecycle | 2, 3 |
| shared/sx/ref/boot.sx | Client boot, component caching | 1, 3, 4 |
| shared/sx/ref/orchestration.sx | Client fetch/swap/morph | 3, 4 |
| shared/sx/ref/eval.sx | Evaluator spec | 4 |
| shared/sx/ref/engine.sx | Morph, swaps, triggers | 3 |
| New: shared/sx/deps.py | Dependency analysis | 1, 2 |
| New: shared/sx/ref/router.sx | Client-side routing | 3 |
| New: shared/sx/ref/io-bridge.sx | Client IO primitives | 4 |
| New: shared/sx/ref/suspense.sx | Streaming/suspension | 5 |