Add macros and declarative handlers (defhandler); convert all fragment routes to sx

Phase 1 — Macros: defmacro + quasiquote syntax (`, ,, ,@) in parser,
evaluator, HTML renderer, and JS mirror. Macro type, expansion, and
round-trip serialization.
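To illustrate the semantics Phase 1 targets, here is a minimal, hypothetical sketch (not the project's evaluator) of how a quasiquote template with unquote and splice-unquote might expand; the `Symbol` class and the env-lookup evaluation are stand-ins:

```python
# Hypothetical sketch of quasiquote expansion semantics; Symbol and the
# env-dict "evaluation" are stand-ins, not the project's actual types.
class Symbol:
    def __init__(self, name): self.name = name
    def __eq__(self, other): return isinstance(other, Symbol) and self.name == other.name
    def __repr__(self): return self.name

def expand_quasiquote(expr, env):
    """Walk a quasiquoted template, substituting unquote/splice-unquote forms."""
    if not isinstance(expr, list):
        return expr  # atoms pass through unchanged
    if expr and expr[0] == Symbol("unquote"):
        return env[expr[1].name]  # ,x -> evaluated value (env lookup here)
    out = []
    for item in expr:
        if isinstance(item, list) and item and item[0] == Symbol("splice-unquote"):
            out.extend(env[item[1].name])  # ,@xs -> splice list elements inline
        else:
            out.append(expand_quasiquote(item, env))
    return out

env = {"x": 42, "xs": [1, 2, 3]}
template = [Symbol("list"), [Symbol("unquote"), Symbol("x")],
            [Symbol("splice-unquote"), Symbol("xs")]]
print(expand_quasiquote(template, env))  # [list, 42, 1, 2, 3]
```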

Phase 2 — Expanded primitives: app-url, url-for, asset-url, config,
format-date, parse-int (pure); service, request-arg, request-path,
nav-tree, get-children (I/O); jinja-global, relations-from (pure).
Updated _io_service to accept (service "registry-name" "method" :kwargs)
with auto kebab→snake conversion. DTO-to-dict now expands datetime fields
into year/month/day convenience keys. Tuple returns converted to lists.
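The kwarg normalization and datetime expansion described above might look roughly like this sketch (function names and the `:`-prefixed keyword representation are assumptions, and the real code likely uses snake_case result keys as stated):

```python
# Hedged sketch of the two conversions mentioned above; names and the
# ':'-string keyword representation are assumptions, not the real API.
from datetime import datetime

def normalize_kwargs(kwargs):
    """':user-id' -> 'user_id': strip the leading ':' and kebab -> snake."""
    return {k.lstrip(":").replace("-", "_"): v for k, v in kwargs.items()}

def expand_datetimes(record):
    """Copy a DTO dict, adding year/month/day convenience keys per datetime field."""
    out = dict(record)
    for key, value in record.items():
        if isinstance(value, datetime):
            out[f"{key}_year"] = value.year
            out[f"{key}_month"] = value.month
            out[f"{key}_day"] = value.day
    return out

print(normalize_kwargs({":user-id": 7, ":sort-by": "date"}))
# {'user_id': 7, 'sort_by': 'date'}
rec = expand_datetimes({"created": datetime(2026, 3, 3)})
print(rec["created_year"], rec["created_month"], rec["created_day"])  # 2026 3 3
```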

Phase 3 — Declarative handlers: HandlerDef type, defhandler special form,
handler registry (service → name → HandlerDef), async evaluator+renderer
(async_eval.py) that awaits I/O primitives inline within control flow.
Handler loading from .sx files, execute_handler, blueprint factory.
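The core idea of async_eval.py, an async tree-walker that awaits I/O primitives inline while pure forms evaluate synchronously, can be sketched as follows (symbols modeled as plain strings, and `fake_service` standing in for the real registry; all names here are illustrative):

```python
# Hedged sketch of an async evaluator that awaits I/O primitives inline.
# Symbols are modeled as plain strings; fake_service is a stand-in.
import asyncio

ASYNC_PRIMS = {"service", "request-arg", "request-path", "nav-tree", "get-children"}

async def async_eval(expr, env):
    if isinstance(expr, list) and expr:
        head = expr[0]
        args = [await async_eval(a, env) for a in expr[1:]]  # evaluate args first
        fn = env[head]
        if head in ASYNC_PRIMS:
            return await fn(*args)  # I/O primitive: awaited inline
        return fn(*args)            # pure primitive: plain call
    # atoms: look up bound names, pass literals through
    return env.get(expr, expr) if isinstance(expr, str) else expr

async def fake_service(name, method):
    await asyncio.sleep(0)  # pretend to hit the service registry
    return f"{name}.{method}"

env = {"service": fake_service, "upper": str.upper}
result = asyncio.run(async_eval(["upper", ["service", "market", "list"]], env))
print(result)  # MARKET.LIST
```

The benefit over a sync evaluator plus callbacks is that I/O calls stay in source order inside `if`/`let`-style control flow.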

Phase 4 — Convert all fragment routes: 13 Python fragment handlers across
8 services replaced with declarative .sx handler files. All routes.py
simplified to uniform sx dispatch pattern. Two Jinja HTML handlers
(events/container-cards, events/account-page) kept as Python.
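The "uniform sx dispatch pattern" a simplified routes.py reduces to might look like this sketch, assuming an `execute_handler(service, name, request)` entry point like the one Phase 3 describes (the factory and handler names here are hypothetical):

```python
# Hypothetical sketch of uniform sx dispatch in a simplified routes.py;
# execute_handler's signature and the handler name are assumptions.
def make_sx_route(execute_handler, service):
    def dispatch(handler_name, **request):
        # every fragment route collapses to the same one-liner
        return execute_handler(service, handler_name, request)
    return dispatch

# stub execute_handler for illustration only
def execute_handler(service, name, request):
    return f"<div>{service}/{name}:{request.get('id')}</div>"

route = make_sx_route(execute_handler, "market")
print(route("product-card", id=5))  # <div>market/product-card:5</div>
```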

New files: shared/sx/async_eval.py, shared/sx/handlers.py,
shared/sx/tests/test_handlers.py, plus 13 handler .sx files under
{service}/sx/handlers/. MarketService.product_by_slug() added.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 00:22:18 +00:00
parent 13bcf755f6
commit ab75e505a8
48 changed files with 2538 additions and 638 deletions


@@ -225,6 +225,20 @@ def _parse_expr(tok: Tokenizer) -> Any:
if raw == "{":
tok.next_token() # consume the '{'
return _parse_map(tok)
# Quasiquote syntax: ` , ,@
if raw == "`":
tok._advance(1) # consume the backtick
inner = _parse_expr(tok)
return [Symbol("quasiquote"), inner]
if raw == ",":
tok._advance(1) # consume the comma
# Check for splice-unquote (,@) — no whitespace between , and @
if tok.pos < len(tok.text) and tok.text[tok.pos] == "@":
tok._advance(1) # consume the @
inner = _parse_expr(tok)
return [Symbol("splice-unquote"), inner]
inner = _parse_expr(tok)
return [Symbol("unquote"), inner]
# Everything else: strings, keywords, symbols, numbers
token = tok.next_token()
return token
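A standalone mini-reader (not the project's `Tokenizer`) shows the shape the new branches produce for `` `x ``, `,x`, and `,@x` — note `,@` must be matched before plain `,`, mirroring the no-whitespace check above:

```python
# Standalone mini-reader illustrating the parse shapes of ` , ,@ ;
# this is a toy, not the project's Tokenizer.
import re

class Symbol:
    def __init__(self, name): self.name = name
    def __eq__(self, o): return isinstance(o, Symbol) and self.name == o.name
    def __repr__(self): return self.name

def tokenize(src):
    # ',@' must be tried before plain ',' so splice-unquote wins
    return re.findall(r",@|[`,()]|[^\s`,()]+", src)

def read(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        out = []
        while tokens[0] != ")":
            out.append(read(tokens))
        tokens.pop(0)  # consume ')'
        return out
    if tok == "`":
        return [Symbol("quasiquote"), read(tokens)]
    if tok == ",@":
        return [Symbol("splice-unquote"), read(tokens)]
    if tok == ",":
        return [Symbol("unquote"), read(tokens)]
    return Symbol(tok)

print(read(tokenize("`(a ,b ,@c)")))
# [quasiquote, [a, [unquote, b], [splice-unquote, c]]]
```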
@@ -276,6 +290,15 @@ def serialize(expr: Any, indent: int = 0, pretty: bool = False) -> str:
if isinstance(expr, list):
if not expr:
return "()"
# Quasiquote sugar: [Symbol("quasiquote"), x] → `x
if (len(expr) == 2 and isinstance(expr[0], Symbol)):
name = expr[0].name
if name == "quasiquote":
return "`" + serialize(expr[1], indent, pretty)
if name == "unquote":
return "," + serialize(expr[1], indent, pretty)
if name == "splice-unquote":
return ",@" + serialize(expr[1], indent, pretty)
if pretty:
return _serialize_pretty(expr, indent)
items = [serialize(item, indent, False) for item in expr]
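A minimal serializer mirroring the sugar branches above demonstrates the round-trip property the commit message mentions, where the sugared prefixes survive serialization (simplified stand-in, no indent/pretty handling):

```python
# Minimal stand-in serializer mirroring the quasiquote sugar branches;
# indent/pretty handling from the real serialize() is omitted.
class Symbol:
    def __init__(self, name): self.name = name

SUGAR = {"quasiquote": "`", "unquote": ",", "splice-unquote": ",@"}

def serialize(expr):
    if isinstance(expr, list):
        if len(expr) == 2 and isinstance(expr[0], Symbol) and expr[0].name in SUGAR:
            return SUGAR[expr[0].name] + serialize(expr[1])  # emit sugar, not the list form
        return "(" + " ".join(serialize(e) for e in expr) + ")"
    return expr.name if isinstance(expr, Symbol) else str(expr)

tree = [Symbol("quasiquote"),
        [Symbol("a"), [Symbol("unquote"), Symbol("b")],
         [Symbol("splice-unquote"), Symbol("c")]]]
print(serialize(tree))  # `(a ,b ,@c)
```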