5 Commits

58f019bc14 JIT: lib/jit.sx — SX-level convenience layer
Some checks failed
Test, Build, and Deploy / test-build-deploy (push) Failing after 41s
Three scoping macros plus helper functions, all portable across hosts:

  with-jit-threshold N body...  — temporarily set threshold, restore on exit
  with-jit-budget    N body...  — temporarily set LRU budget
  with-fresh-jit       body...  — clear cache before & after body

  jit-report                    — human-readable stats string for logging
  jit-disable!  / jit-enable!   — convenience around (jit-set-budget! 0),
                                  saving and restoring the previous budget

The host (OCaml here, will be JS/Python eventually) only needs to provide
the underlying primitives (jit-stats, jit-set-threshold!, jit-set-budget!,
jit-reset-cache!, jit-reset-counters!). The ergonomics live in shared SX.

Used together with Phase 1 (tiered compilation) and Phase 2 (LRU eviction)
to give application developers fine-grained control over the JIT cache:
isolated test runs use with-fresh-jit, hot benchmark sections use
with-jit-threshold 1, memory-constrained pages use jit-set-budget! to
cap the cache.
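The save/set/restore discipline the with-jit-* macros expand to can be sketched in plain JS (illustrative only — `getThreshold`/`setThreshold` are hypothetical stand-ins for reading `(get (jit-stats) "threshold")` and calling `jit-set-threshold!`):

```javascript
// Hypothetical model of with-jit-threshold: read old value, apply the
// temporary one, run the body, restore on the way out.
let jitThreshold = 100; // assumed default for the sketch
const getThreshold = () => jitThreshold;
const setThreshold = (n) => { jitThreshold = n; };

function withJitThreshold(n, body) {
  const old = getThreshold();   // snapshot current setting
  setThreshold(n);              // apply temporary setting
  try {
    return body();              // run wrapped section
  } finally {
    setThreshold(old);          // restore, even if body throws
  }
}

const seen = withJitThreshold(1, () => getThreshold());
// seen is 1 inside the body; jitThreshold is back to 100 afterwards
```

Note the JS sketch restores via try/finally; the SX macro as committed restores only on normal exit from the body.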

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-11 22:26:45 +00:00
1f466186f9 JIT: Phase 2 (LRU eviction) + Phase 3 (manual reset)
Some checks failed
Test, Build, and Deploy / test-build-deploy (push) Failing after 46s
sx_types.ml:
  - Add l_uid field on lambda (unique identity for cache tracking)
  - Add lambda_uid_counter + next_lambda_uid (); a fresh UID is minted on construction
  - Add jit_budget (default 5000) and jit_evicted_count counter
  - Add jit_cache_queue : (int * value) Queue.t — FIFO of compiled lambdas
  - jit_cache_size () helper for stats

sx_vm.ml:
  - On successful JIT compile, push (uid, Lambda l) onto jit_cache_queue
  - While queue length exceeds jit_budget, pop head (oldest entry) and
    clear that lambda's l_compiled slot — evicted entries fall through
    to cek_call_or_suspend on next call (correct, just slower)
  - Guard JIT trigger by !jit_budget > 0 (budget=0 disables JIT entirely)
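The eviction loop above can be sketched in JS (an illustrative model, not the OCaml source — field names `uid`/`compiled` are stand-ins for `l_uid`/`l_compiled`):

```javascript
// Phase 2 model: compiled lambdas enter a FIFO queue; when the queue
// exceeds the budget, the oldest entry's compiled slot is cleared so it
// falls back to the interpreter until it re-crosses the threshold.
const jitCacheQueue = [];   // entries: { uid, lambda }
let jitBudget = 3;          // deliberately tiny for demonstration
let evictedCount = 0;

function onJitCompile(lambda) {
  lambda.compiled = { code: '<vm closure>' };       // fill the cache slot
  jitCacheQueue.push({ uid: lambda.uid, lambda });  // record insertion order
  while (jitCacheQueue.length > jitBudget) {
    const { lambda: oldest } = jitCacheQueue.shift(); // pop FIFO head
    oldest.compiled = null;                           // evict: clear slot
    evictedCount++;
  }
}

const ls = [1, 2, 3, 4].map(uid => ({ uid, compiled: null }));
ls.forEach(onJitCompile);
// ls[0] is evicted (compiled back to null); ls[1..3] stay cached; one eviction counted
```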

sx_primitives.ml:
  Phase 2:
    - jit-set-budget! N — change cache budget at runtime
    - jit-stats includes budget, cache-size, evicted
  Phase 3:
    - jit-reset-cache! — clear all compiled VmClosures (hot paths re-JIT
      on next threshold crossing)
    - jit-reset-counters! also resets evicted counter

run_tests.ml:
  - Update test-fixture lambda construction to include l_uid

Effect: cache size bounded regardless of input pattern. The HS test harness
compiles ~3000 distinct one-shot lambdas, but tiered compilation (Phase 1)
keeps most below threshold so they never enter the cache. Steady-state count
stays in single digits for typical workloads. When a misbehaving caller
saturates the cache (eval-hs in a tight loop, REPL-style host), LRU
eviction caps memory at jit_budget compiled closures × ~1KB each.

Verification: 4771 passed, 1111 failed in run_tests — identical to
pre-Phase-2 baseline. No regressions.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-11 22:22:37 +00:00
29ef89d473 HS: native unwrap sweep — make all 21 host-* natives ABI-compatible
Some checks failed
Test, Build, and Deploy / test-build-deploy (push) Failing after 19s
Following the host-call/host-new precedent, audit the remaining natives
that pass user-supplied values into native JS, and unwrap value handles
({_type, __sx_handle}) at the boundary. Patterns:

  host-global         arg[0]  → string name for globalThis lookup
  host-get            arg[1]  → property key
  host-set!           arg[1]  → property key
                      arg[2]  → value being stored
  host-call           arg[1]  → method name (was missing in initial fix)
                      args... → method arguments
  host-call-fn        argList items → function call arguments
                                      (was sxToJs; now also unwraps atoms)
  host-new            arg[0]  → constructor name
                      args... → constructor arguments
  host-make-js-thrower arg[0] → value to throw (must be primitive in JS)
  host-typeof         arg[0]  → recognize wrapped handles and report their
                                underlying type instead of "object"
  host-iter?          arg[0]  → object to test for [Symbol.iterator]
  host-to-list        arg[0]  → object to spread
  host-new-function   args    → param-name strings and body string

All unwrap sites are forward-compatible: _unwrapHandle is a no-op on plain values
returned by the legacy kernel. The shim activates only when the runtime
encounters real wrapped handles from the new kernel.
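The no-op property can be modelled in isolation (a sketch, not the runner's code — `stringify` stands in for K.stringify, which the real shim delegates to):

```javascript
// Handle check fires only on objects shaped like {_type, __sx_handle};
// anything else (legacy plain values) passes through untouched.
const stringify = (h) => String(h.__sx_handle); // stand-in for K.stringify

function unwrapHandle(v) {
  if (v && typeof v === 'object' && typeof v._type === 'string') {
    switch (v._type) {
      case 'number':  return Number(stringify(v));
      case 'string':  return stringify(v);
      case 'boolean': return stringify(v) === 'true';
      case 'nil':     return null;
      default:        return v; // lists/dicts/symbols stay as handles
    }
  }
  return v; // plain legacy values: no-op
}

const a = unwrapHandle(42);                                  // 42, untouched
const b = unwrapHandle({ _type: 'number', __sx_handle: 7 }); // unwrapped to 7
```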

Verification — 100 tests pass on the new WASM after the sweep (test 27
'can append a value to a set', previously broken by Set value-handle
aliasing, now passes).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-11 21:46:14 +00:00
f12c19eaa3 HS: test runner — unwrap value handles before native interop
Some checks failed
Test, Build, and Deploy / test-build-deploy (push) Failing after 22s
The new kernel ABI wraps atoms (number, string, boolean, nil) in opaque
handles {_type, __sx_handle}. When such handles flow through host-call
into native JS functions, value equality breaks: each integer literal
becomes a unique handle object, so JS Set.add(handle_for_1) does NOT
dedup against a prior set.add(handle_for_1). Same problem for any JS
API that uses identity or value equality on incoming arguments.
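The equality break reproduces with a two-line model of the handle shape (the `mkHandle` factory is a stand-in for the kernel's boxing, not real API):

```javascript
// Each literal arriving from the new kernel is a freshly allocated handle
// object; Set's SameValueZero comparison sees distinct identities, so
// deduplication stops working.
const mkHandle = (n) => ({ _type: 'number', __sx_handle: n }); // fresh box per literal

const plain = new Set();
plain.add(1);
plain.add(1);
// plain.size === 1 — primitives dedup by value

const boxed = new Set();
boxed.add(mkHandle(1));
boxed.add(mkHandle(1));
// boxed.size === 2 — two handle objects for the same underlying 1
```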

Fix: add _unwrapHandle that converts handles back to JS primitives via
K.stringify, and apply it to argument lists in host-call and host-new
(the two natives that pass user values into native JS constructors /
methods). Forward-compatible: no-op when called with already-unwrapped
plain values from the legacy kernel.

Root-cause analysis traced through:
  1. Test 27 'can append a value to a set' failed (Expected 3, got 4)
     on the new WASM only. Set was admitting duplicates.
  2. dbg-set.js minimal repro confirmed each `1` literal arriving at
     set.add as a different {_type, __sx_handle} object.
  3. JS Set.add uses SameValueZero — handle objects with the same
     underlying value are still distinct identity.
  4. Unwrapping in host-call/host-new resolves the equality issue.

This is preparation for the JIT Phase 1 WASM rollout (which still
needs more native-interop unwrap audits before it can replace the
pre-merge WASM that the test tree currently pins).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-11 21:04:30 +00:00
6e997e9382 HS: test runner — auto-unwrap shim for new WASM kernel ABI
Some checks failed
Test, Build, and Deploy / test-build-deploy (push) Failing after 26s
Post-JIT-Phase-1 OCaml kernels return atomic values (number, string,
boolean, nil) as opaque handles {_type, __sx_handle} instead of plain
JS values. The 23 K.eval call sites in hs-run-filtered.js were written
against the pre-rewrite ABI and expect plain values.

Add a wrapper at boot that auto-unwraps via K.stringify when the result
is a handle. No-op on the legacy kernel (handles don't appear, so the
check falls through). Forward-compatible: when the new WASM is the
default, the shim transparently restores test compatibility.

Note: This unblocks future browser-WASM rollout of JIT Phase 1. A
separate issue (Set-append size regression — Expected 3, got 4 on
test 27) in newer architecture-branch kernel changes still blocks the
WASM rollout; the test tree continues to pin the pre-merge WASM until
that regression is identified and fixed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-11 20:30:32 +00:00
6 changed files with 215 additions and 22 deletions


@@ -1279,7 +1279,7 @@ let run_foundation_tests () =
assert_true "sx_truthy \"\"" (Bool (sx_truthy (String "")));
assert_eq "not truthy nil" (Bool false) (Bool (sx_truthy Nil));
assert_eq "not truthy false" (Bool false) (Bool (sx_truthy (Bool false)));
- let l = { l_params = ["x"]; l_body = Symbol "x"; l_closure = Sx_types.make_env (); l_name = None; l_compiled = None; l_call_count = 0 } in
+ let l = { l_params = ["x"]; l_body = Symbol "x"; l_closure = Sx_types.make_env (); l_name = None; l_compiled = None; l_call_count = 0; l_uid = Sx_types.next_lambda_uid () } in
assert_true "is_lambda" (Bool (Sx_types.is_lambda (Lambda l)));
ignore (Sx_types.set_lambda_name (Lambda l) "my-fn");
assert_eq "lambda name mutated" (String "my-fn") (lambda_name (Lambda l))


@@ -3146,17 +3146,34 @@ let () =
register "jit-stats" (fun _args ->
let d = Hashtbl.create 8 in
Hashtbl.replace d "threshold" (Number (float_of_int !Sx_types.jit_threshold));
Hashtbl.replace d "budget" (Number (float_of_int !Sx_types.jit_budget));
Hashtbl.replace d "cache-size" (Number (float_of_int (Sx_types.jit_cache_size ())));
Hashtbl.replace d "compiled" (Number (float_of_int !Sx_types.jit_compiled_count));
Hashtbl.replace d "compile-failed" (Number (float_of_int !Sx_types.jit_skipped_count));
Hashtbl.replace d "below-threshold" (Number (float_of_int !Sx_types.jit_threshold_skipped_count));
Hashtbl.replace d "evicted" (Number (float_of_int !Sx_types.jit_evicted_count));
Dict d);
register "jit-set-threshold!" (fun args ->
match args with
| [Number n] -> Sx_types.jit_threshold := int_of_float n; Nil
| [Integer n] -> Sx_types.jit_threshold := n; Nil
| _ -> raise (Eval_error "jit-set-threshold!: (n) where n is integer"));
register "jit-set-budget!" (fun args ->
match args with
| [Number n] -> Sx_types.jit_budget := int_of_float n; Nil
| [Integer n] -> Sx_types.jit_budget := n; Nil
| _ -> raise (Eval_error "jit-set-budget!: (n) where n is integer"));
register "jit-reset-cache!" (fun _args ->
(* Phase 3 manual cache reset — clear all compiled VmClosures.
Hot paths will re-JIT on next call (after re-hitting threshold). *)
Queue.iter (fun (_, v) ->
match v with Lambda l -> l.l_compiled <- None | _ -> ()
) Sx_types.jit_cache_queue;
Queue.clear Sx_types.jit_cache_queue;
Nil);
register "jit-reset-counters!" (fun _args ->
Sx_types.jit_compiled_count := 0;
Sx_types.jit_skipped_count := 0;
Sx_types.jit_threshold_skipped_count := 0;
Sx_types.jit_evicted_count := 0;
Nil)


@@ -129,6 +129,7 @@ and lambda = {
mutable l_name : string option;
mutable l_compiled : vm_closure option; (** Lazy JIT cache *)
mutable l_call_count : int; (** Tiered-compilation counter — JIT after threshold calls *)
l_uid : int; (** Unique identity for LRU cache tracking *)
}
and component = {
@@ -435,12 +436,16 @@ let unwrap_env_val = function
| Env e -> e
| _ -> raise (Eval_error "make_lambda: expected env for closure")
(* Lambda UID — minted on construction, used as LRU cache key (Phase 2). *)
let lambda_uid_counter = ref 0
let next_lambda_uid () = incr lambda_uid_counter; !lambda_uid_counter
let make_lambda params body closure =
let ps = match params with
| List items -> List.map value_to_string items
| _ -> value_to_string_list params
in
- Lambda { l_params = ps; l_body = body; l_closure = unwrap_env_val closure; l_name = None; l_compiled = None; l_call_count = 0 }
+ Lambda { l_params = ps; l_body = body; l_closure = unwrap_env_val closure; l_name = None; l_compiled = None; l_call_count = 0; l_uid = next_lambda_uid () }
(** {1 JIT cache control}
@@ -455,6 +460,37 @@ let jit_compiled_count = ref 0
let jit_skipped_count = ref 0
let jit_threshold_skipped_count = ref 0
(** {2 JIT cache LRU eviction — Phase 2}
Once a lambda crosses the threshold, its [l_compiled] slot is filled.
To bound memory under unbounded compilation pressure, track all live
compiled lambdas in FIFO order, and evict from the head when the count
exceeds [jit_budget].
[lambda_uid_counter] mints unique identities on lambda creation; the
LRU queue holds these IDs paired with a back-reference to the lambda
so we can clear its [l_compiled] slot on eviction.
Budget of 0 = no cache (disable JIT entirely).
Budget of [max_int] = unbounded (legacy behaviour). Default 5000 is
a generous ceiling for any realistic page; the test harness compiles
~3000 distinct one-shot lambdas in a full run but tiered compilation
(Phase 1) means most never enter the cache, so steady-state count
stays small.
[lambda_uid_counter] and [next_lambda_uid] are defined above
[make_lambda] (which uses them on construction). *)
let jit_budget = ref 5000
let jit_evicted_count = ref 0
(** Live compiled lambdas in FIFO order — front is oldest, back is newest.
Each entry is (uid, lambda); on eviction we clear lambda.l_compiled and
drop from the queue. Using a mutable Queue rather than a hand-rolled
linked list because eviction is amortised O(1) at the head and inserts
are O(1) at the tail. *)
let jit_cache_queue : (int * value) Queue.t = Queue.create ()
let jit_cache_size () = Queue.length jit_cache_queue
let make_component name params has_children body closure affinity =
let n = value_to_string name in
let ps = value_to_string_list params in


@@ -357,12 +357,20 @@ and vm_call vm f args =
if l.l_name <> None
then begin
l.l_call_count <- l.l_call_count + 1;
- if l.l_call_count >= !Sx_types.jit_threshold then begin
+ if l.l_call_count >= !Sx_types.jit_threshold && !Sx_types.jit_budget > 0 then begin
l.l_compiled <- Some jit_failed_sentinel;
match !jit_compile_ref l vm.globals with
| Some cl ->
incr Sx_types.jit_compiled_count;
l.l_compiled <- Some cl;
(* Phase 2 LRU: track this compiled lambda; if cache exceeds budget,
evict the oldest by clearing its l_compiled slot. *)
Queue.add (l.l_uid, Lambda l) Sx_types.jit_cache_queue;
while Queue.length Sx_types.jit_cache_queue > !Sx_types.jit_budget do
(match Queue.pop Sx_types.jit_cache_queue with
| (_, Lambda ev_l) -> ev_l.l_compiled <- None; incr Sx_types.jit_evicted_count
| _ -> ())
done;
push_closure_frame vm cl args
| None ->
incr Sx_types.jit_skipped_count;

lib/jit.sx — new file (+89 lines)

@@ -0,0 +1,89 @@
;; lib/jit.sx — SX-level convenience wrappers over the JIT cache control
;; primitives (jit-stats, jit-set-threshold!, jit-set-budget!, jit-reset-cache!,
;; jit-reset-counters!). Host-specific implementations live in
;; hosts/<host>/lib/sx_*.ml; the API surface is portable across hosts.
;; with-jit-threshold — temporarily set the JIT call-count threshold for
;; the duration of body, restoring the previous value on exit. Useful for
;; sections that want eager compilation (threshold=1) or want to skip JIT
;; entirely (threshold=999999) for diagnostic comparison.
(defmacro
with-jit-threshold
(n &rest body)
`(let
((__old (get (jit-stats) "threshold")))
(jit-set-threshold! ,n)
(let
((__r (do ,@body)))
(jit-set-threshold! __old)
__r)))
;; with-jit-budget — temporarily set the LRU cache budget. Setting to 0
;; disables JIT entirely (everything falls through to the interpreter);
;; large values are effectively unbounded.
(defmacro
with-jit-budget
(n &rest body)
`(let
((__old (get (jit-stats) "budget")))
(jit-set-budget! ,n)
(let
((__r (do ,@body)))
(jit-set-budget! __old)
__r)))
;; with-fresh-jit — clear the cache before body, run body, clear again
;; after. Use between sessions / request batches / test suites where you
;; want deterministic timing free of carryover.
(defmacro
with-fresh-jit
(&rest body)
`(let
((__r (do (jit-reset-cache!) ,@body)))
(jit-reset-cache!)
__r))
;; jit-report — human-readable summary of current JIT state. Returns a
;; string suitable for logging.
(define
jit-report
(fn
()
(let
((s (jit-stats)))
(let
((compiled (get s "compiled"))
(skipped (get s "below-threshold"))
(failed (get s "compile-failed"))
(evicted (get s "evicted"))
(cache-size (get s "cache-size"))
(budget (get s "budget"))
(threshold (get s "threshold")))
(let
((total (+ compiled skipped failed)))
(str
"jit: " cache-size "/" budget " cached "
"(thr=" threshold ") · "
compiled " compiled, "
skipped " below-thr, "
failed " failed, "
evicted " evicted "
"(" (if (> total 0) (* 100 (/ compiled total)) 0) "% compile rate)"))))))
;; jit-disable! / jit-enable! — convenience helpers. Disabling sets budget
;; to 0 which causes the VM to skip JIT entirely on the next call. Enable
;; restores the budget to its previous value (or 5000 if no previous).
(define _jit-saved-budget (list 5000))
(define
jit-disable!
(fn
()
(set! _jit-saved-budget (list (get (jit-stats) "budget")))
(jit-set-budget! 0)))
(define
jit-enable!
(fn
()
(jit-set-budget! (first _jit-saved-budget))))


@@ -14,6 +14,48 @@ const SX_DIR = path.join(WASM_DIR, 'sx');
eval(fs.readFileSync(path.join(WASM_DIR, 'sx_browser.bc.js'), 'utf8'));
const K = globalThis.SxKernel;
// Auto-unwrap shim: the post-JIT-Phase-1 kernel returns numbers, strings,
// booleans, and nil as opaque value handles ({_type, __sx_handle}). Tests
// expect plain JS values from K.eval like the pre-rewrite kernel did. Wrap
// once at boot rather than touching all 23 K.eval call sites.
if (K && typeof K.eval === 'function' && K.stringify) {
const _kEval = K.eval.bind(K);
K.eval = function(expr) {
const r = _kEval(expr);
if (r && typeof r === 'object' && typeof r._type === 'string') {
switch (r._type) {
case 'number': { const s = K.stringify(r); const n = Number(s);
return Number.isInteger(n) || /^-?\d+$/.test(s) ? n : (Number.isNaN(n) ? r : n); }
case 'string': return K.stringify(r);
case 'boolean': return K.stringify(r) === 'true';
case 'nil': return null;
default: return r; // list/dict/symbol — leave as handle
}
}
return r;
};
}
// Value-handle unwrap helper for native interop. The new kernel wraps atoms
// (number, string, boolean, nil) in {_type, __sx_handle} handles. JS natives
// receiving these in argument lists would do reference-equality on the handle
// instead of value-equality on the underlying primitive — breaking things
// like JS Set dedup (each literal `1` becomes a new handle). Unwrap before
// handing off to native JS.
function _unwrapHandle(v) {
if (v && typeof v === 'object' && typeof v._type === 'string' && K.stringify) {
switch (v._type) {
case 'number': { const s = K.stringify(v); const n = Number(s);
return Number.isInteger(n) || /^-?\d+$/.test(s) ? n : n; }
case 'string': return K.stringify(v);
case 'boolean': return K.stringify(v) === 'true';
case 'nil': return null;
default: return v;
}
}
return v;
}
// Step limit API — exposed from OCaml kernel
const STEP_LIMIT = parseInt(process.env.HS_STEP_LIMIT || '1000000');
@@ -645,35 +687,36 @@ const _log = _origLog; // keep reference for our own output
// JS-level reference equality for host objects (works around OCaml boxing).
// The SX `=` primitive doesn't do JS === for host objects in the WASM kernel.
K.registerNative('hs-ref-eq',a=>a[0]===a[1]);
- K.registerNative('host-global',a=>{const n=a[0];return(n in globalThis)?globalThis[n]:null;});
+ K.registerNative('host-global',a=>{const n=_unwrapHandle(a[0]);return(n in globalThis)?globalThis[n]:null;});
K.registerNative('host-get',a=>{
if(a[0]==null)return null;
const k=_unwrapHandle(a[1]);
// SX lists (arrive as {_type:'list', items:[...]}) don't expose length/size
// through JS property access. Hand-roll common collection queries so
// compiled HS `x.length` / `x.size` works on scoped lists.
- if(a[0] && a[0]._type==='list' && (a[1]==='length' || a[1]==='size')) return a[0].items.length;
- if(a[0] && a[0]._type==='list' && typeof a[1]==='number') return a[0].items[a[1]]!==undefined?a[0].items[a[1]]:null;
- if(a[0] && a[0]._type==='dict' && a[1]==='size') return Object.keys(a[0]).filter(k=>k!=='_type').length;
+ if(a[0] && a[0]._type==='list' && (k==='length' || k==='size')) return a[0].items.length;
+ if(a[0] && a[0]._type==='list' && typeof k==='number') return a[0].items[k]!==undefined?a[0].items[k]:null;
+ if(a[0] && a[0]._type==='dict' && k==='size') return Object.keys(a[0]).filter(x=>x!=='_type').length;
// innerText is DOM-level alias for textContent (close enough for mock purposes)
- if(a[0] instanceof El && a[1]==='innerText') return String(a[0].textContent||'');
+ if(a[0] instanceof El && k==='innerText') return String(a[0].textContent||'');
// RPC dispatch object: _hsRpcDispatch bypasses Proxy-in-WASM-kernel nil issue
- if(a[0] && typeof a[0]._hsRpcDispatch==='function'){const rv=a[0]._hsRpcDispatch(String(a[1]));return rv===undefined?null:rv;}
- let v=a[0][a[1]];
+ if(a[0] && typeof a[0]._hsRpcDispatch==='function'){const rv=a[0]._hsRpcDispatch(String(k));return rv===undefined?null:rv;}
+ let v=a[0][k];
if(v===undefined)return null;
// Only coerce DOM property strings for actual DOM elements — plain JS objects
// (e.g. promise-state dicts with a "value" key) must not be stringified.
- if(a[0] instanceof El&&(a[1]==='innerHTML'||a[1]==='textContent'||a[1]==='value'||a[1]==='className')&&typeof v!=='string')v=String(v!=null?v:'');
+ if(a[0] instanceof El&&(k==='innerHTML'||k==='textContent'||k==='value'||k==='className')&&typeof v!=='string')v=String(v!=null?v:'');
return v;
});
- K.registerNative('host-set!',a=>{if(a[0]!=null){const v=a[2]; if(a[1]==='innerHTML'&&a[0] instanceof El){const s=v===null?'null':v===undefined?'':String(v);a[0]._setInnerHTML(s);a[0][a[1]]=a[0].innerHTML;} else if(a[1]==='textContent'&&a[0] instanceof El){const s=v===null?'null':v===undefined?'':String(v);a[0].textContent=s;a[0].innerHTML=s;for(const c of a[0].children){c.parentElement=null;c.parentNode=null;}a[0].children=[];a[0].childNodes=[];} else{a[0][a[1]]=v;}} return a[2];});
- K.registerNative('host-call',a=>{if(_testDeadline&&Date.now()>_testDeadline)throw new Error('TIMEOUT: wall clock exceeded');const[o,m,...r]=a;if(o==null){const f=globalThis[m];return typeof f==='function'?f.apply(null,r):null;}if(o&&typeof o[m]==='function'){try{const v=o[m].apply(o,r);return v===undefined?null:v;}catch(e){return null;}}return null;});
- K.registerNative('host-call-fn',a=>{const[fn,argList]=a;if(typeof fn!=='function'&&!(fn&&fn.__sx_handle!==undefined))return null;const callArgs=(argList&&argList._type==='list'&&argList.items)?Array.from(argList.items):(Array.isArray(argList)?argList:[]);if(fn&&fn.__sx_handle!==undefined){try{return K.callFn(fn,callArgs);}catch(e){const msg=e&&e.message||'';if(String(msg).includes('TIMEOUT'))throw e;return null;}}function sxToJs(v){if(v&&v._type==='list'&&v.items)return Array.from(v.items).map(sxToJs);return v;}try{const v=fn.apply(null,callArgs.map(sxToJs));return v===undefined?null:v;}catch(e){return null;}});
- K.registerNative('host-new',a=>{const C=typeof a[0]==='string'?globalThis[a[0]]:a[0];return typeof C==='function'?new C(...a.slice(1)):null;});
+ K.registerNative('host-set!',a=>{if(a[0]!=null){const k=_unwrapHandle(a[1]);const v=_unwrapHandle(a[2]); if(k==='innerHTML'&&a[0] instanceof El){const s=v===null?'null':v===undefined?'':String(v);a[0]._setInnerHTML(s);a[0][k]=a[0].innerHTML;} else if(k==='textContent'&&a[0] instanceof El){const s=v===null?'null':v===undefined?'':String(v);a[0].textContent=s;a[0].innerHTML=s;for(const c of a[0].children){c.parentElement=null;c.parentNode=null;}a[0].children=[];a[0].childNodes=[];} else{a[0][k]=v;}} return a[2];});
+ K.registerNative('host-call',a=>{if(_testDeadline&&Date.now()>_testDeadline)throw new Error('TIMEOUT: wall clock exceeded');const[o,mRaw,...r]=a;const m=_unwrapHandle(mRaw);if(o==null){const f=globalThis[m];return typeof f==='function'?f.apply(null,r.map(_unwrapHandle)):null;}if(o&&typeof o[m]==='function'){try{const v=o[m].apply(o,r.map(_unwrapHandle));return v===undefined?null:v;}catch(e){return null;}}return null;});
+ K.registerNative('host-call-fn',a=>{const[fn,argList]=a;if(typeof fn!=='function'&&!(fn&&fn.__sx_handle!==undefined))return null;const callArgs=(argList&&argList._type==='list'&&argList.items)?Array.from(argList.items):(Array.isArray(argList)?argList:[]);if(fn&&fn.__sx_handle!==undefined){try{return K.callFn(fn,callArgs);}catch(e){const msg=e&&e.message||'';if(String(msg).includes('TIMEOUT'))throw e;return null;}}function sxToJs(v){if(v&&v._type==='list'&&v.items)return Array.from(v.items).map(sxToJs);return _unwrapHandle(v);}try{const v=fn.apply(null,callArgs.map(sxToJs));return v===undefined?null:v;}catch(e){return null;}});
+ K.registerNative('host-new',a=>{const nameOrCtor=_unwrapHandle(a[0]);const C=typeof nameOrCtor==='string'?globalThis[nameOrCtor]:nameOrCtor;return typeof C==='function'?new C(...a.slice(1).map(_unwrapHandle)):null;});
K.registerNative('host-callback',a=>{const fn=a[0];if(typeof fn==='function'&&fn.__sx_handle===undefined)return fn;if(fn&&fn.__sx_handle!==undefined)return function(){const r=K.callFn(fn,Array.from(arguments));if(globalThis._driveAsync)globalThis._driveAsync(r);return r;};return function(){};});
- K.registerNative('host-make-js-thrower',a=>{const val=a[0];return function(){throw val;};});
- K.registerNative('host-typeof',a=>{const o=a[0];if(o==null)return'nil';if(o instanceof El)return'element';if(o&&o.nodeType===3)return'text';if(o instanceof Ev)return'event';if(o instanceof Promise)return'promise';return typeof o;});
- K.registerNative('host-iter?',([obj])=>obj!=null&&typeof obj[Symbol.iterator]==='function');
- K.registerNative('host-to-list',([obj])=>{try{return[...obj];}catch(e){return[];}});
+ K.registerNative('host-make-js-thrower',a=>{const val=_unwrapHandle(a[0]);return function(){throw val;};});
+ K.registerNative('host-typeof',a=>{let o=a[0];if(o==null)return'nil';if(o&&typeof o==='object'&&typeof o._type==='string'&&'__sx_handle' in o)return o._type;if(o instanceof El)return'element';if(o&&o.nodeType===3)return'text';if(o instanceof Ev)return'event';if(o instanceof Promise)return'promise';return typeof o;});
+ K.registerNative('host-iter?',([obj])=>{const o=_unwrapHandle(obj);return o!=null&&typeof o[Symbol.iterator]==='function';});
+ K.registerNative('host-to-list',([obj])=>{const o=_unwrapHandle(obj);try{return[...o];}catch(e){return[];}});
K.registerNative('host-await',a=>{});
K.registerNative('load-library!',()=>false);
K.registerNative('hs-is-set?',a=>a[0] instanceof Set);
@@ -706,10 +749,10 @@ Promise.resolve = function(v) {
K.registerNative('host-new-function', a => {
const paramList = a[0];
- const src = a[1];
+ const src = _unwrapHandle(a[1]);
const params = paramList && paramList._type === 'list' && paramList.items
-   ? Array.from(paramList.items)
-   : Array.isArray(paramList) ? paramList : [];
+   ? Array.from(paramList.items).map(_unwrapHandle)
+   : Array.isArray(paramList) ? paramList.map(_unwrapHandle) : [];
try { return new Function(...params, src); } catch(e) { return null; }
});