Synapse executes untrusted code in a deterministic, sandboxed WebAssembly arena. No filesystem. No network from guest code. Memory wiped between every request.
Each execution runs in a 64KB Wasm linear memory arena.
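The per-request lifecycle above can be sketched in plain Python. This is a hypothetical model, not the runtime's actual host code: the `Arena` class and `handle_request` function are illustrative names, and the only facts taken from the document are the 64 KB arena size (one Wasm linear-memory page) and the wipe between requests.

```python
PAGE_SIZE = 64 * 1024  # one WebAssembly linear-memory page = 64 KB


class Arena:
    """Hypothetical model of the guest's linear-memory arena."""

    def __init__(self):
        self.memory = bytearray(PAGE_SIZE)

    def wipe(self):
        # Zero the entire arena so no state can leak across requests.
        self.memory[:] = bytes(PAGE_SIZE)


def handle_request(arena, guest_fn, payload):
    # The guest sees a freshly zeroed arena, and the arena is wiped
    # again after the call regardless of success or failure.
    arena.wipe()
    try:
        return guest_fn(arena.memory, payload)
    finally:
        arena.wipe()
```

The `finally` clause is the point of the sketch: wiping happens even when guest code raises, so a failed request cannot leave residue for the next one.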
The gateway is a single ~15 MB Rust binary with zero external runtime dependencies.
Synapse is not a general-purpose operating system. The production runtime runs on Linux via wasmtime. The UEFI bare-metal kernel in boot/ is a research component — it demonstrates the architecture can eliminate the OS entirely, but it is not the production path today.
The Python support is a constrained subset (arithmetic, conditionals, loops, functions, lists). It is not a general CPython replacement.
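For concreteness, here is the kind of program that stays inside the listed subset: arithmetic, conditionals, loops, functions, and lists. This is an illustrative guess at what the subset accepts; it deliberately avoids anything beyond those features (no imports, no classes, no string formatting, and even list methods are avoided in favor of concatenation, since method support is not stated).

```python
# Uses only: arithmetic, conditionals, loops, functions, lists.
def moving_sum(values, window):
    out = []
    i = 0
    while i + window <= len(values):
        total = 0
        j = i
        while j < i + window:
            total = total + values[j]
            j = j + 1
        out = out + [total]  # list concatenation instead of .append
        i = i + 1
    return out
```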
Verified execution maps cleanly to compliance and safety problems in regulated environments. The current product is a self-hosted runtime; broader, domain-specific assurance guarantees remain part of the R&D stack.
AI loan approvals, risk calculations, and compliance reporting must be explainable, reproducible, and auditable. Regulators don’t accept “the model said so.”
Current proof path: deterministic execution, receipts, and a bounded verification boundary for generated logic. Broader privacy and compliance guarantees are active research, not the default product claim.
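A minimal sketch of what a deterministic-execution receipt could look like, assuming (as the document states) that re-running the same code on the same input always reproduces the same output. The field names, `make_receipt`, and `verify_receipt` are hypothetical, not Synapse's actual receipt format; the sketch only shows why determinism makes receipts checkable by re-execution.

```python
import hashlib


def make_receipt(code, input_bytes, output_bytes):
    """Hypothetical receipt binding code, input, and output by hash.

    Under deterministic execution, re-running `code` on `input_bytes`
    must reproduce `output_hash` exactly, so any auditor can check it.
    """
    return {
        "code_hash": hashlib.sha256(code).hexdigest(),
        "input_hash": hashlib.sha256(input_bytes).hexdigest(),
        "output_hash": hashlib.sha256(output_bytes).hexdigest(),
    }


def verify_receipt(receipt, code, input_bytes, rerun):
    # Re-execute and compare: a receipt is valid iff a fresh run of the
    # same code on the same input hashes to the recorded values.
    output = rerun(code, input_bytes)
    return make_receipt(code, input_bytes, output) == receipt
```

The auditor never has to trust the original operator's claim about the output: the receipt is either reproducible or it is not.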
Diagnostic AI must never return values outside calibrated ranges. Proving “never returns dangerous values” requires more than statistical testing.
Relevant R&D: output bounds, self-gating inference, and verified fine-tuning are demonstrated research threads. The current stable gateway boundary remains limited to @inv pure, @inv terminates, and @inv no_oob.
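The output-bounds idea above can be illustrated with a runtime gate: refuse to return any inference result outside a calibrated range, rather than merely testing statistically that bad values are rare. This is a sketch of the concept only; the decorator, its name `gated`, and the stand-in `risk_score` model are assumptions, and a production version would be enforced by the verified runtime, not by application-level Python.

```python
def gated(lower, upper):
    """Hypothetical runtime gate: any result outside the calibrated
    [lower, upper] range is rejected instead of being returned."""
    def wrap(infer):
        def checked(*args):
            result = infer(*args)
            if not (lower <= result <= upper):
                raise ValueError(
                    f"output {result} outside calibrated bounds "
                    f"[{lower}, {upper}]"
                )
            return result
        return checked
    return wrap


@gated(0.0, 1.0)
def risk_score(features):
    # Stand-in for a model; real inference would run in the sandbox.
    return sum(features) / (len(features) or 1)
```

A static proof of "never returns dangerous values" is stronger than this runtime check, but the gate shows the contract being enforced: out-of-range outputs become errors, never answers.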
AI infrastructure depends entirely on foreign vendors (NVIDIA, Microsoft, Google). Data sovereignty laws require domestic control of sensitive computation.
Current product: self-hosted execution on commodity hardware with no guest filesystem, no guest network, and deterministic receipts. Deeper sovereignty work, such as bare-metal and networked compute, remains R&D.