Z-Machine Foundations and Storage Sync
Pushing the Z-Machine interpreter forward with decoder work, text handling, and SAVE/RESTORE—plus content-addressable storage with S3 sync.
The Z-Machine decoder kept lying about operand boundaries. Games expect one truth; the spec suggests another; my implementation was somewhere in between. That’s where I spent most of the day: making the decode path match what real story files actually do, not what the documentation hopes they will.
The core decode path is short but loaded. It classifies the opcode, decodes operands, then checks for store and branch metadata. This is the choke point that turns raw memory into meaning:
pub fn decode_instruction(memory: &[u8], pc: usize) -> Result<Instruction, String> {
    if pc >= memory.len() {
        return Err("PC out of bounds".to_string());
    }

    let opcode_byte = memory[pc];
    let mut offset = pc + 1;

    // Classify the opcode form first; everything downstream depends on it.
    let (opcode_type, opcode_num) = decode_opcode_type(opcode_byte);

    // Operand encoding differs per form: 1OP/2OP carry type bits in the
    // opcode byte itself, VAR reads a separate type byte.
    let operands = match opcode_type {
        OpcodeType::OP0 => vec![],
        OpcodeType::OP1 => decode_operands_1op(memory, &mut offset, opcode_byte)?,
        OpcodeType::OP2 => decode_operands_2op(memory, &mut offset, opcode_byte)?,
        OpcodeType::VAR => decode_operands_var(memory, &mut offset)?,
        OpcodeType::EXT => return Err("Extended opcodes not supported".to_string()),
    };

    // Store instructions are followed by a one-byte variable number.
    // A truncated instruction is an error, not a silent None.
    let store_var = if opcode_stores_result(opcode_type, opcode_num) {
        if offset >= memory.len() {
            return Err("Instruction truncated before store byte".to_string());
        }
        let var = memory[offset];
        offset += 1;
        Some(var)
    } else {
        None
    };

    // Branch instructions end with one or two branch bytes.
    let branch = if opcode_has_branch(opcode_type, opcode_num) {
        decode_branch(memory, &mut offset)?
    } else {
        BranchInfo::default()
    };

    Ok(Instruction {
        opcode: opcode_num,
        opcode_type,
        operands,
        store_var,
        branch,
        length: offset - pc,
        ..Default::default()
    })
}
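The form classification at the top is mechanical once written down. Here is a sketch of what decode_opcode_type has to do per the spec's three forms plus the 0xBE extended marker; the OpcodeType enum matches the match arms above, and my actual helper may differ in detail:

fn decode_opcode_type(b: u8) -> (OpcodeType, u8) {
    // 0xBE introduces the extended form in v5+; the real opcode number is
    // the following byte, which decode_instruction rejects above anyway.
    if b == 0xbe {
        return (OpcodeType::EXT, 0);
    }
    match b >> 6 {
        0b11 => {
            // Variable form: bit 5 picks the operand count, opcode number
            // in the low 5 bits. Even the 2OP flavor here uses VAR-style
            // operand type bytes, one of the classic decode traps.
            let t = if b & 0x20 != 0 { OpcodeType::VAR } else { OpcodeType::OP2 };
            (t, b & 0x1f)
        }
        0b10 => {
            // Short form: bits 4-5 give the operand type; 0b11 means no
            // operand at all, so the instruction is 0OP.
            if b & 0x30 == 0x30 {
                (OpcodeType::OP0, b & 0x0f)
            } else {
                (OpcodeType::OP1, b & 0x0f)
            }
        }
        // Long form: always 2OP, opcode number in the low 5 bits.
        _ => (OpcodeType::OP2, b & 0x1f),
    }
}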
The Z-Machine spec reads like a small, elegant document; the implementation reads like archaeology. I didn’t hit one “big bug.” I hit dozens of edge cases where the spec says one thing and every Infocom game assumes another. Branch offset calculations, undefined local handling, operand type encoding in the push instruction—each fix tiny, each consequence cascading.
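Branch decoding is where several of those edge cases live. A minimal sketch of the spec's branch format, assuming the BranchInfo shape the decoder fills in (field names here are mine): bit 7 of the first byte is the condition sense, bit 6 selects short or long form, and long offsets are 14-bit signed values.

#[derive(Debug, Default)]
pub struct BranchInfo {
    pub branch_on_true: bool, // bit 7: take the branch when the condition holds
    pub offset: i16,          // 0 and 1 are the special "return false/true" encodings
}

fn decode_branch(memory: &[u8], pos: &mut usize) -> Result<BranchInfo, String> {
    let b1 = *memory.get(*pos).ok_or("Branch byte out of bounds")?;
    *pos += 1;
    let branch_on_true = b1 & 0x80 != 0;
    let offset = if b1 & 0x40 != 0 {
        // Short form: unsigned 6-bit offset in the low bits of the same byte.
        (b1 & 0x3f) as i16
    } else {
        // Long form: 14-bit signed offset across this byte and the next.
        let b2 = *memory.get(*pos).ok_or("Branch byte out of bounds")?;
        *pos += 1;
        let raw = (((b1 & 0x3f) as u16) << 8) | b2 as u16;
        if raw & 0x2000 != 0 {
            (raw | 0xc000) as i16 // sign-extend from 14 bits
        } else {
            raw as i16
        }
    };
    Ok(BranchInfo { branch_on_true, offset })
}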
Once the decoder path is stable, the rest can hang off it: Z-string decoding, call frames, and the early SAVE/RESTORE shape. Stack and call frames went in today (Phase 4 in my tracker), which means the interpreter can now push and pop execution contexts. Still not enough to run a real game, but enough to trace through the early turns of Zork without the interpreter losing its place.
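For concreteness, the execution context a routine call pushes looks roughly like this; the field names are illustrative rather than lifted from the Phase 4 code:

pub struct CallFrame {
    pub return_pc: usize,       // where execution resumes after the routine returns
    pub locals: Vec<u16>,       // up to 15 locals, initialized from the routine header
    pub eval_stack_base: usize, // evaluation stack depth at call time, for unwinding
    pub store_var: Option<u8>,  // where the callee's return value lands
}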
I also folded in a shared CPU state format so save/load stops being bespoke per core. The contract is blunt and consistent: registers, memory, program counter, and enough metadata to rehydrate the machine without guessing.
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct CpuState {
    pub architecture: String, // which core produced this state
    pub version: u32,         // format version, for forward migration
    pub registers: Vec<u8>,
    pub memory: Vec<u8>,
    pub pc: u32,
    pub sp: u32,
    pub flags: u32,
    pub halted: bool,
    pub cycles: u64,
    #[serde(default)]
    pub metadata: serde_json::Value, // core-specific extras, free-form
}
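The payoff is that persistence stops being per-core code. A minimal round trip under that contract, using JSON as an illustrative backend (any serde format works the same way):

fn save_state(state: &CpuState) -> Result<Vec<u8>, serde_json::Error> {
    serde_json::to_vec(state)
}

fn load_state(bytes: &[u8]) -> Result<CpuState, serde_json::Error> {
    serde_json::from_slice(bytes)
}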
The serial stack needed the same kind of truth. CTS is not a suggestion; it is backpressure. The connection manager now queues output when CTS drops and drains it when it returns, which keeps the modem boundary honest instead of “mostly fine.”
private handleCtsChange(ctsValue: boolean): void {
  const wasReady = this.ctsReady;
  this.ctsReady = ctsValue;
  // Rising edge with pending output: the far end can accept data again.
  if (ctsValue && !wasReady && this.outputQueue.length > 0) {
    this.log(`[ConnectionManager] CTS high, draining ${this.outputQueue.length} queued chunks`);
    this.drainOutputQueue();
  } else if (!ctsValue && wasReady) {
    // Falling edge: stop sending; writes pile up in the queue until CTS returns.
    this.log('[ConnectionManager] CTS low, queueing subsequent output');
  }
}
On the storage side, I moved from “local toy” to “real system.” The cloud storage layer is content-addressable with SHA-256, and larger blobs are stored in S3 under hash-based keys. That gives me deduplication as a first-order property rather than a bolt-on. The frontend adapter keeps a local cache, queues writes while offline, and syncs by comparing hashes when the socket is available. The model is simple: the content hash is the file’s identity; everything else is metadata.
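The identity rule itself fits in a few lines. A sketch assuming the sha2 crate, with the function name and key prefix as placeholders rather than the real layout:

use sha2::{Digest, Sha256};

/// Content-addressable identity: the S3 key is derived purely from the
/// bytes, so identical blobs collapse to one object for free.
fn blob_key(bytes: &[u8]) -> String {
    format!("blobs/{:x}", Sha256::digest(bytes))
}

Sync then reduces to a set comparison on keys: if the remote already has the hash, there is nothing to upload.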
REXX and Scheme progressed in parallel—lexer, parser, and WASM integration for REXX; R5RS compliance features for Scheme. The point is the same for both: the platform should make new languages feel repeatable, not heroic. Forth-C also progressed, partly to prove that “non-Rust core” is still a first-class path.
Tomorrow the web app moves to web/, which will force every path alias and import to be explicit. The parser migrations to Pest/Pratt are queued after that. The decoder work isn’t done—extended opcodes, object manipulation, and the full text system are still ahead—but today’s fixes mean I can step through a story file without the PC drifting into garbage.
Previous: 2026-01-25