2026/02 · 4 min read

Deep Dive: Lazy Loading 42 Backends with Dynamic Imports

How I shrank the emulator's main bundle from 1.1MB to 104KB by lazy-loading backends during dial tones—and why users never notice the delay.

Tags: emulator · vite · performance · code-splitting · typescript · lazy-loading

The emulator hit a megabyte.

I’d been adding backends steadily—games, language interpreters, CPU emulators, BBS services—and one day the bundle analyzer showed the main chunk at 1.1MB. Every visitor paid for every backend, even if they only wanted to dial Zork.

The fix wasn’t complicated, but the interesting part was making it invisible. I didn’t want a loading spinner; I wanted the modem experience itself to absorb the wait. Dial tones and handshakes already take time. That time became my loading bar.


The Problem: Static Imports, One Giant Chunk

The original registry was straightforward: import every backend at the top, map it by phone number, call it a day.

// The old way: static imports
import { BasicInterpreter } from '@/backend/basic-interpreter';
import { ForthInterpreter } from '@/backend/forth-interpreter';
import { DungeonCrawler } from '@/backend/dungeon';
import { ZMachineBackend } from '@languages/zil/backend-zmachine';
// ... dozens more imports ...

export const BACKEND_REGISTRY: Record<string, new () => BackendInterface> = {
  '5550300': BasicInterpreter,
  '5550400': ForthInterpreter,
  '5550100': DungeonCrawler,
  '5550365': ZMachineBackend,
  // ...
};

Vite did exactly what it was asked: bundle everything into the main chunk. The registry held references to every backend class, and every class dragged in its dependencies. A megabyte-sized hello.

The Fix: Loader Functions + Dynamic Imports

The registry now stores loader functions instead of classes. Each one does a dynamic import, and Vite automatically splits it into its own chunk.

// web/src/main/modules/backend-registry.ts
export type BackendLoader = () => Promise<new () => BackendInterface>;

export const BACKEND_REGISTRY: Record<string, BackendLoader> = {
  // === Games ===
  '5550100': () => import('@/backend/dungeon').then((m) => m.DungeonCrawler),
  '5550238': () => import('@/backend/adventure').then((m) => m.ColossalCaveAdventure),

  // === Language Interpreters ===
  '5550300': () => import('@/backend/basic-interpreter').then((m) => m.BasicInterpreter),
  '5550365': () => import('@languages/zil/backend-zmachine').then((m) => m.ZMachineBackend),

  // ... 42 more loaders ...
} as const;
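To make the consuming side concrete, here is a minimal, self-contained sketch: the interface, the echo backend, and the immediately-resolving loader are stand-ins for illustration, not code from the emulator.

```typescript
// Sketch: looking a backend up by number, awaiting its loader, and
// instantiating it. EchoBackend and this tiny registry are stand-ins;
// real loaders are `() => import('…').then((m) => m.SomeBackend)`.
interface BackendInterface {
  handleLine(line: string): string;
}

type BackendLoader = () => Promise<new () => BackendInterface>;

class EchoBackend implements BackendInterface {
  handleLine(line: string): string {
    return `ECHO: ${line}`;
  }
}

const BACKEND_REGISTRY: Record<string, BackendLoader> = {
  // Resolves immediately here; in the real registry this is where
  // Vite fetches the backend's own chunk.
  '5550999': () => Promise.resolve(EchoBackend),
};

async function dial(number: string): Promise<BackendInterface> {
  const loader = BACKEND_REGISTRY[number];
  if (!loader) throw new Error(`No backend registered for ${number}`);
  const BackendClass = await loader(); // chunk fetch happens here
  return new BackendClass();
}

dial('5550999').then((backend) => {
  console.log(backend.handleLine('hello')); // → "ECHO: hello"
});
```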

Vite’s manualChunks config assigns only the shared backend infrastructure to a named chunk and deliberately leaves individual backends unassigned, so the dynamic imports can create their own chunks:

// vite.config.ts
manualChunks: (id) => {
  // Backend shared infrastructure only - individual backends are lazy-loaded
  // via dynamic imports and get their own chunks automatically
  if (
    id.includes('backend-interface') ||
    id.includes('backend-disk-storage') ||
    id.includes('cpu-emulator-base')
  ) {
    return 'backend-base';
  }
  // Don't assign backends to a chunk - let dynamic imports create chunks
};

Now the main bundle is just the emulator shell and shared infrastructure. Each backend arrives only when it’s dialed.

The Timing Trick: Preload During Dial Tones

Here’s where the modem fantasy becomes useful. Dialing already takes time: dial tone, DTMF digits, ringback, handshake. At 300 baud with a Bell 103 handshake, that’s several seconds of expected delay. I use it as a loading screen.
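For a sense of the budget, a back-of-envelope sketch. The timing constants below are assumed and illustrative, not measured from the emulator:

```typescript
// Rough, assumed timings for a Bell 103-style dial sequence (ms).
// None of these constants come from the emulator; they only show
// the shape of the budget the preload gets to spend.
const DIAL_TONE_MS = 1000;
const DTMF_MS_PER_DIGIT = 100; // tone plus inter-digit gap
const RINGBACK_MS = 2000;
const HANDSHAKE_MS = 3000; // answer/originate carrier exchange

function dialBudgetMs(digits: number): number {
  return DIAL_TONE_MS + digits * DTMF_MS_PER_DIGIT + RINGBACK_MS + HANDSHAKE_MS;
}

console.log(dialBudgetMs(7)); // → 6700 ms of runway for a 7-digit number
```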

When the modem starts dialing, the connection manager kicks off a preload:

// web/src/serial/connection-manager.ts
modemImpl.onDial((number) => {
  this.dialedNumber = normalizePhoneNumber(number);

  // Start preloading the backend while dial tones play
  if (this.backendFactory.preload) {
    const preloadPromise = this.backendFactory.preload(this.dialedNumber);
    if (preloadPromise) {
      this.log(`[ConnectionManager] Preloading backend for ${this.dialedNumber}`);
    }
  }
});

The RegistryBackendFactory tracks in-flight loads so multiple dials don’t cause duplicate fetches:

// web/src/serial/backend-process-adapter.ts
preload(phoneNumber: string): Promise<void> | null {
  const normalized = phoneNumber.replace(/[-\s()]/g, '');

  // Already loaded synchronously
  if (this.backends.has(normalized)) {
    return Promise.resolve();
  }

  // Already loading
  if (this.pendingLoads.has(normalized)) {
    return this.pendingLoads.get(normalized)!.then(() => {});
  }

  // Has a loader - start loading
  const loader = this.loaders.get(normalized);
  if (loader) {
    const loadPromise = loader();
    this.pendingLoads.set(normalized, loadPromise);

    loadPromise
      .then((BackendClass) => {
        this.backends.set(normalized, BackendClass);
        this.pendingLoads.delete(normalized);
      })
      .catch((err) => {
        this.pendingLoads.delete(normalized);
        console.error(`[BackendFactory] Failed to preload ${normalized}:`, err);
      });

    return loadPromise.then(() => {});
  }

  return null;
}

By the time the handshake finishes, the backend is usually already resident. The network fetch happened inside a delay the user expected.

Sync vs Async Creation

Some code paths still need a synchronous backend—tests, tooling, anything that can’t await. The factory supports both:

  • create(phoneNumber) — synchronous, fails if backend isn’t already loaded
  • createAsync(phoneNumber) — waits for loader or pending preload

The split keeps the call flow simple: if you’re dialing, you’re async; if you’re testing, you’re explicit about what you’ve preloaded.
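The split can be sketched like this. The internals are hypothetical (the real factory in backend-process-adapter.ts tracks pending loads too); this only shows the sync/async contract:

```typescript
// Sketch of the sync/async split. BackendInterface and the maps are
// simplified stand-ins for the real factory's state.
interface BackendInterface {}
type Ctor = new () => BackendInterface;

class RegistryBackendFactory {
  private backends = new Map<string, Ctor>();
  private loaders = new Map<string, () => Promise<Ctor>>();

  register(phoneNumber: string, loader: () => Promise<Ctor>): void {
    this.loaders.set(phoneNumber, loader);
  }

  // Synchronous: only works if the class is already resident.
  create(phoneNumber: string): BackendInterface {
    const BackendClass = this.backends.get(phoneNumber);
    if (!BackendClass) {
      throw new Error(
        `Backend ${phoneNumber} not loaded - preload() or createAsync() first`,
      );
    }
    return new BackendClass();
  }

  // Asynchronous: loads on demand, then instantiates.
  async createAsync(phoneNumber: string): Promise<BackendInterface> {
    if (!this.backends.has(phoneNumber)) {
      const loader = this.loaders.get(phoneNumber);
      if (!loader) throw new Error(`No loader for ${phoneNumber}`);
      this.backends.set(phoneNumber, await loader());
    }
    return this.create(phoneNumber);
  }
}

// Usage: dialing code goes async-first; tests preload explicitly.
class HelloBackend implements BackendInterface {}
const factory = new RegistryBackendFactory();
factory.register('5550900', () => Promise.resolve(HelloBackend));
factory.createAsync('5550900').then((b) => {
  console.log(b instanceof HelloBackend); // → true
});
```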

WASM Backends: A Two-Stage Waterfall

WASM modules add another hop. The backend chunk loads first, then the WASM binary initializes inside onConnect().

// web/src/backend/z80-hello/index.ts
let Z80Emulator: any;
let wasmInitialized = false;

async function initWasm(): Promise<void> {
  if (wasmInitialized) return;
  const emuModule = await import('@cores/z80/pkg/z80_wasm.js');
  await emuModule.default();
  Z80Emulator = emuModule.Z80Emulator;
  wasmInitialized = true;
}

Preload hides the first hop (the JS chunk), but the WASM binary still initializes after connection. For CPU emulators, this adds a few hundred milliseconds. The Bell 103 handshake is around 3 seconds, so there’s usually room—but faster modems like V.32bis leave less slack.
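One way to make the second hop explicit is to memoize the init promise so overlapping connects share a single WASM instantiation. This is a self-contained sketch, not the article's code: the module loader is injected and faked so it runs without the built package.

```typescript
// Sketch of the two-stage load: hop one is the backend's JS chunk,
// hop two is WASM instantiation at connect time. The loader is
// injected; the real backend would pass
// () => import('@cores/z80/pkg/z80_wasm.js').
type WasmExports = {
  default: () => Promise<void>; // wasm-bindgen init function
  Z80Emulator: new () => object;
};

class Z80Backend {
  private EmulatorCtor: (new () => object) | null = null;
  private wasmReady: Promise<void> | null = null;

  constructor(private loadWasm: () => Promise<WasmExports>) {}

  // Memoize the promise so concurrent connects trigger one init.
  private initWasm(): Promise<void> {
    if (!this.wasmReady) {
      this.wasmReady = this.loadWasm().then(async (m) => {
        await m.default(); // instantiate the WASM binary
        this.EmulatorCtor = m.Z80Emulator;
      });
    }
    return this.wasmReady;
  }

  async onConnect(): Promise<object> {
    await this.initWasm(); // second hop, paid after the handshake
    const Ctor = this.EmulatorCtor;
    if (!Ctor) throw new Error('WASM init did not set emulator class');
    return new Ctor();
  }
}

// Demo with a faked module so the sketch runs standalone.
const demo = new Z80Backend(async () => ({
  default: async () => {},
  Z80Emulator: class {},
}));
demo.onConnect().then(() => console.log('z80 ready'));
```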

Display Names Without Loading Code

The welcome screen needs friendly names (“Zork I”, “BASIC”, “Star Trek”), but lazy-loading means class names aren’t available up front. The fix is a separate mapping that lives alongside the registry:

// web/src/main/modules/backend-registry.ts
const PHONE_DISPLAY_NAMES: Record<string, string> = {
  '5550100': 'Dungeon Crawler',
  '5550238': 'Colossal Cave',
  '5550300': 'BASIC',
  '5550365': 'Zork I',
  // ... all 47 registered numbers ...
};

Display names are cheap; code is not. Separating them makes the welcome screen instant.
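A lookup helper along these lines keeps the welcome screen synchronous. The function name and fallback behavior are assumptions, and the table here is an abbreviated copy of the one above:

```typescript
// Hypothetical helper: resolve a friendly name without touching any
// backend code. Falls back to the normalized number for unlisted entries.
const PHONE_DISPLAY_NAMES: Record<string, string> = {
  '5550100': 'Dungeon Crawler',
  '5550300': 'BASIC',
};

function getDisplayName(phoneNumber: string): string {
  // Same normalization the factory uses: strip dashes, spaces, parens.
  const normalized = phoneNumber.replace(/[-\s()]/g, '');
  return PHONE_DISPLAY_NAMES[normalized] ?? normalized;
}

console.log(getDisplayName('555-0100')); // → "Dungeon Crawler"
console.log(getDisplayName('5550777')); // unlisted → "5550777"
```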

The Tradeoffs

Fast connections can still feel the load. If a backend is large and the dial sequence is short (tone dialing at high baud), createAsync() may still block visibly. The preload helps, but it’s not a guarantee.

More moving parts. The factory has sync/async paths and a pending-load map. The connection manager has to coordinate preload timing with dial events. It’s manageable, but there’s more plumbing to reason about.

WASM is still a waterfall. The backend chunk loads first, then the WASM binary initializes. Preload hides the first hop, not the second. For Z80 or 8088 backends, that second hop is noticeable on slow connections.


Hiding work inside expected delays isn’t a new idea—games have been doing it with loading screens for decades. What made this satisfying was that the delay was already there, doing nothing. The dial tone wasn’t just period-accurate nostalgia; it was three seconds of runway I’d been ignoring.

The bundle went from 1.1MB to 104KB for first paint. But the real win was simpler: the emulator stopped feeling like it was waiting for itself.


See also: Deep Dive: Bell 103 Audio Modem — the dial tones that hide your loading times

See also: Deep Dive: Bell 212A and V.32bis Handshakes — even longer handshakes for even more loading time