The Great Pyramid of JavaScript

Laxman K R
2026.03.17

The ancient Egyptians built the Great Pyramid by stacking 2.3 million blocks, each one trusting blindly, permanently, the layer beneath it. JavaScript executes your code the same way. You type one line. Multiple layers later, a CPU twitches. What happens in between is what we're here for.

Sidenote:

Most developers, when pressed, will admit that they do not know how their code runs. They know that it does run, which they treat as sufficient. They npm run dev, a port opens, and they return to the more pressing matter of choosing a font. This post is not meant to shame them, but to illuminate the multi-layer pyramid that silently executes their font-laden masterpiece every single time.

Here is the code we'll bury alive and excavate, layer by layer:

async function getUser() {
  const res = await fetch("https://api.example.com/user/1");
  const data = await res.json();
  console.log(data.name);
}

getUser();

Innocent enough. Now let's watch what the world does to it.

Layer 0 / The Editor and the Linter

Before this code goes anywhere, it is read by a linter. ESLint, Biome, Oxlint (from the OXC project): take your pick. The linter is the building inspector who arrives before construction begins and walks the site with a clipboard. It does not pour concrete. It does not lay pipe. It looks at your plans and says: "This window is too close to the gas main. This door opens into a wall. This console.log will survive into production and embarrass you in front of the client."

The linter operates on the same AST the engine will later produce; it just gets there first, reads it, and complains loudly before anything irreversible happens. Its output is not code. It is annotations, warnings, and the quiet satisfaction of a caught mistake.
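A minimal sketch of that inspection, expressed as an ESLint flat config. The rule names are real ESLint core rules; the severity choices here are only illustrative.

```javascript
// eslint.config.js — minimal flat-config sketch.
export default [
  {
    rules: {
      "no-console": "warn",      // catches the console.log that would ship
      "no-unused-vars": "error", // catches the dead declaration
    },
  },
];
```

Nothing executes here: the config is itself just data the linter reads while it walks your AST.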

Closely following the inspector is the formatter: Prettier, Biome, or OXC's Oxcfmt if you're living in the future. The formatter does not care what your code means. It cares that your indentation is consistent and that you haven't committed the sin of a 300-character line. Think of it as the plasterer who smooths every surface before the client walks in. The structure was already sound; the plasterer simply ensures it is presentable.

Neither of these tools executes your code. They exist entirely in the pre-execution layer, the planning phase, before a single electron moves in anger.

Layer 1 / The Bundler

You've written getUser in one file. Somewhere else, you've written the UI that calls it. Somewhere else still: a utility function, a type definition, a third-party library pulled from npm. In development, these files are a city of separate buildings. A browser cannot walk a city; it needs a single address.

Enter the bundler: Vite, Rolldown, Turbopack, esbuild, webpack (for those of you who enjoy suffering). The bundler's job is to take your entire city of files and flatten it into a handful of deliverable artifacts. It traces the import graph (getUser's module imports this, which imports that) and follows every thread to its origin, like a structural engineer mapping load paths through an entire building before signing off on the permit.

src/
  index.ts      ← imports getUser
  api/
    user.ts     ← defines getUser, imports fetch-wrapper
  utils/
    http.ts     ← defines fetch-wrapper
node_modules/
  some-lib/     ← also imported, transitively

         ↓  bundler traces every import

dist/
  bundle.js     ← one file, everything resolved
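The layout above can be sketched in a few lines: a depth-first walk over a module map, emitting each module exactly once, dependencies first. The map here is hand-written for illustration; a real bundler discovers the edges by parsing import statements out of each file.

```javascript
// Toy import-graph walk. The module map is hand-written; a real bundler
// builds it by parsing `import` statements in every file it reaches.
const modules = {
  "src/index.ts": ["src/api/user.ts"],
  "src/api/user.ts": ["src/utils/http.ts", "node_modules/some-lib"],
  "src/utils/http.ts": [],
  "node_modules/some-lib": [],
};

function bundleOrder(entry, graph, seen = new Set(), out = []) {
  if (seen.has(entry)) return out;           // include each module exactly once
  seen.add(entry);
  for (const dep of graph[entry] ?? []) bundleOrder(dep, graph, seen, out);
  out.push(entry);                           // dependencies land first, like a bundle
  return out;
}

bundleOrder("src/index.ts", modules);
// → ["src/utils/http.ts", "node_modules/some-lib", "src/api/user.ts", "src/index.ts"]
```

The `seen` set is why a module imported from ten places appears in the bundle once, not ten times.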

Modern bundlers do additional work in this phase: tree-shaking (removing code that is imported but never called), code splitting (producing multiple bundles that load on demand), and transpilation (converting your pristine TypeScript or modern JavaScript into something a wider range of environments can stomach).

Vite, to its credit, is clever about this. In development, it skips bundling almost entirely and serves individual files over native ES modules, leaning on the browser's own module resolution. It only bundles for production. Rolldown, Vite's Rust-powered bundler built on OXC (older versions use Rollup), will do this faster than you can consciously register. Turbopack, the Rust-powered successor to webpack embedded in Next.js, takes a similar approach. The tools are converging on the same idea: do the minimum possible work, as fast as possible, and get out of the way.

The output of the bundler is, finally, a JavaScript file. One file, or a small set of them, ready to be handed to a runtime.

Layer 2 / The Runtime: Node, Bun, or the Browser

Your bundle.js now needs a home. If this is a server, that home is Node.js or Bun. If it's a browser, that home is Chrome, Firefox, or Safari. The distinction matters more than most people appreciate.

Node.js is, at its core, a thin shell around V8, Google's JavaScript engine, combined with libuv, a C library that provides the event loop, the thread pool, and the operating system bindings that JavaScript itself cannot provide. When you call fetch in Node 18+, it is not V8 that makes the network request. V8 does not know what a network is. It is Node's built-in undici HTTP client, written largely in JavaScript, wrapped in a Web-compatible fetch API and registered as a global so that your code can call it without knowing any of this.
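That "registered as a global" step is ordinary JavaScript. A toy sketch of what a runtime does at startup; myFetch is a hypothetical stand-in, where the real implementation is backed by undici and actual sockets.

```javascript
// Toy sketch of runtime global registration. `myFetch` is a stand-in for
// the real, socket-backed implementation a runtime would install.
function myFetch(url) {
  // Pretend network: resolve with a minimal Response-like object.
  return Promise.resolve({
    ok: true,
    json: () => Promise.resolve({ name: "Ada", requested: url }),
  });
}

// What the runtime effectively does before your code ever runs:
globalThis.myFetch = myFetch;

// Your code can now call it with no import, exactly like fetch:
//   const res = await myFetch("https://api.example.com/user/1");
```

This is the whole trick: the global object is mutable, and the runtime mutates it before handing control to you.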

Bun is the new challenger. Where Node wraps V8, Bun wraps JavaScriptCore, Apple's JavaScript engine, the one that runs Safari and every JavaScript runtime on iOS. Bun is written in Zig, runs JSC, and reimplements Node's APIs from scratch with an emphasis on speed. The same fetch call, the same async/await syntax, but underneath: a different engine, a different event loop implementation, different tradeoffs.

The runtime is the building site itself: the land, the utilities, the access roads. The engine is the crew. You cannot have one without the other.

Layer 3 / The Engine: V8, JavaScriptCore

Now we arrive at the engine proper. V8 powers Node, Chrome, and Deno. JSC powers Bun and Safari. They differ in implementation but are identical in contract: they take JavaScript source, and they execute it.

The Surveyor: Tokenizer

The engine reads your source file character by character and chops it into tokens: the smallest units of meaning the language recognizes.

KEYWORD(async)  KEYWORD(function)  IDENT(getUser)  PUNCT(()  PUNCT())
PUNCT({)  KEYWORD(const)  IDENT(res)  PUNCT(=)  KEYWORD(await)
IDENT(fetch)  PUNCT(()  STRING("https://api.example.com/user/1")  PUNCT())
...

No walls. No floors. Just labeled ground. The tokenizer doesn't know that fetch is a function or that await implies a suspension point; it only knows that these sequences of characters mean something to the grammar above it. Its job ends there, and it hands off.
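A toy tokenizer makes the handoff concrete. This is nothing like V8's real scanner (no numbers, no operators beyond a handful of punctuation, no error recovery), but it produces the same kind of labeled stream:

```javascript
// Toy tokenizer: keywords, identifiers, double-quoted strings, and a few
// punctuators. Real scanners handle far more and do not use a regex.
const KEYWORDS = new Set(["async", "function", "const", "await"]);

function tokenize(src) {
  const tokens = [];
  const re = /\s*(?:([A-Za-z_$][\w$]*)|("(?:[^"\\]|\\.)*")|([(){}=;.]))/gy;
  let m;
  while ((m = re.exec(src)) !== null) {
    if (m[1]) tokens.push({ type: KEYWORDS.has(m[1]) ? "KEYWORD" : "IDENT", value: m[1] });
    else if (m[2]) tokens.push({ type: "STRING", value: m[2] });
    else tokens.push({ type: "PUNCT", value: m[3] });
  }
  return tokens;
}

tokenize("async function getUser()");
// → KEYWORD(async) KEYWORD(function) IDENT(getUser) PUNCT(() PUNCT())
```

Note the tokenizer classifies await as a KEYWORD purely by table lookup; it has no idea what awaiting means. That understanding belongs to the next layer up.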

The Architect's Drawing: Parser → AST

A list of tokens is not a program. The parser folds them into an Abstract Syntax Tree: a nested structure that captures not just what the tokens are, but how they relate.

{
  "type": "FunctionDeclaration",
  "async": true,
  "id": { "name": "getUser" },
  "body": [
    {
      "type": "VariableDeclaration",
      "kind": "const",
      "declarations": [
        {
          "id": { "name": "res" },
          "init": {
            "type": "AwaitExpression",
            "argument": {
              "type": "CallExpression",
              "callee": { "name": "fetch" },
              "arguments": [{ "value": "https://api.example.com/user/1" }]
            }
          }
        }
      ]
    }
  ]
}

This is the structural drawing. It tells you that fetch(...) lives inside an AwaitExpression, which lives inside a VariableDeclaration, which lives inside an async FunctionDeclaration. An architect doesn't merely list materials; she specifies that the window goes in the load-bearing wall, not through it. Relationship is everything.
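Everything downstream of the parser (linters, bundlers, the bytecode generator) works by walking this tree. A minimal visitor over the abbreviated AST above, collecting node types depth-first:

```javascript
// The abbreviated AST from above, built by hand.
const ast = {
  type: "FunctionDeclaration",
  async: true,
  id: { name: "getUser" },
  body: [{
    type: "VariableDeclaration",
    kind: "const",
    declarations: [{
      id: { name: "res" },
      init: {
        type: "AwaitExpression",
        argument: {
          type: "CallExpression",
          callee: { name: "fetch" },
          arguments: [{ value: "https://api.example.com/user/1" }],
        },
      },
    }],
  }],
};

// Generic depth-first visitor: collect every node's `type` tag.
function collectTypes(node, out = []) {
  if (node === null || typeof node !== "object") return out;
  if (typeof node.type === "string") out.push(node.type);
  for (const value of Object.values(node)) {
    if (Array.isArray(value)) value.forEach(v => collectTypes(v, out));
    else collectTypes(value, out);
  }
  return out;
}

collectTypes(ast);
// → ["FunctionDeclaration", "VariableDeclaration", "AwaitExpression", "CallExpression"]
```

A linter's no-console rule, a bundler's import tracer, and the engine's bytecode generator are all, at heart, visitors like this one with opinions attached.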

The Work Order: Bytecode

The AST cannot be handed directly to a CPU. It must be translated into bytecode: a compact, portable instruction set for V8's Ignition interpreter (or JSC's LLInt, its equivalent).

CreateClosure [getUser]
CallUndefinedReceiver [getUser]
CallRuntime [fetch, "https://api.example.com/user/1"]
SuspendGenerator        // first `await`: pause here
ResumeGenerator         // fetch resolved: continue
StoreAccumulatorInRegister [res]
...

The foreman on a job site doesn't explain load distribution to the bricklayer. He says: "put the brick here." Bytecode is that instruction. Still somewhat legible. Not yet the thing that moves electrons.
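You can inspect this layer yourself. V8 exposes flags for dumping Ignition bytecode, and Node forwards them straight through; app.js here is a stand-in for your own entry file, and the exact output format varies by V8 version.

```shell
# Dump Ignition bytecode, filtered to functions named getUser.
# (V8 flags, forwarded by Node; output format varies across versions.)
node --print-bytecode --print-bytecode-filter=getUser app.js
```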

The Concrete: JIT Compilation

If getUser were called once, Ignition would interpret the bytecode line by line: adequate, if not fast. But both V8 and JSC watch. They instrument your code, identify the hot paths (functions called repeatedly), and promote them to an optimizing compiler.

V8 uses TurboFan (and increasingly Maglev as a mid-tier). JSC uses DFG and FTL. All of them output raw machine code: binary instructions the CPU executes directly, with no interpretation overhead.

mov rdi, [fetch_ptr]
mov rsi, url_string_addr
call [rdi]
test rax, rax
jz   handle_rejection
...

This is the poured concrete. The CPU has no concept of async, no concept of fetch, no concept of JavaScript. It knows registers, memory addresses, and conditional jumps. Everything above this layer is, from the CPU's perspective, pure fiction. Useful fiction, elaborately maintained, but fiction nonetheless.

Layer 4 / The Site Manager: Event Loop

Here is where the story gets genuinely strange. JavaScript is single-threaded: one worker, one call stack. And yet it handles a network request without freezing. This is not magic. It is the event loop, and it is worth understanding precisely.

When execution hits await fetch(...):

// Moment of the await:
Call Stack: [ getUser, (anonymous) ]

// getUser() hits `await`. It is SUSPENDED and removed from the stack.
// The Promise is handed to Node's networking layer, a subcontractor
// operating entirely outside the JS call stack, via libuv's non-blocking
// socket I/O (the thread pool is reserved for file system, DNS, and crypto).

Call Stack: [ (anonymous) ]  ← nearly empty. The worker is FREE.

// ...time passes. The network responds. The Promise resolves.
// The resolved value enters the Microtask Queue.
// The Event Loop checks: call stack empty? Yes.
// getUser() is reinstated, res bound to the Response object.

Call Stack: [ getUser, (anonymous) ]  ← resumed, mid-function.

The Event Loop is the site manager. The main worker the call stack never idles waiting on a subcontractor. The network request is offloaded entirely. When the subcontractor delivers, the manager queues the result and hands it to the worker the moment they're free.

await is not sleep. It is a suspension, a handoff, a politely-worded resignation letter that says: "I'll be back when the data arrives." The site does not stop. Other work proceeds. This is why Node can serve thousands of concurrent requests on a single thread not because it parallelizes, but because it never waits.
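The suspension is observable from plain JavaScript. An async function runs synchronously up to its first await; everything after the await is queued as a microtask and cannot run until the current stack has emptied. A minimal sketch:

```javascript
const order = [];

async function task() {
  order.push("before await");
  await null;                 // suspension point: the rest of the function
  order.push("after await");  // is queued as a microtask
}

task();
order.push("sync code after the call");

// At this exact moment the continuation has NOT run yet:
console.log(order); // → ["before await", "sync code after the call"]

// Once the stack unwinds, the microtask queue drains:
queueMicrotask(() => console.log(order)); // now includes "after await"
```

Awaiting null looks pointless, but it exercises exactly the machinery fetch does: the function yields the stack and is reinstated by the event loop, data or no data.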

The Full Pyramid

Each layer trusts the one below it entirely, knows nothing of the ones above, and does exactly one job. The linter doesn't bundle. The bundler doesn't optimize. The JIT doesn't manage async. The CPU doesn't parse.

You typed getUser(). Eleven layers executed it. Most of them without your knowledge, your consent, or your gratitude.

Perhaps now they'll get some.