Rust, Game Engines, Garbage Collector, Optimization

Rust Vs. Doom (1993)

Gustavo Hammerschmidt · 18:52 28/Mar/2026 · 15 min



A Brief Background of the Origins of Rust's Soul

At first glance, Haskell and Rust appear to occupy similar intellectual territory: strong type systems, fearless refactoring, compile-time guarantees, and clear functional influence. It's easy to frame them as two heavyweights competing for the same space. But that comparison weakens under scrutiny. Haskell was designed to explore purity, abstraction, and mathematical elegance. Rust was engineered to replace C in production systems. One optimizes for expressive precision and theoretical clarity; the other prioritizes predictable performance and direct control over memory. They don't fight on the same battlefield. Rust powers infrastructure, kernels, and engines. Haskell refines thinking and models complex domains with remarkable precision. The overlap is philosophical, not practical.

Where Haskell Looks Like a Competitor


Both languages share:

  • Strong static type systems
  • Immutability by default
  • Functional programming influence
  • Compile-time correctness emphasis
  • Expression-oriented design
  • A culture of fearless refactoring

Both aim to eliminate entire classes of runtime errors. Both make illegal states difficult (or sometimes impossible) to encode. Both encourage modeling the world through types instead of documentation.


Rust draws significant inspiration from Haskell:

  • Algebraic data types (enum in Rust)
  • Pattern matching
  • Option / Result instead of null
  • Type-driven design
  • Trait-based polymorphism influenced by typeclasses
  • A preference for purity in API boundaries

The influence is immediately visible. Rust feels like a systems language educated in functional programming principles. But the key distinction lies in enforcement: Haskell enshrines purity at the language level, while Rust enforces memory safety at compile time.

Haskell optimizes for mathematical rigor. Rust optimizes for operational constraints. They share ideas. They diverge in mission.
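The shared heritage is easy to see in code. A minimal sketch of the borrowed ideas — algebraic data types, exhaustive pattern matching, and `Option` instead of null (the `Shape` type and its functions are illustrative, not from any library):

```rust
// An algebraic data type: the Rust equivalent of a Haskell sum type.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

// Pattern matching must cover every variant, as in Haskell.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

// Option replaces null: absence is encoded in the type, like Haskell's Maybe.
fn largest_area(shapes: &[Shape]) -> Option<f64> {
    shapes.iter().map(area).fold(None, |acc, a| match acc {
        Some(m) if m >= a => Some(m),
        _ => Some(a),
    })
}

fn main() {
    let shapes = [Shape::Rect { w: 3.0, h: 4.0 }, Shape::Circle { radius: 1.0 }];
    // An empty slice yields None instead of a null or a crash.
    println!("{:?}", largest_area(&shapes)); // Some(12.0)
}
```

The compiler rejects a `match` that misses a variant, which is the "illegal states are hard to encode" discipline both languages share.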


Why Has Rust Overthrown Its Master?

Aspect             Haskell             Rust
Memory             Garbage collected   Ownership model, no GC
Predictability     GC pauses possible  Deterministic
Low-level control  Limited             Extensive

This effectively excludes Haskell from:

  • OS development
  • Embedded systems
  • Real-time engines

The contrast becomes clearer when examining industry adoption. Rust has gained traction across systems programming, cloud infrastructure, blockchain, and performance-sensitive services: domains where timing guarantees and memory control are critical. Haskell remains influential but specialized, finding strength in academia, finance, and compiler design.

This divergence reflects a difference in learning curve versus practical payoff. Haskell introduces laziness, monads, and advanced type abstractions early: concepts that elevate reasoning but don't always translate directly to shipping low-level systems. Rust is also demanding, but its difficulty mirrors tangible constraints: ownership, lifetimes, concurrency, memory layout. The struggle aligns with real engineering trade-offs. That's why Haskell isn't truly a Rust competitor. It profoundly shaped Rust's thinking, but Rust grounded those ideas in systems pragmatism.


Why Can't Game Engines Have Garbage Collectors?

They can, and many do. The real question is whether they can tolerate the unpredictability that often comes with them. Game engines are real-time systems. Every frame must complete within a strict time window. Miss that window, even briefly, and the player doesn't perceive a minor delay: they experience a hitch, a stutter, a break in immersion.


Frame Budget

  • 60 FPS -> 16.67 ms per frame
  • 120 FPS -> 8.33 ms
  • 240 FPS -> 4.17 ms

At 60 frames per second, the engine has just 16.67 milliseconds to process input, update AI, simulate physics, execute gameplay logic, and submit rendering commands. At 120 FPS, that budget shrinks to 8.33 milliseconds. At 240 FPS, it drops to 4.17 milliseconds. These aren't flexible targets; they are hard deadlines. A 2--5 millisecond garbage collection pause may seem insignificant in isolation, but at 120 FPS it can consume more than half of the entire frame budget. The result is not theoretical, it's visible stutter.
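The budgets follow directly from the frame rate, and the arithmetic makes the GC problem concrete. A quick sketch (the 5 ms pause is a hypothetical figure for illustration):

```rust
// Frame budget in milliseconds for a target frame rate.
fn frame_budget_ms(fps: f64) -> f64 {
    1000.0 / fps
}

// Fraction of one frame consumed by a pause of `pause_ms`.
fn budget_fraction(fps: f64, pause_ms: f64) -> f64 {
    pause_ms / frame_budget_ms(fps)
}

fn main() {
    for fps in [60.0, 120.0, 240.0] {
        println!("{:>3} FPS -> {:.2} ms per frame", fps, frame_budget_ms(fps));
    }
    // A hypothetical 5 ms GC pause at 120 FPS eats 60% of the frame.
    println!("{:.0}%", budget_fraction(120.0, 5.0) * 100.0);
}
```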

This exposes the fundamental tension: garbage collectors optimize long-term throughput and developer ergonomics, while game engines optimize worst-case timing guarantees. A single unexpected spike can disrupt immersion. For that reason, high-performance engines either avoid GC in hot paths or carefully structure allocations so collection never occurs during gameplay-critical moments. The issue isn't that GC is inherently flawed, it's that real-time systems demand stricter temporal guarantees than most collectors are designed to provide.


GCs vs. Game Engines

Garbage collectors and game engines operate under different assumptions about time. A GC assumes the program can occasionally pause to reorganize memory, trading brief interruptions for long-term convenience. A game engine assumes it cannot pause at all during a frame. In typical desktop applications, a short stall is barely noticeable. In a 120 FPS game, the same stall consumes a substantial fraction of the frame window and immediately breaks smoothness. Real-time systems prioritize worst-case behavior, not averages. Combined with the high allocation churn from particles, physics interactions, AI decisions, and temporary gameplay state, collectors can be placed under constant pressure. The clash isn't ideological, it's architectural.
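One common way engines sidestep that allocation churn is to preallocate a pool at startup and reuse slots, so the per-frame hot path never touches the allocator. A minimal sketch (the `Particle` type and pool layout are illustrative, not from any engine):

```rust
#[derive(Clone, Copy)]
struct Particle {
    pos: f32,
    vel: f32,
    alive: bool,
}

// A fixed-capacity pool allocated once at startup. The per-frame
// update never allocates, so there is nothing for a collector
// (or for `free`) to interrupt the frame with.
struct ParticlePool {
    particles: Vec<Particle>,
}

impl ParticlePool {
    fn with_capacity(n: usize) -> Self {
        Self { particles: vec![Particle { pos: 0.0, vel: 0.0, alive: false }; n] }
    }

    // Reuse a dead slot instead of allocating a new particle.
    fn spawn(&mut self, pos: f32, vel: f32) -> bool {
        if let Some(p) = self.particles.iter_mut().find(|p| !p.alive) {
            *p = Particle { pos, vel, alive: true };
            true
        } else {
            false // pool exhausted: drop the effect rather than allocate mid-frame
        }
    }

    // Allocation-free per-frame update: one tight loop.
    fn update(&mut self, dt: f32) {
        for p in self.particles.iter_mut().filter(|p| p.alive) {
            p.pos += p.vel * dt;
        }
    }
}

fn main() {
    let mut pool = ParticlePool::with_capacity(1024);
    pool.spawn(0.0, 2.0);
    pool.update(0.5);
    println!("{}", pool.particles[0].pos); // 1
}
```

This is exactly the "structure allocations so collection never occurs during gameplay-critical moments" strategy: the cost is paid once, up front, where a hitch cannot hurt.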


Thought Experiment: DOOM (1993) Rewritten in C++, Java, JavaScript, Python, and Rust


Baseline: Original DOOM (C)

  • 35 FPS fixed tick
  • ~500-700 KB binary
  • ~8-12 MB RAM
  • Deterministic
  • No GC

C++

  • 35 FPS
  • Slightly larger binary
  • Deterministic
  • Nearly identical to C

Rust

  • 35 FPS
  • Slightly larger binary
  • Deterministic
  • Safer guarantees
  • No runtime GC cost

Equivalent to C in raw performance characteristics.
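DOOM's fixed 35 Hz tick is what makes it deterministic: simulation advances in constant increments regardless of rendering speed, which is also why its demo playback reproduces exactly. A hedged sketch of such a fixed-timestep loop in Rust (the `simulate` body is a stand-in for game logic):

```rust
use std::time::{Duration, Instant};

const TICK_RATE: u32 = 35; // DOOM's fixed tick rate

fn simulate(tick: u64) {
    // Stand-in for input, AI, physics, and gameplay logic.
    // A constant dt per tick is what makes replays reproducible.
    let _ = tick;
}

fn main() {
    let tick_duration = Duration::from_secs(1) / TICK_RATE; // ~28.57 ms
    let mut next_tick = Instant::now();
    let mut tick: u64 = 0;

    // Run a handful of ticks for the sketch; a real engine loops until quit.
    while tick < 5 {
        simulate(tick);
        tick += 1;
        // Advance the deadline from the previous deadline, not from
        // `now`, so small sleep inaccuracies don't accumulate as drift.
        next_tick += tick_duration;
        let wait = next_tick.saturating_duration_since(Instant::now());
        std::thread::sleep(wait);
    }
}
```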


Java (1993 JVM)

  • 5-15 FPS
  • High memory usage
  • GC pauses
  • Not viable in 1993 (Java itself only shipped in 1995)

A modern JVM could reach 35 FPS, but garbage collection and runtime overhead would still complicate strict determinism.


JavaScript (Modern Engines)

  • 35 FPS possible
  • Large memory footprint
  • GC pauses
  • Non-deterministic timing

Python

  • 1-5 FPS in pure Python
  • Heavy interpreter overhead
  • GC pauses
  • Not viable

Summary Table

Language     FPS   GC   Determinism  1993 Viable
C            35    No   Yes          Yes
C++          35    No   Yes          Yes
Rust         35    No   Yes          N/A
Java (1993)  5-15  Yes  No           No
JavaScript   35    Yes  No           No
Python       1-5   Yes  No           No

Expanded Discussion: ECS, Unity vs Bevy, WASM, and the Carmack Philosophy

When debates arise about garbage collection in games, they often overlook a deeper architectural narrative. The question isn't merely about memory management: it's about data layout, execution patterns, and the philosophy behind performance decisions.

DOOM: OOP vs. ECS Before ECS Existed

The original DOOM predates the formalization of Entity Component Systems, yet its structure resembles a primitive ECS. Instead of deep inheritance hierarchies and pointer-heavy object graphs, it relied on flat arrays and compact data layouts. Game objects were processed in predictable, linear loops. Memory access was contiguous and cache-friendly long before "cache locality" became a mainstream topic.

This wasn't stylistic preference, it was necessity. Early 1990s hardware punished indirection and cache misses. Iterating over arrays was simply the fastest option available.

Modern ECS formalizes this approach. Data is separated into components stored contiguously, and systems iterate over them in tight loops. The benefit isn't just architectural clarity, it's alignment with how CPUs actually operate. Processors reward linear memory access and predictable execution paths. In that sense, ECS isn't revolutionary; it's a disciplined evolution of principles high-performance programmers were already practicing under tighter constraints.


DOOM was effectively proto-ECS:

  • Flat arrays
  • Minimal pointer indirection
  • Tight loops over contiguous memory

Modern ECS refines and systematizes these instincts.
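The proto-ECS idea — components in flat, contiguous arrays with systems iterating over them in tight loops — can be sketched without any engine at all. This struct-of-arrays layout is illustrative; it is not Bevy's actual storage:

```rust
// Struct-of-arrays layout: each component lives in its own contiguous
// Vec, so a system touches only the data it needs and the CPU
// prefetcher sees a linear access pattern instead of pointer chasing.
struct World {
    pos_x: Vec<f32>,
    pos_y: Vec<f32>,
    vel_x: Vec<f32>,
    vel_y: Vec<f32>,
}

impl World {
    fn spawn(&mut self, x: f32, y: f32, vx: f32, vy: f32) {
        self.pos_x.push(x);
        self.pos_y.push(y);
        self.vel_x.push(vx);
        self.vel_y.push(vy);
    }

    // A "system": one tight loop over contiguous memory, much like
    // DOOM iterating its flat arrays of map objects.
    fn movement_system(&mut self, dt: f32) {
        for i in 0..self.pos_x.len() {
            self.pos_x[i] += self.vel_x[i] * dt;
            self.pos_y[i] += self.vel_y[i] * dt;
        }
    }
}

fn main() {
    let mut world = World { pos_x: vec![], pos_y: vec![], vel_x: vec![], vel_y: vec![] };
    world.spawn(0.0, 0.0, 1.0, 2.0);
    world.movement_system(1.0);
    println!("({}, {})", world.pos_x[0], world.pos_y[0]); // (1, 2)
}
```

Compare this with an object-oriented design where each entity is a heap allocation behind a pointer: the loop body is the same, but every iteration risks a cache miss.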


Unity (GC) vs. Bevy (Rust)

Now consider two contemporary engines: Unity and Bevy.

Unity, built on C#, uses a garbage-collected runtime. This lowers the barrier to entry and accelerates iteration. Memory management is largely automated, and developers can focus on gameplay rather than allocation bookkeeping. Unity's success proves that GC does not make game development impossible.

However, experienced Unity developers quickly learn to manage allocations carefully. They avoid creating objects inside update loops, rely on object pooling, and monitor GC spikes in profiling tools. In practice, performance-sensitive Unity code often behaves like manual memory management inside a managed environment. The collector remains present, but disciplined coding minimizes its impact.

Bevy, written in Rust, takes a different route. It embraces an ECS-first architecture and Rust's ownership model. There is no garbage collector; memory safety is enforced at compile time. The trade-off is upfront complexity (lifetimes, borrowing rules, architectural rigor). The reward is deterministic execution. If no allocations occur, no runtime system will intervene.

The distinction isn't about modernity, it's about where complexity resides. Unity concentrates more complexity at runtime and in developer discipline. Bevy shifts that complexity into compile-time guarantees and structural design. Both can ship games. They simply embody different philosophies of control.


Feature         Unity    Bevy
GC Pauses       Yes      No
Predictability  Medium   High
Safety          Runtime  Compile-time

Unity succeeds despite GC --- but often by carefully working around it.


The Carmack Philosophy

John Carmack consistently emphasized simplicity, data locality, and performance transparency. His approach was less about paradigm and more about mechanical sympathy: understand what the hardware does, minimize hidden costs, and structure data to match how processors behave.

Garbage collectors introduce a layer of opacity. They simplify memory management but abstract away timing details. Techniques like ECS, flat arrays, and arena allocators align more closely with Carmack's philosophy because they make costs visible and predictable.

Managed languages are not inherently inferior. But real-time systems reward architectural honesty. When targeting a 4-millisecond frame budget at 240 FPS, predictability becomes more valuable than convenience.
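An arena (or bump) allocator is a good example of that architectural honesty: allocation is a pointer bump, and the whole arena is reclaimed at once at the end of the frame. A minimal index-based sketch, simplified to avoid unsafe code (real arenas typically hand out references):

```rust
// A per-frame bump arena: "allocating" is a push into a preallocated
// Vec, and reclaiming everything is a single `clear` at frame end.
// Both costs are visible and O(1); nothing happens behind the scenes.
struct FrameArena<T> {
    items: Vec<T>,
}

impl<T> FrameArena<T> {
    fn with_capacity(n: usize) -> Self {
        Self { items: Vec::with_capacity(n) }
    }

    // Allocate by bumping the length; returns an index handle.
    // (If capacity is exceeded, the Vec reallocates; a real arena
    // would instead fail or chain a new block.)
    fn alloc(&mut self, value: T) -> usize {
        self.items.push(value);
        self.items.len() - 1
    }

    fn get(&self, handle: usize) -> &T {
        &self.items[handle]
    }

    // Free everything at once at the end of the frame. `clear` keeps
    // the capacity, so the next frame reuses the same memory.
    fn end_frame(&mut self) {
        self.items.clear();
    }
}

fn main() {
    let mut arena: FrameArena<[f32; 3]> = FrameArena::with_capacity(4096);
    let h = arena.alloc([1.0, 2.0, 3.0]);
    println!("{:?}", arena.get(h));
    arena.end_frame(); // all per-frame data gone, no per-object frees
}
```

Against a 4-millisecond budget, this predictability is the whole point: the programmer, not a collector, decides exactly when memory work happens.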


Conclusion

At the intersection of language design and game engine architecture lies a single recurring theme: trade-offs. Haskell and Rust may share philosophical DNA, but they diverge in purpose — one refining abstraction and mathematical clarity, the other enforcing safety under real-world constraints. The same tension appears in the garbage collection debate. Managed runtimes optimize for developer velocity and long-term throughput; game engines optimize for deterministic timing and worst-case guarantees. DOOM’s data-oriented structure, modern ECS patterns, and Rust’s ownership model all point toward the same underlying principle: performance-critical systems reward predictability, transparency, and mechanical sympathy with the hardware. None of these tools are inherently superior in isolation... but in real-time environments where milliseconds define the experience, control almost always wins over convenience.