The CS 6120 Course Blog

When Prototypes Learned to Run Fast: SELF and the Birth of Adaptive Optimization

by Jake Hyun, Tobi Weinberg, Adnan Armouti

Introduction: The Core Contribution of SELF

The SELF paper is often misunderstood as focusing purely on language design (prototypes, dynamic typing). Its true contribution lies in being one of the earliest successful attempts to take a maximally flexible language model and make it fast using adaptive optimization.

Many concepts central to modern Just-In-Time (JIT) compilers—including speculation, inlining, and tracking object shapes—originated here. In 1989, SELF challenged the prevailing wisdom that dynamic languages would inevitably be slow, changing the trajectory of dynamic language performance.

A Quick Primer on the SELF Language

SELF is characterized by its extreme dynamism and uniformity:

- Everything is an object; there are no classes. New objects are created by cloning existing prototypes.
- Objects are collections of slots, which uniformly hold both state and behavior.
- All computation, even variable access and arithmetic, is expressed as message sends.
- Inheritance works through ordinary parent slots, which can be reassigned at runtime.

This elegance creates a challenge: a naive implementation would be overwhelmed by the cost of dynamic lookups.

How the System Achieved Performance

The SELF implementation introduced several foundational ideas to manage the late-bound nature of the language.

1. Maps (Hidden Classes)

To reclaim efficiency, the system creates Maps. These are shared, internal descriptors of an object’s memory layout and slot metadata.
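A minimal sketch of the idea in Python (invented names, not the actual SELF runtime): objects cloned from the same prototype share one map, so each object stores only a flat array of slot values, and slot lookup becomes one shared offset lookup plus a fast array index.

```python
class Map:
    """Shared layout descriptor: slot name -> storage offset."""
    def __init__(self, slot_names):
        self.offsets = {name: i for i, name in enumerate(slot_names)}

class Obj:
    """An object is just a reference to its map plus its slot values."""
    def __init__(self, map_, values):
        self.map = map_
        self.values = list(values)

    def get(self, name):
        # One lookup in the shared map, then a cheap array index.
        return self.values[self.map.offsets[name]]

# Two clones of the same prototype share a single map.
point_map = Map(["x", "y"])
p1 = Obj(point_map, [1, 2])
p2 = Obj(point_map, [3, 4])
assert p1.map is p2.map   # layout metadata stored once, not per object
assert p2.get("y") == 4
```

Because the map is shared, per-object memory shrinks to the value array, and the map's identity doubles as a compact stand-in for "this object's shape" in later optimizations.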

2. Customized Dynamic Compilation

Instead of compiling one generic version of a method, SELF compiles many versions, specialized for each receiver “shape” (map).
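To make customization concrete, here is a toy Python sketch (all names invented, not SELF's actual machinery): a "compiler" that bakes a map's slot offsets into a specialized function as constants, caching one compiled version per map so later calls on same-shaped receivers skip dynamic lookup entirely.

```python
class Map:
    def __init__(self, slot_names):
        self.offsets = {n: i for i, n in enumerate(slot_names)}

def customize_sum(map_):
    """Pretend-compile 'x + y' for one map: offsets become constants."""
    ox, oy = map_.offsets["x"], map_.offsets["y"]
    def specialized(values):
        return values[ox] + values[oy]   # no dictionary lookups at run time
    return specialized

cache = {}  # map -> specialized compiled code

def send_sum(map_, values):
    fn = cache.get(map_)
    if fn is None:
        fn = cache[map_] = customize_sum(map_)  # compile once per shape
    return fn(values)

m = Map(["x", "y"])
assert send_sum(m, [3, 4]) == 7
assert send_sum(m, [10, 20]) == 30   # reuses the cached specialized version
```

The trade-off the class later criticizes is visible here: every distinct map triggers another compiled copy, so code size grows with the number of shapes.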

3. Message Splitting and Type Prediction

For ubiquitous messages such as + and ifTrue:, the compiler predicts the most likely receiver type (e.g., a small integer or a boolean) and emits a cheap runtime type test. Message splitting then duplicates the code that follows the test, so each copy is compiled with exact type knowledge and its sends can be inlined directly.
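A hedged Python sketch of the combination (invented names; SELF does this at the machine-code level): one predicting type test guards a split copy of the code in which every subsequent send runs with known types, while the other copy keeps the fully generic sends.

```python
def generic_send(receiver, selector, arg):
    """Stand-in for a full dynamic message send (the slow path)."""
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    return ops[selector](receiver, arg)

def magnitude_squared(x, y):
    # Type prediction: one cheap test covers the whole expression.
    if type(x) is int and type(y) is int:
        # Split copy: types are known, so every '*' and '+' send
        # is inlined to a primitive operation.
        return x * x + y * y
    # Unsplit copy: generic sends, each paying full dynamic dispatch.
    return generic_send(generic_send(x, "*", x), "+",
                        generic_send(y, "*", y))

assert magnitude_squared(3, 4) == 25
```

Without splitting, each send would need its own type test; duplicating the downstream code lets one test amortize over many inlined operations.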

4. Primitive Inlining

Operations like arithmetic and slot access, which appear as slow message sends in the source code, are recognized and aggressively inlined to just a few machine instructions.
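A sketch of an inlined integer-add primitive (Python standing in for machine code; the 30-bit small-integer range is an illustrative assumption, echoing tagged-integer representations): a type guard, the raw add, an overflow check, and a fallback to the general send when any check fails.

```python
SMALL_INT_MAX = 2 ** 30  # illustrative tagged-small-integer bound

def slow_path_add(a, b):
    """Stand-in for the full, general message send."""
    return a + b

def int_add(a, b):
    # Source-level 'a + b' is a message send; the compiler recognizes
    # the integer-add primitive and inlines it to a few instructions.
    if type(a) is int and type(b) is int:
        result = a + b
        if -SMALL_INT_MAX <= result < SMALL_INT_MAX:
            return result          # fast path: a handful of instructions
    # Type guard or overflow check failed: take the general path.
    return slow_path_add(a, b)

assert int_add(2, 3) == 5          # fast path
assert int_add(1.5, 2) == 3.5      # non-integer operand -> slow path
```

The important property is that the common case never performs a lookup; the slow path exists only to preserve full generality.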

5. Staying Interactive

Remarkably, SELF maintained a live programming environment. It supported:

- On-demand compilation, with methods compiled lazily the first time they are invoked.
- Incremental recompilation: editing an object invalidates only the compiled code that depends on it.
- Compilation pauses short enough to preserve an interactive, exploratory workflow.

Class Discussion and Key Themes

Flexibility vs. Performance Trade-off

We noted that SELF’s extreme flexibility comes at a significant cost, forcing the compiler to work very hard. The discussion questioned whether dynamic typing still offers a meaningful advantage for rapid prototyping, given the capabilities of modern static languages and tools (e.g., TypeScript, Rust, Scala).

Missing Costs in Evaluation

The evaluation’s focus on small benchmarks was a weakness. The paper largely ignored the critical costs of compile-time overhead, memory use, and code-size growth (due to message splitting), which were especially relevant on 1980s hardware.

Maps as Implicit Structure

While maps keep the language clean, their implicit nature hides performance behavior from the programmer. This means a developer cannot explicitly “lock in” an object’s shape to assist the optimizer.

Legacy and Surviving Ideas

The influence of SELF on modern runtimes is profound:

- Maps live on as the hidden classes of V8 and the shapes of SpiderMonkey.
- Inline caches, and the polymorphic inline caches later developed for SELF, are standard in today’s JavaScript engines.
- Adaptive, profile-guided recompilation reached the mainstream through the HotSpot JVM, built in part by veterans of the SELF project.

The key difference today is deoptimization: modern systems jump back to a baseline form when speculation fails, rather than discarding the code entirely.
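A toy Python sketch of that pattern (invented names, not any particular VM): a method keeps an always-correct baseline version alongside an optional speculative version; when a speculation guard fails, the system discards only the optimized code and resumes in the baseline.

```python
class GuardFailure(Exception):
    """Raised when a speculative assumption turns out to be false."""

class Method:
    def __init__(self, baseline):
        self.baseline = baseline   # generic version, always correct
        self.optimized = None      # speculative version, if compiled

    def call(self, arg):
        if self.optimized is not None:
            try:
                return self.optimized(arg)
            except GuardFailure:
                # Deoptimize: drop the speculative code and continue
                # in the baseline instead of failing outright.
                self.optimized = None
        return self.baseline(arg)

def baseline_double(x):
    return x + x                   # handles any type supporting '+'

def optimized_double(x):
    if type(x) is not int:         # speculation: x is always an int
        raise GuardFailure()
    return x << 1                  # specialized integer fast path

m = Method(baseline_double)
m.optimized = optimized_double
assert m.call(5) == 10             # fast path taken
assert m.call("ab") == "abab"      # guard fails -> fall back to baseline
assert m.optimized is None         # speculative code has been discarded
```

A real VM would also rebuild interpreter state at the failure point and eventually recompile with better type information, but the baseline-fallback shape is the same.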

Metrics and Relevance

The MiMS (millions of messages per second) metric did not impress anyone. It is too tied to the message-passing model. Modern evaluation prioritizes throughput, latency, warm-up time, and memory footprint.

Designing a Modern SELF-like Language

To retain the spirit of SELF while being more practical, the class proposed:

Conclusion

The SELF paper’s lasting legacy is proving that high performance is achievable in a highly dynamic model through aggressive, speculative implementation techniques. Its ideas are now foundational to how dynamic language VMs are built. The core lesson remains: performance does not have to dictate language design if the runtime is smart enough.