
A Tour of Metaprogramming Models for Generics

This article surveys how various programming languages implement generic programming. It compares approaches such as boxing, monomorphisation, type erasure, vtables, dictionary passing, and compile-time code generation, illustrating each with code examples from Go, Java, Rust, C++, D, and others.


When designing programs we often want a single data structure or algorithm to work with many element types. Different languages solve this problem in different ways, ranging from simple generic functions to full‑blown, Turing‑complete generic systems. This article walks through the evolution of generic mechanisms across many languages, starting with languages that lack generics (e.g., C) and moving toward increasingly sophisticated systems.

Basic ideas

Two fundamental strategies address the generic problem: boxing (storing values behind a uniform pointer or interface) and monomorphisation (generating a separate copy of code for each concrete type). Boxing simplifies code reuse but incurs runtime overhead, while monomorphisation yields fast code at the cost of larger binaries and longer compile times.

Boxing

In Go, a generic stack can be written using interface{} values:

type Stack struct {
  values []interface{}
}

func (s *Stack) Push(v interface{}) { s.values = append(s.values, v) }
func (s *Stack) Pop() interface{} {
  x := s.values[len(s.values)-1]
  s.values = s.values[:len(s.values)-1]
  return x
}
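As a usage sketch: because Pop returns an interface{}, the caller must type-assert the boxed value back to its concrete type, a runtime check that boxing makes unavoidable. A minimal, runnable example (repeating the Stack definition above so it stands alone):

```go
package main

import "fmt"

// Stack boxes every element behind interface{}, as in the example above.
type Stack struct {
	values []interface{}
}

func (s *Stack) Push(v interface{}) { s.values = append(s.values, v) }
func (s *Stack) Pop() interface{} {
	x := s.values[len(s.values)-1]
	s.values = s.values[:len(s.values)-1]
	return x
}

func main() {
	var s Stack
	s.Push(42)
	s.Push("hello")
	// The caller must type-assert the boxed value back to a concrete type;
	// the check happens at runtime, not compile time.
	v, ok := s.Pop().(string)
	fmt.Println(v, ok)
}
```

Note that nothing stops a caller from pushing mixed types into the same stack; mistakes surface only when a type assertion fails at runtime.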

Similar patterns appear in C (using void*), pre-generics Java (Object), and Objective-C (id).

Type‑erasure boxing

Languages such as Java and Objective‑C later added generics on top of their existing boxed collections, erasing type information at runtime while providing compile‑time checks:

List v = new ArrayList();
v.add("test"); // compiles, but casting the element to Integer fails at runtime with a ClassCastException

List<Integer> v = new ArrayList<>();
v.add("test"); // rejected at compile time: incompatible types

Interface vtables

Go’s interface implementation and Rust’s dyn Trait objects use a hidden pointer to a vtable that stores function pointers for the concrete type. This enables generic code to call type‑specific operations without boxing the data itself.

type Interface interface {
  Len() int
  Less(i, j int) bool
  Swap(i, j int)
}

Java implements interfaces similarly, embedding a vtable pointer at the start of each object.

Reflection

Languages with vtables can also expose runtime type information (field names, types, offsets), enabling reflection‑based features such as serialization. Java, C#, and Go all provide reflective APIs, though they may incur performance penalties.

Dynamic‑type languages

Python, Ruby, and other dynamic languages rely on hash‑table‑based objects and extensive reflection, allowing arbitrary fields and methods to be added at runtime. Their JITs (e.g., V8) use hidden‑class structures similar to vtables.

Dictionary passing

Haskell’s type‑class dictionaries and OCaml’s first‑class modules pass explicit function tables to generic code, enabling compile‑time specialization while avoiding boxing.
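Go has no type classes, but the idea can be sketched by hand: pass a table of operations to generic code as an explicit value. The OrdDict struct below is a hypothetical stand-in for the dictionary a Haskell compiler would pass implicitly:

```go
package main

import "fmt"

// OrdDict is a hypothetical explicit dictionary of operations,
// playing the role a Haskell type-class dictionary plays implicitly.
type OrdDict[T any] struct {
	Less func(a, b T) bool
}

// Max is generic code that receives its operations as an ordinary value
// rather than discovering them via boxing or vtables.
func Max[T any](d OrdDict[T], a, b T) T {
	if d.Less(a, b) {
		return b
	}
	return a
}

func main() {
	intOrd := OrdDict[int]{Less: func(a, b int) bool { return a < b }}
	fmt.Println(Max(intOrd, 3, 7))
}
```

The data itself is never boxed; only the operations travel alongside it, which is what lets compilers specialize such calls away entirely.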

Swift witness tables

Swift combines dictionary passing with compile‑time inlining: generic code is monomorphised when possible, and a “witness table” carries the operations needed for truly generic cases.

Monomorphisation

Generating separate code for each concrete type can be done at the source level (C macros, Go genny), via language-specific facilities (D string mixins, Zig comptime), or by compiler phases (C++/D templates, Rust procedural macros).

template <typename T>
T myMax(T a, T b) { return a > b ? a : b; }

D’s isNumeric!T constraint and Rust’s fn my_max<T: PartialOrd>(a: T, b: T) -> T illustrate constrained monomorphisation.
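The same source-level shape exists in Go's generics (though Go's compiler mixes stenciling with runtime dictionaries rather than fully monomorphising every instantiation). A sketch, with an illustrative Ordered constraint restricting T to comparable types:

```go
package main

import "fmt"

// Ordered is an illustrative constraint admitting types that support ">".
type Ordered interface {
	~int | ~int64 | ~float64 | ~string
}

// myMax is instantiated per concrete type satisfying the constraint,
// analogous to the C++ template and Rust PartialOrd examples.
func myMax[T Ordered](a, b T) T {
	if a > b {
		return a
	}
	return b
}

func main() {
	fmt.Println(myMax(3, 7), myMax("a", "b"))
}
```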

Compile‑time functions and meta‑programming

D templates can evaluate at compile time, and Rust’s procedural macros transform token streams while preserving source locations. AST‑level macros exist in Template Haskell, Nim, OCaml PPX, and Lisp families.

Machine‑code monomorphisation

One speculative approach pushes monomorphisation to the binary level, copying machine‑code templates for each concrete type and patching them at link or JIT time. No concrete language currently implements this, but it illustrates a possible future direction.

The article concludes that understanding these diverse generic models helps developers choose the right approach for their language and performance requirements.

Written by High Availability Architecture

Official account for High Availability Architecture.