
Advanced Type Features: The Complete Picture

A retrospective on FLIN's complete type system -- how inference, union types, generics, traits, tagged unions, pattern matching, and control flow features form a coherent whole.

Thales & Claude | March 25, 2026 | 12 min read
Tags: flin, type-system, advanced, complete, retrospective

This is a retrospective. Over the past fourteen articles, we traced the evolution of FLIN's type system from primitive types and inference through union types, generics, traits, tagged unions, pattern matching, destructuring, pipeline operators, tuples, type guards, the never type, bounds, while-let loops, labeled loops, and or-patterns.

Each article told the story of one feature in isolation. This article tells the story of how they fit together -- how features designed in sequence form a coherent system, how design decisions in Session 97 enabled features in Session 155, and how the entire type system serves FLIN's core purpose: making application development simple, safe, and expressive.

The Feature Map

Before looking at connections, here is the complete feature inventory:

| Feature | Session(s) | Category |
| --- | --- | --- |
| Primitive types (int, number, text, bool) | Core | Foundation |
| Special types (time, money, file, semantic) | Core | Foundation |
| Type inference (bidirectional) | Core | Inference |
| Optional types (T?) | Core | Safety |
| Collection types ([T], [K:V]) | Core | Data |
| Entity types | Core | Data + Persistence |
| Type coercion (int -> number, etc.) | Core | Ergonomics |
| Elvis operator (?:) | 097 | Ergonomics |
| Destructuring (array, entity, nested) | 097-098 | Ergonomics |
| Union types (T \| U) | 100 | Expressiveness |
| Slicing (list[1:5:2]) | 100 | Ergonomics |
| Generic types (<T>) | 101 | Polymorphism |
| Traits and impl blocks | 133-136 | Constraints |
| Tagged unions (enum with data) | 145 | Expressiveness |
| Pattern matching (match) | 145-157 | Control flow |
| Exhaustiveness checking | 136, 147 | Safety |
| Never type | 136, 147 | Safety |
| Pipeline operator (\|>) | 150 | Ergonomics |
| Where clauses | 144, 150 | Constraints |
| Tuples | 142 | Data |
| Type guards (is) | 120 | Safety |
| While-let loops | 152 | Control flow |
| Break with value | 153 | Control flow |
| Labeled loops | 154 | Control flow |
| Or-patterns | 155 | Ergonomics |

Twenty-five features across approximately sixty sessions. Each one designed, implemented, tested, and documented.

The Dependency Graph

These features are not independent. They form a dependency graph where later features rely on earlier ones:

Primitives + Inference
    |
    +-- Optional types
    |       |
    |       +-- Type guards (is)
    |       |       |
    |       |       +-- Type narrowing
    |       |               |
    |       |               +-- Exhaustiveness checking
    |       |
    |       +-- Elvis operator (?:)
    |       +-- While-let loops
    |
    +-- Entity types
    |       |
    |       +-- Destructuring
    |
    +-- Collection types
    |       |
    |       +-- Slicing
    |       +-- Destructuring
    |       +-- Pipeline operator
    |
    +-- Union types
    |       |
    |       +-- Type narrowing
    |       +-- Tagged unions
    |       |       |
    |       |       +-- Pattern matching
    |       |       |       |
    |       |       |       +-- Or-patterns
    |       |       |       +-- Exhaustiveness checking
    |       |       |       +-- While-let patterns
    |       |       |
    |       |       +-- Never type
    |       |
    |       +-- Generic types
    |               |
    |               +-- Traits
    |               |       |
    |               |       +-- Generic bounds
    |               |       +-- Where clauses
    |               |
    |               +-- Tuples
    |
    +-- Control flow
            |
            +-- Break with value
            +-- Labeled loops
            +-- While-let

Every line in this graph represents a design dependency: the feature below could not exist without the feature above. Tagged unions require union types (because a generic enum like Result<T, E> = T | E uses the union type infrastructure). Pattern matching requires tagged unions (because matching on enum variants requires the variant data representation). Exhaustiveness requires pattern matching and the never type.

How Design Decisions Propagate

Some early design decisions had consequences that were not apparent until much later.

The Vec Decision

In Session 100, we represented union types as Union(Vec<Type>) rather than Union(Box<Type>, Box<Type>). This seemed like a minor implementation detail at the time. But it had cascading effects:

  • Or-patterns (Session 155) reused the same Vec<Pattern> approach, because the pattern A | B | C mirrors the type A | B | C.
  • Exhaustiveness checking (Session 147) could iterate over union members as a flat list, simplifying the algorithm.
  • Type subtraction -- removing a type from a union -- was a simple retain operation on the vector.

If we had used a binary representation, each of these features would have needed recursive handling of nested unions.
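As a sketch in Rust (the names here are illustrative, not FLIN's actual internals), the flat-Vec representation makes type subtraction a one-liner, with no recursion into nested unions:

```rust
// Hypothetical sketch of the flat-Vec union representation described above.
#[derive(Clone, Debug, PartialEq)]
enum FlinType {
    Int,
    Text,
    Bool,
    Union(Vec<FlinType>),
}

// Type subtraction: removing a member from a union is a single `retain`.
fn subtract(union: &mut Vec<FlinType>, removed: &FlinType) {
    union.retain(|t| t != removed);
}

fn main() {
    // int | text | bool, after a guard narrows away `text`
    let mut members = vec![FlinType::Int, FlinType::Text, FlinType::Bool];
    subtract(&mut members, &FlinType::Text);
    assert_eq!(members, vec![FlinType::Int, FlinType::Bool]);
    println!("{:?}", FlinType::Union(members));
}
```

With a binary `Union(Box<Type>, Box<Type>)` encoding, the same operation would have to rebuild the tree while walking both branches.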

The Separate DestructuringDecl Decision

In Session 097, we created Stmt::DestructuringDecl as a separate statement type rather than modifying VarDecl. This seemed conservative at the time -- a way to avoid touching 190 call sites.

But it paid dividends later. When while-let loops (Session 152) needed pattern matching in loop conditions, they could use the same Pattern enum without any interaction with VarDecl. When for-loop destructuring was added, it used the same Pattern infrastructure. The separate type meant that patterns were a standalone concept, usable in any context.
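The payoff can be sketched in Rust (all names here are hypothetical, not FLIN's real AST): because `Pattern` is a standalone type, every construct that binds names can hold one, and any pass that walks patterns is written once.

```rust
// Illustrative sketch: Pattern as a standalone concept, reused verbatim.
enum Pattern {
    Binding(String),                // x
    Array(Vec<Pattern>),            // [a, b, ...rest]
    Entity(Vec<(String, Pattern)>), // {name, age}
}

#[allow(dead_code)]
enum Stmt {
    VarDecl { name: String },               // legacy path, untouched
    DestructuringDecl { pattern: Pattern }, // Session 097
    WhileLet { pattern: Pattern },          // Session 152 reuses Pattern as-is
    ForDestructure { pattern: Pattern },    // later reuse, same enum
}

// A pattern walk (here: counting bound names) works in every context.
fn count_bindings(p: &Pattern) -> usize {
    match p {
        Pattern::Binding(_) => 1,
        Pattern::Array(items) => items.iter().map(count_bindings).sum(),
        Pattern::Entity(fields) => fields.iter().map(|(_, p)| count_bindings(p)).sum(),
    }
}

fn main() {
    // [first, {name, age}] binds three names, in any statement kind.
    let p = Pattern::Array(vec![
        Pattern::Binding("first".into()),
        Pattern::Entity(vec![
            ("name".into(), Pattern::Binding("name".into())),
            ("age".into(), Pattern::Binding("age".into())),
        ]),
    ]);
    assert_eq!(count_bindings(&p), 3);
}
```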

The Desugaring Approach

In Session 150, the pipeline operator was implemented by desugaring to function calls at parse time. No new AST node. No new type checker rules. No new bytecode.

This approach was then applied to other features. While-let desugars to a loop with a pattern check. Or-patterns desugar to a series of checks with a shared body. Break with value desugars to a store-and-jump sequence.

The desugaring philosophy kept the compiler core small. The parser handles complexity; the downstream passes handle simplicity.
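A minimal Rust sketch of the idea (toy AST, hypothetical names; assuming the pipeline inserts the left-hand side as the first argument, as in many pipeline designs):

```rust
// Hypothetical sketch of parse-time desugaring: `x |> f(a)` becomes
// `f(x, a)` before the type checker ever sees it, so no new AST node,
// no new type rules, no new bytecode.
#[derive(Debug, PartialEq, Clone)]
enum Expr {
    Var(String),
    Call { func: String, args: Vec<Expr> },
}

// Desugar one pipeline step by prepending the left-hand side as the
// first argument of the right-hand call.
fn desugar_pipe(lhs: Expr, rhs: Expr) -> Expr {
    match rhs {
        Expr::Call { func, mut args } => {
            args.insert(0, lhs);
            Expr::Call { func, args }
        }
        // A bare function name becomes a one-argument call.
        Expr::Var(name) => Expr::Call { func: name, args: vec![lhs] },
    }
}

fn main() {
    // data |> filter(pred)  ==>  filter(data, pred)
    let desugared = desugar_pipe(
        Expr::Var("data".into()),
        Expr::Call { func: "filter".into(), args: vec![Expr::Var("pred".into())] },
    );
    let expected = Expr::Call {
        func: "filter".into(),
        args: vec![Expr::Var("data".into()), Expr::Var("pred".into())],
    };
    assert_eq!(desugared, expected);
}
```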

The Coherence Test

A type system is coherent if its features interact predictably. Here are several interactions that work correctly because of deliberate design:

Generic + Union + Pattern Matching

enum Result<T, E> {
    Ok(T),
    Err(E)
}

fn handle<T: Printable, E: Printable>(result: Result<T, E>) -> text {
    match result {
        Ok(value) -> "Success: " + value.to_text()
        Err(error) -> "Error: " + error.to_text()
    }
}

This uses generics, trait bounds, tagged unions, and pattern matching simultaneously. The compiler:

1. Resolves the generic parameters T and E
2. Validates the Printable bounds on both
3. Checks the match is exhaustive (Ok and Err cover all variants)
4. Narrows T inside the Ok arm and E inside the Err arm
5. Verifies that .to_text() is available (via the Printable bound)

Five features interacting correctly.

Pipeline + Destructuring + Type Guards

data: [int | text] = [1, "hello", 2, "world", 3]

numbers = data
    |> filter(x => x is int)
    |> map(x => x * 2)

[first, second, ...rest] = numbers

Pipeline feeds data through transformations. Type guard (is int) narrows the union type in the filter. Destructuring unpacks the result. The type of first is int -- the compiler traced the type through three features.

While-Let + Tagged Union + Break Value

enum Token {
    Number(int),
    Text(text),
    End
}

fn find_first_number(tokens: [Token]) -> int? {
    index = 0
    while let token = tokens[index] {
        match token {
            Number(n) -> break n
            Text(_) -> { index++; continue }
            End -> break
        }
        index++
    }
}

While-let iterates. Pattern matching dispatches on the token variant. Break with value returns the found number. The result type is int? because the loop might not find a number.

Labeled Loop + Or-Pattern + Exhaustiveness

enum Priority { Critical, High, Medium, Low }

'scan: for task in tasks {
    match task.priority {
        Critical | High -> {
            urgent_tasks.push(task)
            if urgent_tasks.len >= max {
                break 'scan
            }
        }
        Medium | Low -> continue
    }
}

Or-patterns combine priority levels. Labeled break exits when enough urgent tasks are found. The match is exhaustive because Critical | High and Medium | Low cover all four variants.

The Implementation Numbers

At the end of Session 157, FLIN's type system implementation comprised:

| Component | Approximate Lines |
| --- | --- |
| FlinType enum and operations | 800 |
| Type inference engine | 1,200 |
| Type compatibility checker | 600 |
| Pattern matching type checking | 500 |
| Exhaustiveness checker | 400 |
| Trait registry and validation | 500 |
| Generic type substitution | 300 |
| Error message generation | 400 |
| Total type system | ~4,700 |

The test suite grew proportionally:

| Category | Test Count |
| --- | --- |
| Type inference tests | ~200 |
| Type compatibility tests | ~150 |
| Pattern matching tests | ~100 |
| Generic type tests | ~80 |
| Trait bound tests | ~50 |
| Exhaustiveness tests | ~40 |
| Integration (end-to-end) tests | ~450 |
| Total | ~1,070 type-system-related tests |

Every feature was tested at the unit level (parser, type checker, code gen independently) and at the integration level (full compile-and-run tests).

What We Would Do Differently

No design survives contact with reality perfectly. A few things we would reconsider:

Type aliases could be more powerful. FLIN's type Option<T> = T? is a simple alias. It does not create a new nominal type. This means Option<int> and int? are interchangeable -- which is convenient but loses the semantic distinction. A future version might support nominal type aliases that create truly distinct types.

Trait composition could use + in more places. Currently, T: A + B works in bounds, but you cannot write type Combined = A + B to combine traits. This is a gap that occasionally requires workarounds.

Error recovery in the type checker could be better. When the type checker encounters an error, it sometimes stops checking subsequent expressions in the same block. Continuing past errors and reporting multiple diagnostics would give developers a more complete picture.
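The recovery strategy can be sketched in a few lines of Rust (the checker shape and the stand-in "check" are entirely hypothetical): record each failure and keep going, so one pass reports every diagnostic in the block.

```rust
// Hypothetical sketch of error recovery: collect diagnostics instead of
// stopping at the first failure.
#[derive(Debug)]
struct Diagnostic(String);

fn check_block(exprs: &[&str]) -> Vec<Diagnostic> {
    let mut errors = Vec::new();
    for expr in exprs {
        // Stand-in for a real type check: anything containing '?' fails.
        if expr.contains('?') {
            errors.push(Diagnostic(format!("cannot type `{expr}`")));
            // Recover: fall through and keep checking the rest of the block
            // instead of returning early.
        }
    }
    errors
}

fn main() {
    let errors = check_block(&["a + 1", "b ?", "d * 2", "?x"]);
    // Both bad expressions are reported, not just the first.
    assert_eq!(errors.len(), 2);
}
```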

Incremental type checking is not implemented. Every change re-checks the entire file. For small files, this is instant. For large programs, it could become a bottleneck. Incremental checking -- only re-checking expressions affected by a change -- is a future optimization.

What We Got Right

Several early decisions proved exactly right:

Bidirectional inference. Forward inference (value determines type) handles 90% of cases. Backward inference (context determines type) handles the remaining 10%. Together, they eliminate nearly all explicit type annotations.

The type hierarchy with int < number. Having int as a subtype of number means arithmetic works naturally. Adding an int to a number produces a number. No explicit casts needed.
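Reduced to a sketch in Rust (illustrative names, not FLIN's actual checker), the subtype relation amounts to a check like:

```rust
// Hypothetical sketch of a subtype check with int < number.
#[derive(PartialEq)]
enum FlinType { Int, Number, Text }

// A value is usable where a supertype is expected if its type is the
// same type or a declared subtype (here, only int < number).
fn is_subtype(sub: &FlinType, sup: &FlinType) -> bool {
    sub == sup || matches!((sub, sup), (FlinType::Int, FlinType::Number))
}

fn main() {
    assert!(is_subtype(&FlinType::Int, &FlinType::Number));  // int usable as number
    assert!(!is_subtype(&FlinType::Number, &FlinType::Int)); // not the reverse
    assert!(!is_subtype(&FlinType::Text, &FlinType::Number));
}
```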

Nominal traits over structural interfaces. Explicit impl blocks make trait relationships visible and searchable. Error messages can name the specific trait that is missing. The small cost (writing impl blocks) pays for itself in clarity.

Exhaustiveness as an error, not a warning. Making non-exhaustive matches a hard error catches bugs at compile time that would otherwise surface in production. Every developer who adds a variant to an enum is guided by the compiler to update every match.

Desugar at the parser level. Pipeline operators, while-let, and or-patterns all desugar to simpler constructs before the type checker sees them. This keeps the type checker focused on fundamental type operations and avoids feature-specific special cases.

The Philosophy in Retrospect

Looking back over the entire type system arc, a philosophy emerges: make the type system invisible when it can be, and visible when it must be.

Invisible: type inference means developers rarely write type annotations. Automatic coercion means int-to-number conversion just works. Optional propagation means null safety does not require ceremony.

Visible: union types explicitly declare what a value can be. Trait bounds explicitly declare what a generic type must support. Exhaustiveness checking explicitly requires handling every case. Error messages explicitly state what went wrong and how to fix it.

The balance point is different for different features. Inference should be invisible -- the developer should not think about types for most code. Exhaustiveness should be visible -- the developer should know that they handled every case.

This balance is what makes FLIN's type system suitable for application developers. It does not demand the level of type annotation that Rust or Haskell demand. It does not accept the level of type uncertainty that JavaScript or Python accept. It sits in a middle ground where types are present but unobtrusive, where safety is enforced but not burdensome.

That middle ground was the goal from the first session. One hundred and fifty sessions later, we achieved it.

What Comes Next

The type system arc is complete. The next arc of the "How We Built FLIN" series moves to a different domain: FLIN's temporal model. Time travel queries, entity history, the @ operator, temporal keywords, and the database infrastructure that makes "show me this record as it was last Tuesday" a one-line operation.

The type system will reappear throughout -- temporal operations return typed results, history queries produce typed lists, and the @ operator is type-checked like any other expression. But the focus shifts from how FLIN understands types to how FLIN understands time.


This is Part 45 of the "How We Built FLIN" series, documenting how a CEO in Abidjan and an AI CTO designed and implemented a programming language from scratch.

Series Navigation:
- [43] While-Let Loops and Break With Value
- [44] Labeled Loops and Or-Patterns
- [45] Advanced Type Features: The Complete Picture (you are here)
- [46] FLIN's Temporal Model (coming next)
