Session 097 began with a decision that set the tone for the entire destructuring implementation: do we modify the existing VarDecl statement, or do we create a new statement type?
The answer shaped not just the feature, but the development methodology. We chose a separate statement type, a stub-first implementation approach, and an incremental rollout across two sessions. The result was a destructuring system that handles arrays, entities, nested patterns, rest collection, and default values -- without breaking a single existing test.
The Problem Destructuring Solves
Without destructuring, extracting values from structured data requires verbose, repetitive code:
```flin
point = [10, 20, 30]
x = point[0]
y = point[1]
z = point[2]

user = getUser()
name = user.name
email = user.email
role = user.role
```

Three lines to unpack a list. Three lines to extract entity fields. In a language that values simplicity and expressiveness, this verbosity is a design failure.
With destructuring:
```flin
[x, y, z] = [10, 20, 30]
{ name, email, role } = getUser()
```

Two lines instead of eight. The structure of the left side mirrors the structure of the right side, and the compiler handles the extraction.
The Design Decision: Separate Statement Type
FLIN's AST had Stmt::VarDecl used in approximately 190 places across the codebase. Modifying it to support pattern-based left-hand sides would have required touching every one of those 190 places -- in the parser, type checker, code generator, formatter, and tests.
Instead, we created Stmt::DestructuringDecl:
```rust
Stmt::DestructuringDecl {
    pattern: Pattern,
    mutability: Mutability,
    value: Expr,
    span: Span,
}
```

This coexists with `VarDecl`. Simple variable declarations continue to use `VarDecl` exactly as before. Destructuring declarations use the new statement type. Zero changes to existing code paths.
The trade-off is a slight increase in AST complexity -- two statement variants where one might suffice. But in practice, this separation keeps each variant simple and its handling focused. The parser knows which variant to produce. The type checker knows which variant to check. Neither has to branch on "is this a simple variable or a pattern?"
The Pattern Enum
The core of destructuring is the Pattern enum:
```rust
pub enum Pattern {
    Identifier { name: String, span: Span },
    List { patterns: Vec<Pattern>, span: Span },
    Rest { name: String, span: Span },
    Map { entries: Vec<(String, Pattern)>, span: Span },
    WithDefault { pattern: Box<Pattern>, default: Expr, span: Span },
}
```

Five variants, each composable with the others:
- `Identifier` -- the simplest pattern. Binds the value to a name.
- `List` -- matches an ordered collection and destructures by position.
- `Rest` -- collects remaining elements into a list. Prefixed with `...`.
- `Map` -- matches a key-value collection (or entity) and destructures by name.
- `WithDefault` -- wraps any pattern with a fallback value.
These compose recursively. A list pattern can contain identifiers, other list patterns, rest patterns, and patterns with defaults:
```flin
// Nested destructuring
[first, [inner_a, inner_b], ...rest] = [[1, 2], [3, 4], [5, 6], [7, 8]]
// first = [1, 2], inner_a = 3, inner_b = 4, rest = [[5, 6], [7, 8]]

// With defaults
[x, y, z = 0] = [10, 20]
// x = 10, y = 20, z = 0 (default)
```

The Stub-First Approach
Session 097 did not implement destructuring in one pass. Instead, it used a stub-first approach:
- Phase 1: Pattern Enum. Add the `Pattern` enum to the AST. Add `Display` implementations for debugging.
- Phase 2: DestructuringDecl Statement. Add the new statement type. Add stub handlers in every compiler pass.
- Phase 3: Parser. Implement pattern parsing and detection of destructuring assignments.
- Phase 4: Code Generation. Implement the bytecode emission for pattern destructuring.
- Phase 5: Tests. Add comprehensive tests for all pattern types.
The critical property: after each phase, the entire test suite passes. Phase 1 adds types that nothing uses yet -- no tests break. Phase 2 adds stubs that compile but do nothing -- no tests break. Phases 3-5 add new functionality with new tests -- existing tests still pass.
This approach is worth describing because it is the methodology that made FLIN's development sustainable across 150+ sessions. Every change is additive. Nothing breaks. The compiler is always in a working state.
Type Checker Support
The type checker handles destructuring by walking the pattern tree and binding each identifier to its inferred type:
```rust
fn check_pattern(&mut self, pattern: &Pattern, value_type: &FlinType, span: Span) {
    match pattern {
        Pattern::Identifier { name, .. } => {
            self.env.insert(name.clone(), value_type.clone());
        }
        Pattern::List { patterns, .. } => {
            let elem_type = match value_type {
                FlinType::List(inner) => inner.as_ref().clone(),
                _ => {
                    self.report_error("cannot destructure non-list as list");
                    FlinType::Unknown
                }
            };
            for pat in patterns {
                match pat {
                    Pattern::Rest { name, .. } => {
                        // Rest collects remaining elements into a list
                        self.env.insert(name.clone(), FlinType::List(Box::new(elem_type.clone())));
                    }
                    _ => {
                        self.check_pattern(pat, &elem_type, span);
                    }
                }
            }
        }
        Pattern::Map { entries, .. } => {
            for (key, pat) in entries {
                let field_type = match value_type {
                    FlinType::Entity(name) => self.get_entity_field_type(name, key),
                    FlinType::Map(_, v) => v.as_ref().clone(),
                    _ => {
                        self.report_error("cannot destructure as map/entity");
                        FlinType::Unknown
                    }
                };
                self.check_pattern(pat, &field_type, span);
            }
        }
        Pattern::WithDefault { pattern, default, .. } => {
            // Infer the default's type for diagnostics; the bound name
            // is checked against the value's type.
            let _default_type = self.infer_type(default);
            self.check_pattern(pattern, value_type, span);
        }
    }
}
```

The recursive structure mirrors the `Pattern` enum. List patterns extract the element type from the list type. Map patterns look up field types from entity definitions. Rest patterns bind a list type. Default patterns are checked with the value type but fall back to the default's type at runtime.
Code Generation
The bytecode emitter generates index-based access for list destructuring and field-based access for map/entity destructuring:
```rust
fn emit_destructuring_pattern(&mut self, pattern: &Pattern, source_local: usize) {
    match pattern {
        Pattern::Identifier { name, .. } => {
            self.emit_load(source_local);
            let local = self.declare_local(name);
            self.emit_store(local);
        }
        Pattern::List { patterns, .. } => {
            for (i, pat) in patterns.iter().enumerate() {
                match pat {
                    Pattern::Rest { name, .. } => {
                        // Slice from index i to end
                        self.emit_load(source_local);
                        self.emit_const(Value::Int(i as i64));
                        self.emit_op(OpCode::SliceFrom);
                        let local = self.declare_local(name);
                        self.emit_store(local);
                    }
                    _ => {
                        // Index access
                        self.emit_load(source_local);
                        self.emit_const(Value::Int(i as i64));
                        self.emit_op(OpCode::Index);
                        let temp = self.allocate_temp();
                        self.emit_store(temp);
                        self.emit_destructuring_pattern(pat, temp);
                    }
                }
            }
        }
        Pattern::Map { entries, .. } => {
            for (key, pat) in entries {
                self.emit_load(source_local);
                self.emit_const(Value::Text(key.clone()));
                self.emit_op(OpCode::GetField);
                let temp = self.allocate_temp();
                self.emit_store(temp);
                self.emit_destructuring_pattern(pat, temp);
            }
        }
        Pattern::WithDefault { pattern, default, .. } => {
            // Check if value is none, use default if so
            self.emit_load(source_local);
            self.emit_op(OpCode::Dup);
            let jump = self.emit_jump_if_not_none();
            self.emit_op(OpCode::Pop);
            self.emit_expr(default);
            // Both paths converge here with the chosen value on the
            // stack, so the store runs on the not-none path too.
            self.patch_jump(jump);
            let temp = self.allocate_temp();
            self.emit_store(temp);
            self.emit_destructuring_pattern(pattern, temp);
        }
    }
}
```

List destructuring compiles to a sequence of index operations. Element 0 goes to the first pattern, element 1 to the second, and so on. Rest patterns use a slice operation to capture everything from the current index to the end.
Map destructuring compiles to a sequence of field access operations. Each key is used to look up the corresponding value, which is then bound to the pattern.
Default patterns compile to a conditional: check if the value is none, and if so, evaluate and use the default expression.
Array Destructuring in Detail
```flin
// Basic
[a, b, c] = [1, 2, 3]
// a = 1, b = 2, c = 3

// With rest
[first, ...rest] = [1, 2, 3, 4, 5]
// first = 1, rest = [2, 3, 4, 5]

// With defaults
[x, y, z = 0] = [10, 20]
// x = 10, y = 20, z = 0

// Nested
[a, [b, c]] = [1, [2, 3]]
// a = 1, b = 2, c = 3

// Skipping elements
[_, _, third] = [10, 20, 30]
// third = 30
```

The wildcard pattern `_` discards a value without binding it. This is the same `_` used in match expressions -- a universal "I do not care about this value" marker.
Entity Destructuring
```flin
entity Point {
    x: int
    y: int
    z: int = 0
}

point = Point { x: 10, y: 20, z: 30 }

// Extract fields by name
{ x, y } = point
// x = 10, y = 20

// Rename during extraction
{ x: horizontal, y: vertical } = point
// horizontal = 10, vertical = 20

// With rest (collects remaining fields)
{ x, ...other } = point
// x = 10, other = { y: 20, z: 30 }
```

Entity destructuring uses field names rather than positions. This is more robust than array destructuring -- if the entity adds a new field, existing destructuring patterns continue to work.
For Loop Destructuring
Destructuring extends naturally to for loops:
```flin
points = [[1, 2], [3, 4], [5, 6]]

for [x, y] in points {
    print("x: " + text(x) + " y: " + text(y))
}
```

Each iteration destructures the current element. The pattern `[x, y]` is applied to each element of `points`, binding `x` and `y` for the loop body.
Function Parameter Destructuring
Functions can destructure their parameters:
```flin
fn distance([x1, y1], [x2, y2]) -> number {
    dx = x2 - x1
    dy = y2 - y1
    return (dx * dx + dy * dy) ** 0.5
}

distance([0, 0], [3, 4])  // 5.0
```

The function takes two list arguments and destructures them in the parameter list. The caller passes lists; the function body works with named values.
The Elvis Operator: A Quick Win
Session 097 also implemented the Elvis operator (?:) alongside the destructuring foundation. This was a deliberate scheduling decision -- the Elvis operator was a self-contained, 45-minute feature that could be completed and shipped while the destructuring foundation was being laid.
```flin
name = user.name ?: "Anonymous"
displayName = firstName ?: lastName ?: "Guest"
```

The Elvis operator returns the first truthy value. It differs from nullish coalescing (`??`) in that it treats falsy values (0, false, empty string) as absent:
| Left operand | Right operand | `??` Result | `?:` Result |
|---|---|---|---|
| `0` | `42` | `0` | `42` |
| `false` | `true` | `false` | `true` |
| `""` | `"default"` | `""` | `"default"` |
| `none` | `"default"` | `"default"` | `"default"` |
The implementation reused the Or operator's bytecode -- both short-circuit on the first truthy value. The only difference is the AST representation and the semantic distinction for developers.
Session Statistics
Session 097 produced:
- 4 commits, approximately 710 lines
- 2 new tests (Elvis lexer and parser tests)
- 16 files modified
- All 1,017 existing tests passing throughout
The destructuring implementation was 70% complete at session end -- the pattern enum, the statement type, and all stubs were in place. The remaining 30% (parser integration and full code generation) was completed in Session 098.
Why Destructuring Matters
Destructuring is syntactic sugar. Everything it does can be accomplished with index access and field access. But syntactic sugar matters enormously for developer experience.
Consider a function that processes a list of coordinate pairs:
```flin
// Without destructuring
for point in points {
    x = point[0]
    y = point[1]
    distance = (x * x + y * y) ** 0.5
    print(distance)
}

// With destructuring
for [x, y] in points {
    distance = (x * x + y * y) ** 0.5
    print(distance)
}
```

The destructuring version is not just shorter. It is clearer. The pattern `[x, y]` declares the expected shape of each point. If a point has three elements, the developer knows to use `[x, y, z]`. The pattern is documentation.
This is why we titled the article "Destructuring Everywhere" -- the feature appears in variable declarations, for loops, function parameters, and match arms. Every place where a value is bound to a name, destructuring is available. It is not a special case. It is the general case.
The next article covers the pipeline operator -- another ergonomic feature that transforms how developers compose operations in FLIN.
This is Part 37 of the "How We Built FLIN" series, documenting how a CEO in Abidjan and an AI CTO designed and implemented a programming language from scratch.
Series Navigation:

- [35] Pattern Matching: From Switch to Match
- [36] Tagged Unions and Algebraic Data Types
- [37] Destructuring Everywhere (you are here)
- [38] The Pipeline Operator: Functional Composition in FLIN
- [39] Tuples, Enums, and Structs