Rust Is a Lisp in Denial
The language that promised zero-cost abstractions is speed-running C++’s complexity trajectory while quietly rebuilding 1958
Rust’s macro system is homoiconicity laundered through three layers of indirection. McCarthy figured this out in 1958. We spent fifteen years rebuilding it while insisting we were doing something different.
A stripped Rust “hello world” binary is 23x the size of its C equivalent.
That number is annoying on a desktop. It’s a rounding error on a server. On a Jetson Orin NX with 8 GiB of unified memory shared between CPU and GPU, it is a direct subtraction from the inference capacity that is the entire reason the hardware exists. Every byte your system daemons consume is a byte your model cannot use.
I consult on embedded and edge AI systems. One client calculated their lifetime cost per megabyte of RAM across a deployed fleet at roughly $1M. Another is staring at the Orin NX 8GB-to-16GB pricing gap and realizing that a bloated system stack (around 2 GiB per daemon when ported to Rust) could push the entire fleet to the more expensive SKU. The Rust evangelists will tell you about #![no_std] and cargo-bloat and LTO. These are treatments, not cures. LTO can trim dead code. It cannot un-monomorphize live code.
This is where the “no runtime” claim collapses. Rust doesn’t have a runtime in the Go or Java sense. What it has is worse: the runtime is laced throughout the binaries. (That’s a client’s phrasing, not mine, and it’s more precise than anything I’d have written.) Monomorphization stamps out a separate copy of every generic function for every concrete type. Panic infrastructure gets baked in. The allocator comes along for the ride. None of this is a runtime you can point to and say “there it is.” It’s load-bearing fat marbled through every compilation unit.
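The marbling is easy to see in miniature. In this sketch (function names are mine, for illustration), one generic function in the source becomes one compiled copy per concrete type in the binary:

```rust
// Each concrete type used with `describe` gets its own machine-code
// copy of the function body (monomorphization). The source has one
// function; the binary has as many as there are instantiations.
fn describe<T: std::fmt::Debug>(value: T) -> String {
    format!("{:?}", value)
}

fn main() {
    // The compiler emits describe::<i32> and describe::<&str> as two
    // separate symbols, duplicating the formatting logic in each.
    assert_eq!(describe(42), "42");
    assert_eq!(describe("hi"), "\"hi\"");
}
```

Multiply this by every generic in every dependency and you get the marbled fat: no single "runtime" to excise, just duplicated code everywhere.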
On memory-constrained targets, binary size and memory footprint are safety properties. Rust trades one class of safety (memory corruption) for another class of unsafety (resource exhaustion). The Rust community treats this as a niche complaint. It isn’t. It’s the entire embedded and edge computing market.
Monomorphization as destiny
Monomorphization was a performance decision. Reasonable people can look at the tradeoff space and conclude that for hot-path server code, stamping out specialized copies is worth the binary size. The problem is that Rust made it the only choice.
Swift ships witness tables. The compiler can choose dynamic dispatch for generics, paying a small runtime cost for dramatically smaller binaries. This is not theoretical. It’s shipping on billions of devices. Rust could have made monomorphization a deployment-time decision (a build profile, a per-crate annotation, a DSL for specifying compilation strategy). Instead it became a language-level default that library authors bake into their APIs, and now it’s load-bearing in the ecosystem. Serde, the de facto serialization framework that virtually every nontrivial Rust project depends on, is monomorphization all the way down.
Whether Rust’s governance structure could even make this change at this point is an open question. The answer is probably no. The ecosystem has calcified around the assumption.
The syntax problem is a people problem
Rust’s designers were C++ people. This is not an insult. It’s a diagnosis.
They inherited C++’s worst syntactic habits because they’d internalized them to the point of invisibility. Turbofish (::<>) makes perfect sense if you’ve already spent a decade reading C++ template syntax. The 'a lifetime annotation looks exactly like a character literal prefix in half a dozen other languages. These aren’t bugs. They’re the residue of a design process where the authors couldn’t see from outside their own expertise.
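Both habits in four lines (a trivial illustrative snippet, not from any codebase):

```rust
fn main() {
    // Turbofish: `::<Vec<i32>>` disambiguates the generic parameter,
    // a direct inheritance from C++ template disambiguation syntax.
    let v = (1..=3).collect::<Vec<i32>>();
    assert_eq!(v, vec![1, 2, 3]);

    // `'a` reads like an unterminated character literal to anyone
    // coming from C, ML, or half a dozen other languages.
    fn first<'a>(s: &'a str) -> &'a str {
        &s[..1]
    }
    assert_eq!(first("rust"), "r");
}
```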
The trait system grafts Haskell’s typeclasses onto an imperative systems language. The borrowed concepts aren’t the problem. The interaction complexity is. You get trait bounds, associated types, higher-ranked trait bounds, impl Trait in argument position vs return position (with different semantics), and where clauses that can run longer than the function body they constrain. If I want Haskell, I know where to find it. My clients’ engineers know C, Python, some Go. They don’t know Haskell, and the Rust community’s answer (“the learning curve is worth it”) is not a counterargument. It’s a confession that the language demands a mass retraining event for a workforce that doesn’t exist yet.
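Here is a small, deliberately tame sketch (the function is mine, invented for illustration) of how the pieces stack: an associated-type bound, a higher-ranked trait bound, and a where clause already longer than the body it constrains.

```rust
// Bounds accumulate: `I::Item` is an associated-type constraint,
// `for<'a>` is a higher-ranked trait bound, and the where clause
// outweighs the one-line function body.
fn apply_all<I, F>(items: I, f: F) -> Vec<String>
where
    I: IntoIterator,
    I::Item: AsRef<str>,
    F: for<'a> Fn(&'a str) -> String,
{
    items.into_iter().map(|s| f(s.as_ref())).collect()
}

fn main() {
    let out = apply_all(["a", "b"], |s| s.to_uppercase());
    assert_eq!(out, vec!["A", "B"]);
}
```

Real library signatures nest these several levels deeper, and each level is individually defensible. The sum is what engineers coming from C or Go bounce off.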
The Lisp in the room
Here is where it gets interesting.
Rust’s macro system started with macro_rules!, a pattern-matching system that hit its expressiveness ceiling almost immediately. The community responded with proc_macros, which operate on TokenStream objects (tree-structured symbolic representations of code). Then came syn (a full parser for Rust syntax into an AST) and quote (quasi-quoting, the ability to write code templates with interpolated values). If you have used Lisp, you recognize every single one of these moves.
proc_macro is code-as-data. syn is a reader. quote is quasiquote. The Rust macro ecosystem is homoiconicity laundered through three layers of indirection, each one invented because the previous layer wasn’t expressive enough, each one moving closer to what McCarthy had working in 1958.
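The first rung of the ladder is runnable without any dependencies. A small sketch (the macro name is mine) of macro_rules! pattern-matching on token trees:

```rust
// macro_rules!: recursive pattern-matching over token trees.
// Expressive enough for this, and not much further; anything that
// needs to inspect or restructure the tokens pushes you up the
// ladder to proc_macro + syn + quote.
macro_rules! max_of {
    ($x:expr) => { $x };
    ($x:expr, $($rest:expr),+) => {
        std::cmp::max($x, max_of!($($rest),+))
    };
}

fn main() {
    assert_eq!(max_of!(3, 9, 4), 9);
    assert_eq!(max_of!(1), 1);
}
```

The next rung can't be shown self-contained: a proc_macro must live in its own crate and pull in the external syn and quote crates. But its shape is the tell. A quote! template interpolates values into a code skeleton with `#name`-style splices, which is quasiquotation in everything but name.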
The community built this and then didn’t name it. Nobody said “we needed homoiconicity and refused to admit it.” They said “proc_macros are powerful” and “the macro system is complex” and wrote 400-page books about how to use the tooling they’d built to work around the fact that Rust’s surface syntax is hostile to metaprogramming. Lisp’s syntax is not hostile to metaprogramming because Lisp’s syntax is metaprogramming. The parentheses aren’t a price you pay. They’re the point.
Curly braces signal “serious systems language” to hiring managers and VPs of Engineering. The syntax choice is marketing, not engineering. It signals membership in the C lineage, which signals employability, which signals ecosystem, which signals adoption. None of these are technical arguments. All of them won.
What C actually has now
The counterargument I take seriously: C’s unsafety at scale is not theoretical. Real CVEs, real exploits, real cost. “Just lint better” has been the C community’s answer for 40 years and the CVEs keep coming.
But “lint better” in 2025 is not what it was in 2005. GCC’s -fanalyzer catches use-after-free and null dereference at compile time. Clang-tidy ships MISRA and CERT profiles. Coverity and Infer do interprocedural analysis that would have been science fiction a decade ago (one client uses Infer across their entire codebase and calls it “amazing”). AddressSanitizer, UndefinedBehaviorSanitizer, and MemorySanitizer catch at runtime what static analysis misses. None of this is as strong a guarantee as Rust’s borrow checker. All of it is deployable today, on existing codebases, with existing engineers, at existing binary sizes.
The question is not “is Rust safer than C?” It is. The question is whether the delta in safety justifies the delta in binary size, memory footprint, ecosystem complexity, workforce availability, and compilation strategy inflexibility for your specific target. On a server with 256 GiB of RAM, the answer is probably yes. On a Jetson with 8 GiB of unified memory where every megabyte has a dollar cost measured in fleet-wide SKU decisions, the answer might be no.
The endgame
In three to five years someone is going to prompt a frontier model: “Design a systems programming language with Rust’s ownership semantics, configurable compilation strategy (monomorphization as opt-in, not default), and s-expression syntax.”
The model will produce something usable in an afternoon. S-expressions are trivially parseable, trivially generatable, trivially transformable. They are the most AI-friendly syntax possible, because they are the most machine-friendly syntax possible, because McCarthy designed them for machines in the first place and then people discovered you could program in them too.
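"Trivially parseable" is not rhetoric. Here is a toy s-expression reader (type and function names are mine; no error handling, atoms are plain strings) that fits in a few dozen lines:

```rust
// A toy s-expression reader. The entire "parser" is: pad the
// parens, split on whitespace, recurse on "(".
#[derive(Debug, PartialEq)]
enum Sexp {
    Atom(String),
    List(Vec<Sexp>),
}

fn tokenize(src: &str) -> Vec<String> {
    src.replace('(', " ( ")
        .replace(')', " ) ")
        .split_whitespace()
        .map(String::from)
        .collect()
}

fn parse(tokens: &[String], pos: &mut usize) -> Sexp {
    if tokens[*pos] == "(" {
        *pos += 1; // consume "("
        let mut items = Vec::new();
        while tokens[*pos] != ")" {
            items.push(parse(tokens, pos));
        }
        *pos += 1; // consume ")"
        Sexp::List(items)
    } else {
        let atom = Sexp::Atom(tokens[*pos].clone());
        *pos += 1;
        atom
    }
}

fn main() {
    let toks = tokenize("(defn add (a b) (+ a b))");
    let ast = parse(&toks, &mut 0);
    // The reader *is* the parser; code-as-data falls out for free.
    match &ast {
        Sexp::List(items) => {
            assert_eq!(items.len(), 4);
            assert_eq!(items[0], Sexp::Atom("defn".into()));
        }
        _ => panic!("expected a list"),
    }
}
```

Compare that to syn, a full-scale parser crate maintained as a major engineering effort, and the asymmetry of the two syntax choices is the whole argument in miniature.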
That language will have Rust’s safety guarantees without Rust’s binary bloat. It will have metaprogramming without the proc_macro Rube Goldberg machine. It will look weird to people who think curly braces are load-bearing. It will compile to smaller binaries than Rust on every target that matters for edge deployment.
And when it ships, someone will look at the Rust ecosystem’s fifteen-year journey from macro_rules! to syn+quote and recognize it for what it was: the longest, most expensive proof that we needed homoiconicity and refused to admit it.

Greenspun’s Tenth Rule remains undefeated.