Why is Zig so cool?
nilostolte.github.io
555 points by vitalnodo 3 days ago
> I can’t think of any other language in my 45-year-long career that surprised me more than Zig.
I can say the same (although my career spans only 30 years), or, more accurately, that it's one of the few languages that surprised me most.
Coming to it from a language design perspective, what surprised me is just how far partial evaluation can be taken. While strictly weaker than AST macros in expressive power (macros are "referentially opaque" and therefore more powerful than a referentially transparent partial evaluation - e.g. partial evaluation has no access to an argument's name), it turns out that it's powerful enough to replace not only most "reasonable" uses of macros, but also generics and interfaces. What gives Zig's partial evaluation (comptime) this power is its access to reflection.
Even when combined with reflection, partial evaluation is more pleasurable to work with than macros. In fact, to understand the program's semantics, partial evaluation can be ignored altogether (as it doesn't affect the meaning of computations). I.e. the semantics of a Zig program are the same as if it were interpreted by some language Zig' that is able to run all of Zig's partial-evaluation code (comptime) at runtime rather than at compile time.
Since it also removes the need for other specialised features (generics, interfaces) - even at the cost of an aesthetic that may not appeal to fans of those specialised features - it ends up creating a very expressive, yet surprisingly simple and easy-to-understand language (Lisps are also simple and expressive, but the use of macros makes understanding a Lisp program less easy).
Being simple and easy to understand makes code reviews easier, which may have a positive impact on correctness. The simplicity can also reduce compilation time, which may also have a positive impact on correctness.
Zig's insistence on explicitness - no overloading, no hidden control flow - which also assists reviews, may not be appropriate for a high-level language, but it's a great fit for an unabashedly low-level language, where being able to see every operation as explicit code "on the page" is important. While its designer may or may not admit this, I think Zig abandons C++'s belief that programs of all sizes and kinds will be written in the same language (hence its "zero-cost abstractions", made to give the illusion of a high-level language without its actual high-level abstraction). Developers writing low-level code lose the explicitness they need for review, while those writing high-level programs don't actually gain the level of abstraction required for a smooth program evolution that they need. That belief may have been reasonable in the eighties, but I think it has since been convincingly disproved.
Some Zig decisions surprised me in a way that made me go more "huh" than "wow", such as it having little encapsulation to speak of. In a high-level language I wouldn't have that (after years of experience with Java's wide ecosystem of libraries, we learned that we need even more and stronger encapsulation than we originally had to keep compatibility while evolving code). But perhaps this is the right choice for a low-level language where programs are expected to be smaller and with fewer dependencies (certainly shallower dependency graphs). I'm curious to see how this pans out.
Zig's terrific support for arenas also makes one of the most powerful low-level memory management techniques (that, like a tracing garbage collector, gives the developer a knob to trade off RAM usage for CPU) very accessible.
I have no idea or prediction on whether Zig will become popular, but it's certainly fascinating. And, being so remarkably easy to learn (especially if you're familiar with low-level programming), it costs little effort to give it a try.
Well put. The majority of language development for the last 20 years has proceeded by adding more features into languages, as they all borrow keywords and execution semantics from each other. It's like a neighborhood version of corporate bureaucracies, where each looks across the street, and decides "they've got a department we don't have, we better add one of those".
I like languages that dare to try to do more with less. Zig's comptime, especially the way it supplants generics, is pretty darn awesome.
I was having a similar feeling with Elixir the other day, when I realized that I could build every single standard IPC mechanism that you might find in something like Python's threading module (Queue, Lock, RLock, Condition, Barrier, etc.) with the Erlang/BEAM process mailbox.
Great comment! I agree about comptime, as a Rust programmer I consider it one of the areas where Zig is clearly better than Rust with its two macro systems and the declarative generics language. It's probably the biggest "killer feature" of the language.
> as a Rust programmer I consider it one of the areas where Zig is clearly better than Rust with its two macro systems and the declarative generics language
IMHO "clearly better" might be a matter of perspective; my impression is that this is one of those things where the different approaches buy you different tradeoffs. For example, by my understanding Rust's generics allows generic functions to be completely typechecked in isolation at the definition site, whereas Zig's comptime is more like C++ templates in that type checking can only be completed upon instantiation. I believe the capabilities of Rust's macros aren't quite the same as those for Zig's comptime - Rust's macros operate on syntax, so they can pull off transformations (e.g., #[derive], completely different syntax, etc.) that Zig's comptime can't (though that's not to say that Zig doesn't have its own solutions).
Of course, different people can and will disagree on which tradeoff is more worth it. There's certainly appeal on both sides here.
Consider that Python + C++ has proved to be a very strong combo: driver in Python, heavy lifting in C++.
It's possible that something similar might be the right path for metaprogramming. Rust's generics are simple and weaker than Zig's comptime, while proc macros are complicated and stronger than Zig's comptime.
So I think the jury's still out on whether Rust's metaprogramming is "better" than Zig's.
This is the real answer (amongst other goodness) - this one is well executed and differentiated
Every language at scale needs a preprocessor (look at the “use server” and “use gpu” silliness happening in TS) - why is it not the same as the language you use?
Languages such as D and Nim (both greatly underappreciated) offer full-language compile-time interpretation.
I agree.
I look forward to a future high-level language that uses something like comptime for metaprogramming/interfaces/etc, is strongly typed, but lets you write scripts as easily as python or javascript.
Try out Nim: it has powerful comptime/metaprogramming, static typing, automatic memory management, and is as easy to program in as Python or JavaScript while still allowing low-level stuff.
For me it'd be hard to go back to languages that don't have all that. Only Swift comes close.
D comes close ... it too has a full-language comptime interpreter and other metaprogramming features (though not as rich as Nim's), static typing, optional garbage collection, and you can write
#!/usr/bin/env rdmd
[D code]
and run it as if it were an executable. (The compilation is cached so it runs just as fast on subsequent runs.)
Thing is, having a good JIT gives you the performance of partial evaluation pretty much automatically (at the cost of less predictability), as compilation occurs at runtime, so the distinction between compile-time and runtime largely disappears. E.g., in Java, a reflective call will eventually be compiled by the JIT into a direct call; virtual dispatch will also be compiled into direct dispatch or even inlined (when appropriate) etc..
D and Nim both offer that. D has a tool, rdmd, that compiles (with caching) and runs a script written in D, so you write
#!/usr/bin/env rdmd
D code ...
and run it as if it were an executable.
If you want to write a code example on HN you can just indent it by 2 spaces and it'll work like you'd expect. For example:
#!/usr/bin/env rdmd
D code...
Thanks. I didn't catch that it didn't display correctly until it was too late to edit it.
> hence its "zero-cost abstractions", made to give the illusion of a high-level language without its actual high-level abstraction
What does this mean?
For example (you can pick another example if you want), how is C++'s std::vector less abstract than Java's ArrayList?
Because std::vector isn't much of an abstraction, in the sense of removing a set of concerns from consideration. v[i] is just pointer math. What happens if you index outside the bounds of v is anybody's guess, and it can fail silently. You could use v.at(i), but then somebody will yell at you for using exceptions. Regardless of where you stand on C++ exceptions, the fact that it's up for debate means that it will get debated. The cost of zero-cost abstractions in C++ is quite high.
> Developers writing low-level code lose the explicitness they need for review, while those writing high-level programs don't actually gain the level of abstraction required for a smooth program evolution that they need.
I've described this in the past as languages being "too general purpose" or too "multi-paradigm". Languages like Scala that try to be Haskell and Java in one.
> I have no idea or prediction on whether Zig will become popular
I think LLMs may be able to assist in moving large C codebases to Zig in the next decade. Once the Zig compiler can build Linux's C code, it can be ported to Zig bit by bit (LLM-assisted). This is not soon, but I think it will be its killer feature.
I don't mind if Linux becomes Rust+Zig codebase in, say, 10y from now. :)
A neat little thing I like about Zig is one of the options for installing it is via PyPI like this: https://pypi.org/project/ziglang/
pip install ziglang
Which means you don't even have to install it separately to try it out via uvx. If you have uv installed already, try this:

  cd /tmp
  echo '#include <stdio.h>
  int main() {
      printf("Hello, World!");
      return 0;
  }' > hello.c
  uvx --from ziglang python-zig cc /tmp/hello.c
  ./a.out

For anyone not familiar: You can bundle arbitrary software as Python wheels. Can be convenient in cases like this!
What "cases" are those? Tell me one useful and neat case. Why is it useful and neat, you think?
I believe it is used for cross-platform linking of Rust/maturin wheels, which seems nice because it's one fewer unusual install script to integrate into your project, if zig isn't packaged for Debian yet.
For one example, a number of years back, I built a Python package, env, and version manager. It was built entirely in Rust and distributed as a binary. Since I knew users would likely have pip installed, it provided an easy way for them to install, regardless of OS.
You could go further like in this case, and use wheels + PyPi for something unrelated to Python.
It's useful as a distro-agnostic distribution method. CMake is also installable like this despite having nothing to do with Python.
Or I should say it was useful as a distribution method, because most people had Python already available. Since most distros now don't allow you to install stuff outside a venv you need uv to install things (via `uv tool install`) and we're not yet at the point where most people already have uv installed.
For this sort of stuff I find micromamba / pixi a better way of managing packages, as opposed to the pip / uv family of tools
Pixi, Conan, or Nix— all better choices than abusing the Python ecosystem to ship arbitrary executables.
It could easily be the case that the zig compiler is useful in some mixed-language project and this is not actually "abuse".
Regular Python bindings / c extensions don’t depend on a pypi-packaged instance of gcc or llvm though. It’s understood that these things are provided externally from the “system” environment.
I know some of it has already happened with Rust, but perhaps there's a broader reckoning that needs to occur here wrt standards around how language-specific build and packaging systems handle cross-language projects… which could well point to phasing those out in favour of nix or pixi, which are designed from the get-go to support this use case.
What do those systems do that UV/PyPi doesn't?
Usually arbitrary binaries stuffed in Python wheels are mostly self contained single binaries and such, with as little dynamic linking nonsense as possible, so they don't break all the time, or have dependency conflicts.
It seems to consistently work really well for binaries, although it would be nice to have first class support for integrating npm packages.
reinventing nix but worse.
That's really cool actually. Now that AI is a little more commonly available for developer tooling, I feel like it's easier than ever to learn any programming language, since you can braindrain the model.
The standard models are pretty bad at Zig right now since the language is so new and changes so fast. The entire language spec is available in one HTML file though, so you can have a little better success feeding that in for context.
> The entire language spec is available in one html file though so you can have a little better success feeding that for context.
This is what I've started doing for every library I use. I go to their Github, download their docs, and drop the whole thing into my project. Then whenever the AI gets confused, I say "consult docs/somelib/"
Just use gh_grep mcp and the model will fetch what it needs if you tell it to, no need to download from GitHub manually like this
I on the other hand see most languages become superfluous, as coding agents keep improving.
During the last year I have been observing how MCP, tools and agents, have reduced the amount of language specific code we used to write.
I'm afraid this article kinda fails at its job. It starts out with a very bold claim ("Zig is not only a new programming language, but it’s a totally new way to write programs"), but ends up listing a bunch of features that are not unique to Zig or even introduced by Zig: type inference (invented in the late 60s, first practically implemented in the 80s), anonymous structs (C#, Go, TypeScript, many ML-style languages), labeled breaks, functions that are not globally public by default...
It seems like this is written from the perspective of C/C++ and Java and perhaps a couple of traditional (dynamically typed) languages.
On the other hand, the concept that makes Zig really unique (comptime) is not touched upon at all. I would argue compile-time evaluation is not entirely new (you can look at Lisp macros back in the 60s), but the way Zig implements this feature and how it is used instead of generics is interesting enough to make Zig unique. I still feel like the claim is a bit hyperbolic, but there is a story that you can sell about Zig being unique. I wanted to read this story, but I feel like this is not it.
D has had compile time function execution since 2007 or so.
https://dlang.org/spec/function.html#interpretation
It doesn't need a keyword to trigger it. Any expression that is a const-expression in the grammar triggers it.
Hello Mr. Bright. I've seen similar comments from you in response to Zig before, specifically in the comments on a blog post I made about Zig's comptime. I took some time reading D's documentation to try to understand your point (I didn't want to miss some prior art, after all). By the time I felt like I could give a reply, the thread was days old, so I didn't bother.
The parent comment acknowledges that compile time execution is not new. There is little in Zig that is, broad strokes, entirely new. It is in the specifics of the design that I find Zig's ergonomics to be differentiated. It is my understanding that D's compile time function execution is significantly different from Zig's comptime.
Mostly, this is in what Zig doesn't have as a specific feature, but uses comptime for. For generics, D has templates, Zig has functions which take types and return types. D has conditional compilation (version keyword), while Zig just has if statements. D has template mixins, Zig trusts comptime to have 90% of the power for 10% of the headache. The power of comptime is commonly demonstrated, but I find the limitations to be just as important.
A difference I am uncertain about is if there's any D equivalent for Zig having types being expressions. You can, for example, calculate what the return type should be given a type of an argument.
Is this a fair assessment?
> A difference I am uncertain about is if there's any D equivalent for Zig having types being expressions. You can, for example, calculate what the return type should be given a type of an argument.
This is done in D using templates. For example, to turn a type T into a type T*:
template toPtr(T) { alias toPtr = T*; } // define template
toPtr!int p; // instantiate template
pragma(msg, "the type of p is: ", typeof(p));
The compiler will deduce the correct return type for a function if you specify auto as the return type:
auto toPtr(int i) { return cast(float)i; } // returns float
For conditional compilation at compile time, D has static if:
enum x = square(3); // evaluated at compile time
static if (x == 4)
int j;
else
double j;
auto k = j; // k's type depends on which branch declared j
Note that static if does not introduce a new scope, so conditional declarations will work.
The version construct is similar, but is intended for module-wide versions, such as:
version (OSX)
{ stuff for OSX }
else version (Win64)
{ stuff for Windows 64 }
else
static assert(0, "unsupported OS");
Compile time execution is triggered wherever a const-expression is required. A keyword would be redundant.
D's mixins are for generating code, which is D's answer to general-purpose text macros. Running code at compile time enables those strings to be generated. The mixins and compile time execution are not the same feature. For a trivial example:
string cat(string x, string y) { return x ~ "," ~ y; }
string s = mixin(cat("hello", "betty")); // runs cat at compile time
writeln(s); // prints: hello,betty
I'll be happy to answer any further questions
Maybe I don't understand: in D, how do I write a function which makes a new type?
For example, Zig has a function ArrayHashMapWithAllocator which returns, well, a hash table type in a fairly modern style, no separate chaining and so on.
Not an instance of that type, it returns the type itself, the type didn't exist, we called the function, now it does exist, at compile time (because clearly we can't go around making new types at runtime in this sort of language)
You use templates and string mixins alongside each other.
The issue with mixins is that using string concatenation to build types on the fly isn't the greatest debugging experience, as there is only printf debugging available for them.
Yes and D's comptime is much more fun, IMHO than Zig's! Yet everyone talks about Zig's comptime as if it were unique or new.
But Zig doesn't need a keyword to trigger it either? If it's possible at all, it will be done. The keyword should just prevent run-time evaluation. (Unless I grossly misunderstood something.)
I'm no expert on Zig, but "comptime" is the keyword to trigger it.
I'm pretty sure the "comptime" keyword only forces you to provide an argument constant at compile time for that particular parameter. It doesn't trigger the compile time evaluation.
That's how the constant is provided - through compile time evaluation.
Yes, but compile-time evaluation in Zig doesn't require the "comptime" keyword. Only specific cases such as compile-time type computation do (but these specific cases are not provided by compile-time function evaluation in D anyway, so language choice wouldn't make a difference here).
Partial evaluation has been quite well known at least since 1943 and Kleene's S-m-n theorem. It has since been put to use, in various forms, by quite a few languages (including C++ in 1990, and even C in the early seventies). But the extent and the way in which Zig specifically puts it to use -- which includes, but is not limited to, how it is used to replace other features that can then be avoided (and all without macros) -- is unprecedented.
Pointing out that other languages have used partial evaluation, sometimes even in ways that somewhat overlap with Zig's use, completely misses the point. It's at least as misplaced as saying that there was nothing new or special about iPhone's no-buttons design because touch screens had existed since the sixties.
If you think Zig's comptime is just about running some computations at compile time, you should take a closer look.
I'd like to see an example! as I cannot think of one.
An example of what?
Not OP, but I guess based on your comment:
> But the extent and the way in which Zig specifically puts it to use -- which includes, but is not limited to, how it is used to replace other features that can then be avoided (and all without macros) -- is unprecedented.
That MrWhite wanted to know an example of Zig's comptime that is not merely a "macro", but rather its usage as a replacement for other features (I guess something more complex).
PS: just interested in Zig, I'd like a pointer to these cool features :)
An unprecedented use.
Ok, so a primary goal of comptime in Zig is to avoid needing certain specialised features while still enjoying their functionality, in particular, generics, interfaces, and macros. I'm not aware of any language that has been able to eliminate all of these features and replace them with a simple, unified partial evaluation mechanism.
In addition, there's the classic example of implementing a parameterised print (think printf) in Zig. This is a very basic use of comptime, and it isn't used here in lieu of generics or of interfaces, but while there may be some language that can do that without any kind of explicit code generation (e.g. macros), there certainly aren't many such examples: https://ziglang.org/documentation/0.15.2/#Case-Study-print-i...
But the main point is that the unprecedented use of partial evaluation is in having a single unified mechanism that replaces generics, interfaces, and macros. If a language has any one of them as a distinct feature, then it is not using partial evaluation as Zig does. To continue my analogy to the novel use of a touchscreen in the iPhone, the simplest test was: if your phone had a physical keypad or keyboard, then it did not use a touchscreen the way the iPhone did.
D's `write` function is generic:
write(1,2,"abc",4.0,'c');
write is declared as: void write(S...)(S args) { ... }
where `S...` means an arbitrary sequence of types represented by `S`. The implementation loops over the sequence, handling each type in its own individual fashion. User-defined types work as well.
If D has a separate feature for one of: generic types, interfaces, and macros, then obviously it doesn't use partial evaluation similarly to how Zig does. It seems to me that it has all three: templates, interfaces, and string mixins. So if Zig uses its unified partial evaluation feature to eliminate these three separate features, why bring up D, which clearly does not eliminate any one of them?
It's like saying the iPhone design wasn't novel except for the fact that prior art all had a keypad. But the design was novel in that it was intended to eliminate the keypad. Zig's comptime feature is novel in that it exists to eliminate interfaces, generics, and macros, and you're bringing up a language that eliminates none of them.
So D clearly isn't an example, but perhaps there's some other language I haven't heard of. Just out of curiosity, can a printf in D not only check types at compile time but also generate formatting code while still allowing for runtime variables and without (!!!) the use of string mixins? Like I said, it's possible there's precedent for that (even though it isn't the distinguishing feature), and I wonder if D is that. I'm asking because examples I've seen in D either do use string mixins or do not actually do what the Zig implementation does.