Precedent for Zig's comptime?

Zig’s “greedy” compile-time evaluation of expressions (and even functions, if requested) seems pretty unique to me.

Question One: Does anyone know of other languages with this feature?

NOTE: I’m aware that optimizing compilers will often do this, but I don’t consider that a language-level feature the way comptime is in Zig.

Question Two: How would you classify Zig’s metaprogramming capabilities?

It’s really interesting because Zig achieves metaprogramming without a separate metalanguage or dialect. At the same time, it’s not homoiconic: you can’t manipulate code as data.

It has reflection, but only at compile time (that’s correct to say, isn’t it?).

It has generics without really providing much in the way of explicit language-level support for them (anytype is an obvious exception, along with the @Type builtin, and probably other things I don’t know about).
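
For instance, generics end up being ordinary functions that take and return types, and reflection is just builtins like @typeInfo evaluated while compiling. A minimal sketch of what I mean (my own example, not from any official docs):

const std = @import("std");

// A generic pair is just a function, evaluated at compile time, that
// takes a type and returns a brand new struct type.
fn Pair(comptime T: type) type {
    return struct {
        first: T,
        second: T,
    };
}

// anytype plus @TypeOf gives "generic" call sites with no extra syntax.
fn sum(pair: anytype) @TypeOf(pair.first) {
    return pair.first + pair.second;
}

test "comptime generics and reflection" {
    const p = Pair(u32){ .first = 1, .second = 2 };
    try std.testing.expectEqual(@as(u32, 3), sum(p));

    // Compile-time reflection: count the fields of the generated type.
    try std.testing.expect(std.meta.fields(Pair(u32)).len == 2);
}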

The closest thing I’ve heard of (but never used; I’m not that old) is the procedural macro facility in PL/I’s preprocessor, which is PL/I code written in a subset of the same language but explicitly run at compile time:

Macros in the PL/I language are written in a subset of PL/I itself…

When a % token is encountered the following compile time statement is executed… Thus a compile time variable PI could be declared, activated, and assigned using %PI='3.14159265'. Subsequent occurrences of PI would be replaced by 3.14159265.

Which is clearly the same idea, but it’s just for a small subset of the full PL/I language.
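
In Zig, the analogue of that compile-time %PI variable isn’t token substitution at all; it’s just an ordinary constant that can participate in compile-time computation. A rough sketch of what I mean (my example, not a translation of real PL/I code):

const std = @import("std");

// An ordinary declaration whose value is known at compile time.
const PI = 3.14159265;

// It can feed further compile-time computation, e.g. filling a table
// whose length must itself be comptime-known.
const table_len = 8;
const angles = blk: {
    var a: [table_len]f64 = undefined;
    for (&a, 0..) |*item, i| {
        item.* = 2.0 * PI * @as(f64, @floatFromInt(i)) / table_len;
    }
    break :blk a;
};

test "comptime constants instead of preprocessor substitution" {
    try std.testing.expect(angles.len == table_len);
    try std.testing.expect(angles[0] == 0.0);
}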


4 Likes

I think my background in “memory managed” languages has messed up my mind regarding comptime. I just can’t seem to get an intuition for when I can and can’t use comptime. For Pete’s sake, the name itself is supposed to say it all: comptime == compile time. But even so, I keep thinking “I can’t do this with comptime because what if the input is only known at runtime?” and then I end up never using comptime at all. I think this is an area where more in-depth learning material is needed.
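
To make my confusion concrete, the boundary I keep tripping over looks something like this (just a sketch I put together; the function is made up): the comptime parameter has to be known while compiling, but the other arguments can be perfectly ordinary runtime values.

const std = @import("std");

// `n` must be compile-time known because it becomes part of the return
// type, but `c` can be any runtime value.
fn repeated(comptime n: usize, c: u8) [n]u8 {
    var buf: [n]u8 = undefined;
    for (&buf) |*b| b.* = c;
    return buf;
}

test "comptime length, runtime contents" {
    var fill: u8 = '-';
    fill = '*'; // a runtime-known value is fine here
    const stars = repeated(3, fill); // but the 3 must be comptime-known
    try std.testing.expectEqualStrings("***", &stars);
}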

4 Likes

Thank you for saying this. I completely agree. I like it a lot, but I don’t think the way ‘comptime’ works is obvious at all. Just when I think I understand it, I write a test program and realize I didn’t.

For what it’s worth, I’m in the middle of writing the comptime examples for Ziglings right now.

There will be at least two more comptime exercises.

In creating these, I found that many things I thought I could demonstrate NOT working did, in fact, work. And sometimes they worked in ways I didn’t expect.

In fact, in some cases, it’s surprisingly hard to write something that Zig won’t evaluate at compile time. It’s really, really good at evaluating things at compile time!
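
One concrete example (mine, not one of the exercises): a perfectly ordinary recursive function gets evaluated entirely during compilation just because its result initializes a container-level constant.

const std = @import("std");

// Nothing about this function mentions comptime...
fn fib(n: u32) u32 {
    if (n < 2) return n;
    return fib(n - 1) + fib(n - 2);
}

// ...but container-level initializers are evaluated at compile time,
// so this call never runs in the compiled binary.
const fib10 = fib(10);

test "Zig happily evaluates ordinary functions at compile time" {
    // Checked while compiling; if it were wrong, this would be a
    // compile error rather than a failing test.
    comptime std.debug.assert(fib10 == 55);
}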

I’m starting to understand it properly now, I think. But much of what I previously thought I knew was wrong!

I keep seeing people say: “you could learn Zig in a day” and “it’s so simple and easy.” Those people are evidently much quicker learners than I am. :laughing:

At any rate, I think comptime is fantastic, but at some point we’re going to need to have some really clear documentation explaining the exact rules in a systematic fashion.

8 Likes

Wow, those Ziglings exercises are going to be an awesome resource. Thanks a lot for the great work. I remember a professor who always said the best way to learn something is to have to teach it to someone else, so there you go! :smiley:

2 Likes

Thanks! I’m hoping these will be helpful. I’m also a big believer in teaching something to learn it more thoroughly. :brain:

3 Likes

Forth immediate words are very much like comptime.

1 Like

Oooh, good one!

I have a copy of Leo Brodie’s Starting Forth book here on my shelf, but I never quite made it to the last chapter where I would have seen this:

https://www.forth.com/starting-forth/11-forth-compiler-defining-words/

We have used the term “run time” when referring to things that occur when a word is executed and “compile time” when referring to things that happen when a word is compiled. So far so good. But things get a little confusing when a single word has both a run-time and a compile-time behavior.

The word IMMEDIATE makes a word “immediate.” It is used in the form:

: name definition ; IMMEDIATE

that is, it is executed right after the compilation of the definition.

And there’s also POSTPONE:

The word POSTPONE can also be used to compile an immediate word as though it were not immediate.

And the left [ and right ] square bracket words:

There are two other compiler control words you should know. The words [ and ] can be used inside a colon definition to stop compilation and start it again, respectively. Whatever words appear between them will be executed “immediately”, i.e., at compile time.
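
The closest Zig equivalent to [ and ] that I can think of is a comptime block in the middle of an otherwise ordinary function. Very loosely (my comparison, not from the book):

const std = @import("std");

fn greeting() [5]u8 {
    return comptime blk: {
        // Everything in this block runs while the program is being
        // compiled, a bit like words between [ and ].
        var s = [5]u8{ 'h', 'e', 'l', 'l', 'o' };
        s[0] = 'H';
        break :blk s;
    };
}

test "comptime block executed during compilation" {
    const g = greeting();
    try std.testing.expectEqualStrings("Hello", &g);
}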

I loved the time I spent bending my mind with FORTH. Plan to do it again someday.

2 Likes

For question one, Jai can also evaluate arbitrary code at compile time using the #run directive. I haven’t looked too deeply into the language, so I’m not sure whether Zig borrows from it.

1 Like

Yeah, from what I can tell, you’re right about the similarities. I think Jai’s #run is less constrained than Zig’s comptime (I remember seeing one of Blow’s earliest demos playing Tetris during compile time - and if I remember correctly, having the compiled output depend on the game score or something like that, which was hilarious and awesome), but beyond that, I’m not sure how to compare and contrast them.

As for Zig borrowing from Jai, the timeline makes it possible (Jai was announced in 2014, Zig in 2016), but I get the impression Zig’s design mostly stems from @andrewrk’s meditations on the pitfalls of C and has evolved from there.

The way Jai does it is a footgun and a half. It runs the code in the compiler’s memory space, on the host system, whereas Zig’s comptime emulates the target and forbids runtime side-effects. So with Jai you can crash the compiler, cause side-effects during compilation, and accidentally introduce host dependencies into your compilation. Because of this, Jai’s cross-compilation story is hopeless.

5 Likes

This is a point you make often when explaining comptime - do you (or does anyone else) have an example of where this target emulation makes comptime more useful or powerful than it would be without it?

1 Like

If I understand correctly, he’s saying that comptime should run the same way whether you’re compiling natively on your target or cross-compiling to that target.

If you don’t emulate your target’s comptime semantics when cross-compiling, then by definition, you’ll get different results than if you were compiling natively.

That being said, I’m not sure how Zig’s comptime behavior changes across targets; I’d be interested to know.

1 Like

Thanks, that makes sense - I use comptime so much for things like type construction and inlining (which don’t really depend on the target) that I completely forget how many kinds of code can run at comptime.

As a simple example of your point about cross-compilation: even just @bitCast would have to emulate the target’s endianness to give consistent results.
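
Here’s a tiny illustration of that (my sketch, using std.mem.nativeToLittle rather than a raw @bitCast, and assuming a recent Zig where std.builtin.Endian has lowercase tags): the constant is computed at compile time, yet its value depends on the target’s byte order, so a cross-compiler has to emulate the target rather than reuse the host’s behavior.

const std = @import("std");
const builtin = @import("builtin");

// Computed at compile time, but the result depends on the target.
const on_disk = std.mem.nativeToLittle(u32, 0x11223344);

test "comptime results can depend on target endianness" {
    // .little/.big tags assume Zig 0.12+ (older versions used .Little/.Big).
    const expected: u32 = switch (builtin.cpu.arch.endian()) {
        .little => 0x11223344, // already little-endian, no swap
        .big => 0x44332211, // byte-swapped for big-endian targets
    };
    try std.testing.expectEqual(expected, on_disk);
}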

3 Likes

Thanks @marler8997 and @jdknezek, that’s a whole angle of comptime I hadn’t even considered yet. The @bitCast() builtin seems like a great example. Very interesting!

Elixir supports compile-time code, in addition to metaprogramming, pretty well:

defmodule Example do
  if Mix.env() == :prod do
    def logger() do
      Sentry.Logger
    end
  else
    require Logger

    def logger() do
      Logger
    end
  end

  # use the logger()
end

Edit:
In Elixir, this goes as far as allowing you to generate documentation at compile time.

For example, I use the following code to ensure my documentation always matches my enumeration of possible states:

defmodule MyApp.ChangeApprovalModule do

  # Compile time define possible statuses and their description
  @approval_statuses_spec [
    {:pending_approval, "the changes are yet to be reviewed and accepted"},
    {:approved, "the changes have been confirmed by an inspection officer"},
    {:deleted, "the changes were once in an `:approved` or `:pending_approval` state and were deleted either by the approval of changes that were `:pending_approval` or an updating of the `:pending_approval` list."}
  ]

  # Compile time extract the status name in the spec tuple
  # @approval_statuses is now [:pending_approval, :approved, :deleted]
  @approval_statuses @approval_statuses_spec |> Enum.map(fn {status, _desc} -> status end)


  # Compile time define the documentation by interpolating the spec into the documentation
  @moduledoc """
  Changes are in one of three states. They may be:
  #{
    @approval_statuses_spec
    |> Enum.map(fn {type, details} -> "* `#{type}` - #{details}\n" end)
  }
  """
end

This is how the module is defined in my IDE.

2 Likes