[2020], and written for IOCCC: The International Obfuscated C Code Contest.
This was awarded "Best of Show - abuse of libc" at the time[0]. See also the judges' remarks[1]:
This program consists of a single printf(3) statement wrapped in a while loop. You would not think that this would amount to much, but you would be very, very wrong. A clue to what is happening and how this works is encoded in the ASCII art of the program source.
How did we end up with printf - within a loop - being Turing-complete? Was it designed that way from the beginning? Were new features added over time until we got there?
Having something Turing-complete is surprisingly easy, and it hides everywhere. The repository has a small document that explains how you can use printf() as a computer: it can perform addition, logical OR, and negation, which is enough.
It was unintentional, but Ken Thompson being Ken Thompson, you can't be 100% sure.
It was probably unintentional, yeah, I don't recall any mentions of early printf being overloaded to do stuff, nor is it clear why you would do that since you're using it in a much more convenient Turing-complete language already (C).
[0] https://www.ioccc.org/2020/index.html
[1] https://www.ioccc.org/2020/carlini/index.html