RE: “There is only one OS, and it’s been obsolete for decades”

Today I ran across an article berating the legacy of computing in general, complaining that the major operating systems of today are obsolete and therefore bad. Since the article is a veritable jungle of inaccuracies, mistakes and deliberate falsehoods, and carries a prevailing sentiment of non-constructive pessimism, I am going to sift through it piece by piece and do my best to explain why it is an incredibly dim-witted and inaccurate message to take home about computing and programming. You can read the original article here; it is written by someone who apparently goes by the name Joel.

Setting the stage

Joel opens the article with a pretty ballsy claim: there is only one operating system, and it has long been obsolete. This notion quickly dissolves as it is further explained that, in his view, all relevant operating systems fall under the umbrella of UNIX. That would make sense for several if not most OSes – Linux and the *BSDs bear a great deal of respect for the philosophies that immortalised UNIX, and macOS holds a licence to call itself a UNIX OS under the trademark – but he lumps these together with Microsoft Windows, citing ‘many similarities’ to the original OS, even though Windows descends from MS-DOS, which descends from QDOS, a clone of the CP/M operating system that IBM had sought in the early 1980s for its famed IBM PC. Joel says that ‘whether there is a dependency does not affect [his] point’, and I suppose it is this lack of shared history that he is dismissing here.

So from the start, we have a questionable and conveniently specific definition of UNIX that effectively means ‘all relevant OSes today’. While he doesn’t go into further detail about how he arrives at this, he says it amounts to a monopoly, whose effects he explains in the rest of the article.

“Portable assembly”

To start, a few quotes from the section on OS interfaces:

Let us note that Windows, Unix, and hence Linux and Mac, are written in C.

It’s also true that much of the Windows OS is written in C++, and Mac/iOS in Objective-C. I won’t talk about Objective-C, as what little I know about it suggests that it does not represent computing quite so outdated as C and C++. And as for C++, well, forgive me if I draw attention to it being largely a superset of C, and that while COM may be considered a part of Windows, the kernel along with the vast majority of system libraries show no evidence of C++.

Here we can see how clueless he is about the inner workings of the very things he’s about to berate in the name of ‘UNIX’. He admittedly knows next to nothing about Objective-C, yet believes it is not as ‘outdated’ as C and C++. Let us note that both of those languages were updated by ISO as recently as 2011, whereas Apple announced the current major revision of Objective-C (2.0) back at WWDC 2006 and has been slowly phasing the language out across its platforms in favour of Swift. He then makes the unverifiable claim that ‘the kernel along with the vast majority of system libraries show no evidence of C++’ in reference to Windows NT, and conflates that with a vague mention of the Component Object Model, a language-agnostic ABI standard Microsoft introduced in 1993. By the looks of this, we’re really not getting off to a good start.

Now: Unix is an operating system that was developed in the 1970s. Its basic ideas thus reflect computing as it was in the 1970s.

Now we’re getting into claims that are either borne of complete and utter ignorance of recent history, or an intellectual dishonesty that aims to handwave away the labour and genius of the philosophy UNIX carries on today. The quote acknowledges nothing of UNIX in the abstract, and this seems deliberate: the author goes on to attack UNIX as a practical vestige while ignoring every theory and abstraction without which computers would still be canisters of wires. It is as if to say, “The internal combustion engine is an invention that was developed in the mid 19th century. Thus, its basic ideas reflect technology as it was in the mid 19th century.” This is a complete failure to acknowledge the cascading effects of the invention, discarding everything it had a hand in enabling and conflating it with an ancient device that has long since fallen out of use in all sectors of society. It reduces the meaning of UNIX – and, coupled with the special definition of UNIX detailed earlier, the meaning of every major operating system today.

C was, in fact, developed specifically for Unix, as an alternative to writing system software in non-portable assembly. Hence, it also reflects computing as it was in the early 1970s.

At first glance, this claim appears a lot more reasonable than the earlier rebuke of UNIX. Alas, it falls victim to the same temporal error we went over before: C was popularised in the late 1970s, beginning with a book first published in 1978 by Brian Kernighan and Dennis Ritchie called “The C Programming Language”. The language was first standardised by ANSI in 1989 and has been standardised several times since by ISO (most recently in 2011), so it has undergone many significant revisions since its conception. The author had a chance to explain the true setback C presents – backwards compatibility – but instead opted to say it is old, and therefore outdated, and therefore bad. The standardising committee has also held a set of tenets since time immemorial regarding its objectives in improving C, which would imply that C is not merely a reflection of computing in the 1970s, but that was omitted as well.

The next few paragraphs go on to chalk C up as obsolete by design by virtue of Moore’s Law, saying that ‘computing in the 1970s was a radically different place than it is today’. This completely ignores the shift of burden C underwent alongside that transition, towards targeting all manner of embedded devices, microcontrollers and firmware images that demand Commodore-like constraints to this day. He says “the creators of Unix were dealing with machines having a mere 8K memory words”, when Nintendo’s Game Boy Advance offers 32KiB of memory on-die and another 256KiB on the board to work with, and games were made for it well into the 21st century. In summary, the claim that “we’ve outgrown C” is demonstrably false – and even if it were true, it would be a red herring, because outgrown or not, operating systems still use C, so it remains necessary for getting some kinds of work done. We’ll go into the bill of attainder against C’s design for performance next.

In addition to the many features that would simply make life easier (nested functions, better types, proper error handling) not being available in C, the language also omits important safeguards against inevitable human error. Now, again, perhaps this was appropriate for the world of 1971, [...]

This goes to show the author hasn’t an inkling of what is involved in creating the things he just rattled off as features. C does not mangle symbol names for functions, and this was a deliberate design decision to keep the language simple, portable and performant. Implementing nested functions in general would require name mangling beyond private implementation details, and in the meantime much the same effect is achieved with a static qualifier and a name the compiler doesn’t complain about.

I don’t suppose he realises this, but the very same line of reasoning is what made C’s built-in primitives the way they are. The standard could not presume the size and nature of a char or an int, and elected to leave it to implementations to figure out, which they duly did. Yes, it sounds unfathomable, but there are platforms where a char is not guaranteed to be 8 bits! No, they’re not sitting in a museum – you probably know somebody who owns a machine with one inside! Plenty of DSPs define char as 16 bits wide, because their hardware cannot address memory at any smaller granularity. Thankfully, you rarely have to worry about this, because ISO and your compiler vendor are doing everything in their power to make C work the way you expect; since C99 there has even been a standard header, <stdint.h>, that defines exact-width integer types for when you need them.

The last thing he mentions here is ‘proper error handling’, and since the prevailing norm for that is exceptions, let me explain that this too is a deliberate design decision made for very practical reasons, chiefly portability. C features no exception handling because it was impractical early in the language’s existence and not easily made backwards-compatible by the time the standards bodies were pressed to introduce it. Besides being incredibly complex to implement, and outright impossible without some degree of runtime support, exceptions bring a complexity that is arguably out of scope for C – and moot anyway, since C++ has had them since its first standard in 1998, shouldering the compatibility burden mentioned earlier. Wow, it’s almost like C is the ultimate portable programming language! Who would’ve thought that being portably performant has drawbacks?
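
To make the point about primitives concrete, here is a minimal C sketch of my own (not from the article) showing how a program can ask the implementation what it actually chose for char and int, and how the C99 headers hand you exact widths on the occasions you genuinely need them:

    #include <inttypes.h> /* PRIX32 and friends (pulls in <stdint.h>, C99)        */
    #include <limits.h>   /* CHAR_BIT: how many bits this implementation gives a char */
    #include <stdio.h>

    int main(void)
    {
        /* The standard only promises CHAR_BIT >= 8; the platform decides the rest. */
        printf("char is %d bits, int is %zu bits here\n",
               CHAR_BIT, sizeof(int) * CHAR_BIT);

        /* When a format genuinely needs a fixed width, say so explicitly. */
        uint32_t magic = UINT32_C(0xDEADBEEF);
        printf("magic = 0x%" PRIX32 "\n", magic);
        return 0;
    }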

How many billions of dollars in damage have been caused by, say, the simple lack of bounds-checking on memory buffers? How many computer systems have been rendered unusable, their files ransomed or obliterated, by malware that found its way in through buffer overflows as a result of C’s standard library? The victims were all sacrificed on the altars of “Inertia” posing as “Efficiency”.

Here’s a big fear-mongery appeal to emotion, salted with overzealous evangelist undertones and metaphors of religious orthodoxy and evil empires. Nothing but hacky charlatanry. Rather than addressing this issue with C head-on, the author slithers around it and tries to tear down the entirety of C, and by extension UNIX, using these shameful manipulations of conscience. Let me ask a question about C and safety: what could anyone actually do instead? Can we travel back in time and never invent C, given how inherently terrible it is by this account? Is our recourse, having lost billions of dollars to buffer overflows and stack smashing, to spend billions of dollars rewriting every library in existence in Rust, as if that were even remotely practical? He presents no solution to the problem at all, so this is nothing more than destructive criticism bearing no answers and no realistic alternative. By golly, if C simply “isn’t good enough”, then say that. Don’t paint it as some irredeemable monstrosity of bad design and obsolescence that needs to be wiped off the face of the earth, because it isn’t. I haven’t even delved into the countless feats of performance optimisation in C that could only exist as early as they did because the language doesn’t pay for safety. Safety complicates implementations considerably, so it’s contestable that the ‘shoulda-woulda-coulda’ argument would even solve the safety problem.

“The console is terrible”

In section 2, Joel goes over a brief history of the teletype and explains how it is the ancestor of the modern console in use today. He explains that the static, textual nature of printers has made consoles a very limiting medium of expression, ignoring the existence of terminal emulators (both with and without display servers), 8-bit colour modes, curses for interactive displays, mouse support, internationalisation and endless programmability (read: extensibility) as features brought to the TTY since the days of the old teletype. Then he rattles off various UNIX commands and signal shorthands that exist that way because teletypes were old and clunky and short on memory, and totally not because they take less time to type, even on a computer keyboard (especially on a computer keyboard, duh!). The world would indeed be that terrible if all of those things somehow didn’t exist – but they do, so this doesn’t hold much water.
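
For what it’s worth, the ‘static and textual’ framing is trivially easy to poke at in C itself. Here is a tiny sketch of my own (assuming an ANSI/xterm-compatible terminal emulator, which is to say virtually any of them) that walks the 8-bit colour palette mentioned above:

    #include <stdio.h>

    /* Paint swatches of the 256-colour (8-bit) palette that practically every
     * modern terminal emulator understands. Assumes an ANSI/xterm-style TTY. */
    int main(void)
    {
        for (int colour = 0; colour < 256; ++colour) {
            /* ESC [ 48 ; 5 ; n m selects background colour n from the palette. */
            printf("\x1b[48;5;%dm  ", colour);
            if ((colour + 1) % 16 == 0)
                printf("\x1b[0m\n"); /* reset attributes at the end of each row */
        }
        return 0;
    }

Curses, mouse reporting and the rest are much the same story: capabilities layered onto the byte stream, no teletype required.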

“Everything is a file”

Let’s look at this fundamental pillar of the Unix philosophy. What, exactly, is a file?

“A stream of bytes”? Really? What sort of a computing abstraction is “everything is a stream of bytes”??

At the very least, it’s an abstraction that didn’t exist until UNIX was invented. He goes on to simplify the definition from bytes to bits, and fails to recognise the magnitude of the concept he’s throwing under the bus here and filing away under “stuff we already know”. You wouldn’t know it at all if it weren’t for UNIX, buddy! Before then, the norm in computing was gigantic boxes of switches and wires, and the notion of files hadn’t been realised in theory to this extent by anyone. Ritchie and Thompson gained international academic acclaim for this. Nobody’s asking you to kiss their shoes, but you should have more perspective than this about the philosophy before you critique it.
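
To put the ‘stream of bytes’ idea in concrete terms, here is a little POSIX sketch of my own – the paths are just examples – where the very same open/read calls work on an ordinary file and on a character device, which is precisely the uniformity being dismissed:

    #include <fcntl.h>  /* open        */
    #include <stdio.h>
    #include <unistd.h> /* read, close */

    /* Dump the first few bytes of whatever the path names: a regular file,
     * a character device, a named pipe -- the calling code never changes. */
    static void peek(const char *path)
    {
        unsigned char buf[8];
        int fd = open(path, O_RDONLY);
        if (fd < 0) {
            perror(path);
            return;
        }
        ssize_t n = read(fd, buf, sizeof buf);
        printf("%s:", path);
        for (ssize_t i = 0; i < n; ++i)
            printf(" %02x", buf[i]);
        printf("\n");
        close(fd);
    }

    int main(void)
    {
        peek("/etc/hostname"); /* an ordinary file   */
        peek("/dev/urandom");  /* a character device */
        return 0;
    }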

The next target of the article’s berating is text.

“An obsession with unstructured text.”

This section presents a false dichotomy between ‘binary files’ and ‘text files’, showcasing a hexdump of each by his account. He then defines ‘binary file’ relatively, as merely “not a text file” – which says nothing, since all files are streams of data (as he himself said earlier). A binary file carries no inherent format or meaning of its own, and he explains that the benefit of text files is that ASCII exists, giving a standard encoding for reading plain text. Pretty basic stuff. But then, this happens:

Recall what happened: we decided to represent characters of text, including punctuation and whitespace, using fixed numbers (a practical, though rather dubious, decision; again, a product of early computing). We then developed tools to display said text and modify it using input devices.

And then we complained that binary files are hard to read and write, using these tools. Well pardon my French, but no shit, Sherlock!

The tirade continues by declaring the UNIX file model a big circle of nonsense, failing to acknowledge even the existence of file systems, directory hierarchies, file names and extensions, MIME types, block and character devices, networking with files, and a plethora of other vital characteristics of the ‘everything is a file’ tenet of UNIX. No, apparently it’s terrible because we lack the will to open a file and see whether it’s actually legible. This barely qualifies as trying!
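
And since the text/binary split above is really just a choice of encoding, here is a throwaway C sketch of my own that writes the same value both ways; the ‘binary’ version is only opaque until its layout is documented, which is exactly the situation text was in before ASCII:

    #include <inttypes.h> /* PRIu32 */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t value = 1979; /* an arbitrary example value */

        /* "Text": the value rendered as ASCII digits plus a newline. */
        FILE *t = fopen("value.txt", "w");
        if (t) {
            fprintf(t, "%" PRIu32 "\n", value);
            fclose(t);
        }

        /* "Binary": the same value as four raw bytes in host byte order --
         * opaque only until the layout is agreed upon, exactly like text. */
        FILE *b = fopen("value.bin", "wb");
        if (b) {
            fwrite(&value, sizeof value, 1, b);
            fclose(b);
        }
        return 0;
    }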

The tyranny of parsers

The big problem with text-as-ASCII (or Unicode, or whatever) is that it only has structure for the humans who read it, not the computer. To the computer it is still just a long list of numbers; barely any more structured than a stream of bits. So, in order to do anything with it (such as compile source code), it needs to be parsed.

And after parsing, it is still a long list of numbers to the computer. Transforming text into another representation does not make it intrinsically more meaningful to the machine; everything is data and instructions at the end of the day. Humans give a parser its meaning and purpose just as they do ASCII or Unicode – so this isn’t really saying a whole lot.

Now, this is always going to be necessary for anything resembling language, as at some level there are always lists of ‘things’–letters, words, clauses, sentences, paragraphs–that can be structured by humans.

I guess this is where the earlier conflation of purpose comes home to roost: later on he takes it and asserts the (false) claim that an Abstract Syntax Tree (that is, a ‘binary representation of language’) is a prerequisite for text, for as long as you hope to do anything useful with it. At that point we are at least two levels of abstraction above ‘binary data’, which is precisely why ASTs are desirable as an alternative representation of that higher-level content: so it can be used effectively (parsed) and stored effectively (serialised). This section is a mess because the concept of language has been conflated with text itself, as if ASCII mandated an AST and a parser, which it certainly does not. His claims that text files inherently require parsing, and that their ASTs are ‘thrown away’ rather than cached, are therefore false. His complaints against ‘Unix culture’ and programming in general – decrying people ‘inventing new text file languages’ and ‘propping up optimisations from half a century ago’ – have no basis.
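
To make the layering explicit, here is a toy C sketch of my own – the grammar is hypothetical, just single-digit additions like ‘3+4’ – showing that an AST is a representation you choose to build over the text, not something ASCII forces on you:

    #include <stdio.h>

    /* A toy AST for expressions of the form "d+d" (single digits only).
     * The tree is a layer we choose to build over the text; the text itself
     * never demanded one. */
    struct node {
        char op;               /* '+' for an addition node, 0 for a leaf */
        int value;             /* digit value when the node is a leaf    */
        struct node *lhs, *rhs;
    };

    /* Parse "d+d" into a three-node tree held in caller-provided storage. */
    static struct node *parse(const char *text, struct node storage[3])
    {
        if (text[0] < '0' || text[0] > '9' || text[1] != '+' ||
            text[2] < '0' || text[2] > '9')
            return NULL; /* not in our toy grammar */

        storage[0] = (struct node){ 0, text[0] - '0', NULL, NULL };
        storage[1] = (struct node){ 0, text[2] - '0', NULL, NULL };
        storage[2] = (struct node){ '+', 0, &storage[0], &storage[1] };
        return &storage[2];
    }

    int main(void)
    {
        struct node storage[3];
        struct node *root = parse("3+4", storage);
        if (root) /* evaluate the tree: this is where the AST earns its keep */
            printf("%d\n", root->lhs->value + root->rhs->value);
        return 0;
    }

Cache it, serialise it, throw it away – that is a decision the toolchain makes, not one the text format forces.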

Processes and debugging + Parser Tyranny: Round 2

Most of the content here consists of elementary explanations of a kernel’s userspace and its processes, which he conflates with the parser-tyranny nonsense from earlier and pretends is some ghastly burden on OS infrastructure, even though no such burden exists. As with most of the article, there’s an overabundance of bolded text, in case you’re really sleepy and having a hard time following along.

The tirade on debugging is very simple to break down: debugging symbols are stripped from release builds of native programs shipped to the public for both performance and security reasons, and the author has no idea that the errors users report are readily tracked down and solved using the debugging symbols kept available to developers. After all, we don’t generate them for nothing! Then he goes on to complain that debugging is hard, and a hundred times harder with C++ because C++ is complicated. Nowhere to be found is any mention of debugging software – Visual Studio, GDB, LLDB, Valgrind, or anything else that’s practically vital for an employable systems programmer. I guess all he knows is printf debugging, and yes, that would suck if it were the only way to do it. It isn’t.

GUIs are terrible, by association with C

Skipping the first two paragraphs, since they amount to little more than branding the rest of the system guilty by association with the Evil C Empire, we continue to the third paragraph to find something absolutely insane:

If you want to increase the font size of some window, then the application designer must have already explicitly accounted for this. If they didn’t, then all you have is a big block of bytes that you know must use other blocks of bytes, but what for and where and why is completely unknown and opaque, because any structure that is unnecessary for the computer has been optimised into oblivion. And if this is the debugger’s GUI we’re talking about, then you’d need to debug it with another debugger to have any hope of getting what you want! And even if you succeeded in changing the process at runtime, getting these to persist past the “end of the world” is another responsibility entirely.

There is nothing in this paragraph with any basis in reality. If you want to do anything at all in any computer program ever, the developer has to have accounted for it beforehand! Has your hand been held by Chromium’s developer tools for so long that you think any of this stuff is a given, or somehow to be expected? You don’t have a big jumbled mess of blocks; you have a computer program compiled to run in memory, doing what you told it to do. If that weren’t the case, your disks would have been wiped long before you had the chance to write this hogwash. There is no chicken-and-egg problem with debuggers either, and half of me doesn’t even want to know what persisting past the “end of the world” is supposed to mean.

The ‘conclusion’

I certainly hope that, in twenty years time, Unix and its influence will have been forever banished to the history books, where it should already live. Or, all our other options will have been exhausted and found to be worse. But I’ll let you be the judge of how likely that is.

No need to hope for that: it’s already in the history books, and has been for over twenty years. Unfortunately, its influence isn’t going anywhere any time soon, and I’m afraid you’ll have to live with that along with the rest of us, because there’s no revolution or time machine in sight.


Bringing back the Thursday Nite Rant, for dril.