This is a repurposed CIX comment. It goes on a bit. Sorry for the length. I hope it amuses.
So, today, a friend of mine accused me of getting carried away after reading a third-generation Lisp enthusiast's blog. I had to laugh.
The actual history is a bit bigger, a bit deeper.
The germ was this:
https://www.theinquirer.net/inquirer/news/1025786/the-amiga-dead-long-live-amiga
That story did very well, amazing my editor, and he asked for more retro stuff. I went digging. I'm always looking for niches which I can find out about and then write about -- most recently, it has been containers and container tech. But once something goes mainstream and everyone's writing about it, then the chance is gone.
I went looking for other retro tech news stories. I wrote about RISC OS, about FPGA emulation, about OSes such as Oberon and Taos/Elate.
The more I learned, the more I discovered how much the whole spectrum of commercial general-purpose computing is just a tiny and very narrow slice of what's been tried in OS design. There is some amazingly weird and outré stuff out there.
Many of these systems still have fierce admirers. That's the nature of people. But it also means that there's interesting in-depth analysis of some of this tech.
It's led to pieces like this which were fun to research:
http://www.theregister.co.uk/Print/2013/11/01/25_alternative_pc_operating_systems/
I found 2 things.
One, most of the retro-computers that people rave about -- mainstream stuff like Amigas or Sinclair Spectrums -- are actually relatively homogeneous compared to the really weird stuff. And most of them died without issue. People are still making clone Spectrums of various forms, but they're not advancing it and it didn't go anywhere.
The BBC Micro begat the Archimedes and the ARM. Its descendants are everywhere. But the software is all but dead, and perhaps justifiably. It was clever but of no great technical merit. Ditto the Amiga, although AROS on low-cost ARM kit has some potential. Haiku, too.
So I went looking for obscure old computers. Ones that people would _not_ read about much. And that people could relate to -- so I focussed on my own biases: I find machines that can run a GUI, or at least do something with graphics, more interesting than the purely text-based ones that came before.
There are, of course, tons of the things. So I needed to narrow it down a bit.
Like the "Beckypedia" feature on Guy Garvey's radio show, I went looking for stuff of which I could say...
"And why am I telling you this? Because you need to know."
So, I went looking for stuff that was genuinely, deeply, seriously different -- and ideally, stuff that had some pervasive influence.
That's led to something encapsulated by a comment I read recently -- can't recall where, but it's not important. The comment was that some programming languages make you look at the world in such a different way that from then on, you see everything differently. And that two of the big ones are Smalltalk and Lisp.
Well, people don't know much about Smalltalk machines, but their historical significance is well known, inasmuch as they inspired the GUI and the Apple Lisa and Macintosh. Besides, Smalltalk is not dead. It's very much around on other OSes, as a language.
And even on the native Smalltalk machines, they weren't Smalltalk on the metal: underneath was a relatively conventional (for the time) OS written in Mesa.
So not so much retro stuff to write about there.
The other language-that-changes-your-view-of-the-world is Lisp. And there, I uncovered a vast iceberg, a whole alternate universe of computing that never quite happened. And once you know about it, you see little signs of it peeking through all over the landscape of IT in apparently-unconnected places. I am still trying to find out enough about it to explain it coherently, and looking for ways to do so. It's such a big subject that I am still digging, still learning, 7 years later.
Note: this is not even remotely the same as wanting to learn to program in Lisp.
So while I go exploring this landscape, trying to build up a coherent picture and trying to find ways to explain it and its importance, I am looking at other unconventional areas of computing.
Some of them are relatively trivial implementation details, like tablets versus laptops -- they don't go a lot deeper than UI and HCI stuff. Interesting but not profound, although it shapes the fate of multi-billion-dollar companies.
Others are far more important, although on the surface they may well result in machines that look near-identical to their owners and users. Such as single-level store machines.
And in trying to explore these ideas, talk about them, experiment with ways of talking about them, my sole remaining use for CIX is this conference, to bounce around some ideas.
The responses are usually furious and vitriolic.
Depending on my mood, I find this either interesting, amusing, or just dispiriting.
One angle of the picture that I am developing overall is that human culture has a far more important effect on the development of computing than people realise.
Different languages, different OSes, different ways of working deeply affect the perceptions of those in them.
I am a multiplatform sort of guy and always have been. I have, at times, been a Sinclair Spectrum user, then an Acorn Archimedes user, then an OS/2 2 user. I surrendered to Windows for a while, because I couldn't afford a Mac. Then I went to Linux, and was also an early adopter of Mac OS X.
Now I'm very much on the Unix side of things, but only passing through. I use Macs, I use Linux, I avoid Windows. Despite that, I know more about the innards and history of Windows than almost anyone I know, because I spent a quarter of a century supporting the damned thing. Still occasionally do but thankfully very rarely now. I am not some shell-scripting devops type, I'm not a Pythonista or anything. I just use 'em. I was a deep Windows guru once but now only dabble. I don't do any of the modern gubbins like Active Directory and really don't want to.
Amusingly, this means I have feet in both camps. I work with Windows-based corporate clients, who regard Linux as a silly toy of no real use or importance. I have often been called a member of the Linux taliban, a fundamentalist penguin-fondler. I've had to sign contracts that I will never mention Free Software to customers. Seriously.
I also work with Linux companies and users. They regard Windows as an obsolete curiosity, a toy that's no longer of any importance or relevance to anything. They sometimes mock me for the fact that all my laptops dual-boot and I still fiddle with Windows 10.
Many of my friends in Western Europe use Macs now. I get on with them. Here in Eastern Europe they are extremely rare and only the rich use them. My friends all run pirated Windows and pirated Windows apps and regard me as a crank for using these silly fashion-victim toys.
But my Apple-owning friends cannot fathom why I don't like iOS and don't use an iPhone and an iPad. They simply cannot assimilate that I don't like them much and regard them as overpriced.
Everyone sees me as a member of the other camp, and thus hideously biased and blind to core truths.
That amuses me vastly.
In here, in a den of grey-bearded, dungaree-clad grumpy old Unix heads, as per Dilbert:
http://dilbert.com/strip/1995-06-24
... I don't fit in either. That's OK. :-)
So, back to culture.
There are various different cultures of computing. In terms of living ones, Windows is one, Linux is another, the bigger world of Unix is another. Mainframes are another.
But as I said with retrocomputing, when you step back and take a bigger look, you can see that actually, these are all very close kin. Windows and Unix are almost twin siblings compared to classic MacOS, and MacOS is far more like Windows (well, DOS, given the time) in some ways than Lisa OS was.
Compared to a Smalltalk box or a Lisp Machine, they're all almost indistinguishable.
Windows users are used to drive letters, not being able to replace the GUI, to using MS tools and protocols and so on. Unix types see all that as fungible, with their one big filesystem tree, everything's-a-file philosophy and so on. They sneer at drive letters.
Then you look at Plan 9 and realise that everything in Unix isn't a file, and many of the files are special or magic. Plan 9 redefines a lot of stuff and integrates network awareness deeply in a way Unix never did. It makes bolt-ons like NFS and rsh look very silly.
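As a concrete (if trivial) illustration of the "magic file" point, here's a little sketch in Python, assuming a Linux box, where /proc is exactly such a synthetic filesystem: you use the ordinary file API, but nothing you read ever lived on a disk. Plan 9 simply takes that trick seriously, applies it everywhere, and extends it across the network.

```python
# A rough sketch (Linux-specific): process state exposed as a "file".
# /proc/self/status is synthesised by the kernel on every read; there is
# no file on disk behind it, yet the ordinary file API works unchanged.

def read_proc_status(fields=("Name", "Pid", "VmRSS")):
    """Return a few fields from /proc/self/status as a dict."""
    result = {}
    with open("/proc/self/status") as f:      # plain old open()/read...
        for line in f:                        # ...on a file the kernel invents
            key, _, value = line.partition(":")
            if key in fields:
                result[key] = value.strip()
    return result

if __name__ == "__main__":
    print(read_proc_status())
    # e.g. {'Name': 'python3', 'Pid': '12345', 'VmRSS': '9876 kB'}
```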
Then you look at Inferno and realise that Plan 9 is still deeply tied to the mindset of writing in C and compiling it for a particular CPU, which itself makes the network-transparency a bit of a mockery. Inferno also makes the JVM look like a sad kludge.
Then you look at Tao Intent, and realise that Inferno's attempt at CPU-independence was a bit of a damp squib, with all these different versions of Inferno targeted at different platforms, including running as a browser plugin.
And when you get to that sort of level, then you realise that the mindset of text files versus executable binaries, and writing in an editor and passing it through a compiler and a linker, and trying out the result -- why, modern Windows and Unix are remarkably similar, really. They just have different compilers and arrange the filesystem differently, but conceptually, they're very very alike. And it makes sense that modern scripting languages can run on both without any problem at all.
And that’s one of the cultures I’m talking about. In this cultural context, which language you use is very important. Some need to be compiled, some don’t. Some are type-safe, some aren’t. Some run mainly in browsers, some on OS-hosted interpreters, some in specialist VMs, and some run on the bare OS. Some are good for web apps, some for local interactive apps, some for server automation. Some are faster to write, but run slower; some are slower to write, but run faster.
And there are endless squabbles about which is better for what. People even fight over the indentation style.
Then you visit a different world — for example, RISC OS. A very good BASIC is part of the OS, in the ROM. It’s fast, it’s OS-integrated. So there is no debate about scripting languages and so on: from a quick installer, to simple GUI app dev, it’s right there. For serious stuff, well, the OS has its own proprietary C compiler. So the debate just goes away. There’s no current Python or Perl or JVM or any of that, so all those considerations just disappear.
But this leads to another different world — where a decent high-level language is an integral part of the OS. That leads to Lisp Machines, to Smalltalk boxes, to Newton OS and the abortive Dylan-based version of it that never happened. Where there’s only one language, and it does everything, and all the arguing about compilers versus interpreters versus VMs just isn’t a consideration.
From that position, even things like Plan 9 and Inferno, even Intent, they’re all a bit primitive. Pass source through a compiler and link it to other static compiled code? Really? You still do that? That’s so quaint! And you can’t test it until it’s compiled? But then you can’t modify the code or tweak a variable and check without recompiling? And that’s normal for you?
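To make that contrast a bit more concrete, here is a rough analogy as a Python REPL transcript (a sketch only, and a pale one: Python is itself very much a child of the C world, and in a real Lisp or Smalltalk image this kind of live redefinition applies to the whole running system, not just to your own little functions):

```python
# A pale analogy to image-based development: typed into a live interpreter,
# with no edit/compile/link/run cycle. The running session is changed in place.

>>> def greet(name):
...     return "Hello, " + name          # first attempt
...
>>> users = ["Ada", "Alan"]
>>> [greet(u) for u in users]
['Hello, Ada', 'Hello, Alan']
>>> def greet(name):                     # redefine it while the session runs;
...     return "Hi, " + name + "!"       # existing state ('users') is untouched
...
>>> [greet(u) for u in users]
['Hi, Ada!', 'Hi, Alan!']
```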
There is a lot of intellectual baggage in the contemporary era (meaning the last couple of decades) of Windows/Linux/Mac computing, but it’s so pervasive that I strongly suspect that people do not realise it’s culture. That if you want performant code, you _must_ use a relatively low-level language that’s “close to the metal”. C or C++. The deep inherent problems of using such type-unsafe, un-bounds-checked languages are just a given: all the OSes and all the apps people know are written this way. Of course that’s how you do it. It’s just how it is. You just need to be careful. The risks aren’t that bad, because everyone takes them. There’s no realistic alternative.
Less critical stuff can be done in various safer scripting or VM-based languages. The fact that those are, of course, themselves written in C/C++, and therefore not safe either, is quietly ignored, because everything is. Matthew 7:26, houses built on sand, etc.
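To put “safer” in concrete terms (a trivial sketch, nothing more): the class of mistake that an un-bounds-checked C program lets through silently is stopped dead, at the exact point of the error, in a checked language.

```python
# Sketch: an off-by-one error in a bounds-checked language.
# In unchecked C, buffer[3] here would quietly read (or, on a write,
# scribble on) whatever happens to sit next to the array in memory.

buffer = [10, 20, 30]              # three elements: indices 0, 1, 2

try:
    for i in range(4):             # off-by-one: loops one index too far
        print(buffer[i])
except IndexError as exc:
    print("caught:", exc)          # caught: list index out of range
```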
There’s vague awareness that there were different approaches — e.g. Delphi, VB. But they’re not taken very seriously by anyone any more. VB evolved into .NET, sort of Microsoft’s proprietary rival to Java and the JVM, so not for anything performance critical.
The C family is so pervasive in this culture that anything else is not serious, not practical. It’s too obscure, or an academic toy, or too slow. To anyone immersed in this culture, there appears to be massive diversity, and it blinds people to the fact that the choice is really C/C++, or something implemented in C/C++ which has some degree of C-like syntax:
https://twitter.com/smdiehl/status/855827759872045056
«
C syntax is magical programmer catnip. You sprinkle it on anything and it suddenly becomes "practical" and "readable".
»
The flipside: if it doesn’t have that C syntax sprinkled on it, it’s not practical. Pascal family? Old-time educational use only. Too restrictive. Lisp style? Only for weirdos. Forth style? Didn’t that die in the 1980s? FP? Only for academics and weirdos. Fortran? Didn’t that die in the 1970s? Etc. Etc.
It’s a mono-culture layer-cake, and anything from outside those layers is just silly, impractical, unreadable, too slow, too restrictive.
To look at it intersectionally, I think it ties in with a culture of masculinity and “brogrammers”. Yeah, sure, feebs and fools can’t do pointer arithmetic safely, but I can, because I’m competent and professional. And we can issue a patch later.
But that paragraph will alienate umpteen readers who’ll stop reading there and then. I dare you to continue.
Looking at the world outside of this culture, there are some whole alternate universes. I’m trying to sketch a map.
There’s the Pascal/Modula-2 family. It begat several machine architectures, processors and all: the Lilith being maybe the least obscure. It begat an OS, Oberon, inspired by Xerox PARC and the Smalltalk machines just as the Mac was (and Windows, in turn, by the Mac). It grew up, too, and became AOS, with the Bluebottle zooming windowed interface. Powerful, capable, ran whole universities.
So of course, in C world, it’s a toy.
There’s the Smalltalk family, the actual offspring of Xerox. Still around but now demoted to a programming language. Still in commercial use, has FOSS forks (Squeak, a whole OS; Pharo, running on “mainstream” OSes) which get some love. Trying to bridge into the Javascript world through Amber, and into the JVM world via Redline.
I’m still learning about this but I think it still has immense unrealised potential as a whole alternative model of OS design.
Lisp Machines are long dead, of course, but then you find that a controversial but very smart chap has basically reinvented the entire idea from scratch, tied in to privacy, encryption, crypto-currencies and so on, in the form of Urbit. So it’s not some long-dead curiosity, it’s still relevant to developments today. The language is very much alive, too. Plus there are multiple efforts to make it more accessible to people from the Algol world — Dylan, CGOL, PLOT; the readable Lisp S-Expressions project. All very very niche, all very powerful and flexible and important.
Forth evolved into colorForth, its own OS, and even begat a line of CPUs: http://www.greenarraychips.com/
So, yes, there are multiple worlds out there, within the field of general-purpose, GUI-driven, personal computers. But looking out from within the world of the C-based OS family — Windows and the Unixes, and dead soldiers from MS-DOS to OS/2 — it is the whole world. The way it does things is the only way to do things; it seems practical and sensible, because the _horrendous_ problems are only apparent if you look in from the outside, and as far as 99% of people who work with computers know, there _is_ no outside. It is the entire universe.
So it is, overall, absolutely hilarious that I can propose something really quite modest — “let’s try to imagine an OS for a machine that as standard has no auxiliary storage and so no need for an auxiliary-storage-management subsystem” — and it attracts months of abuse, harangues, ad-hom attacks, multiple repeated accusations of ignorance, foolishness and stupidity.
I talk about alternative programming language families and why they’re interesting, and all I get is “haha, you can’t even program! N00b!”
I talk about alternative OSes for mainstream hardware, and most of what I get is “haha! Idiot! If they were any good, we’d use them, but we’ve never heard of them, so they’re toys.”
I talk about profound issues and problems with the C-family ecosystem and I get “haha! Stupid non-programmer idiot! This is the only way to get real performance! Go learn Python, weenie! Come back when YOU have written an OS!”
I try to talk about how human-computer interaction and software design has not really moved on in any significant way in 20 years, and we’re stuck in a blind alley of our own devising while CPU performance improvements have stalled, and I just get “haha, newbie, you don’t understand, we have multiple cores now, map/reduce and hadoop and cloud-scale will sort it out for us!”
I talk to the retro computing fans and once I can get through the people who want to play games they loved in their teens on modern kit, I get the same tired old Amiga-v-ST boosterism, or Commodore-v-Sinclair. Try to talk to them about Lilith and Ceres, or about Helios on Transputer meshes and its relevance to Tilera’s masspar MIPS, or about the role of Modula-2 in Acorn ARX, and they look at you like you’ve grown an extra head and go back to talking about the latest FPGA Spectrum compatible.
It’s all very educational. Of course, actually, what people are really teaching me is about culture, belief systems, human nature, the dangers of monocultures, groupthink and so on. They’re not teaching me anything I don’t know about technology, by and large, because I already know about wildly more diverse computing stuff than my would-be enlighteners have ever even heard of.
That’s OK. Honestly, if I can get a series of articles and a book out of this, I’ll be delighted.
And who knows, maybe I’ll spark an idea and someone will go off and build something that will render the whole current industry irrelevant. Why not? It’s happened plenty of times before.
And every single time, all of the most knowledgeable experts said it was a pointless, silly, impractical flash-in-the-pan. Only a few nutcases saw any merit to it. And they never got rich.