[personal profile] liam_on_linux
I guess it all stems from a vague feeling of ennui about computers that has been growing in me for years.

My Spectrum was an amazing toy (and I do use the word advisedly). I played with CBM PETs and ZX-81s, but while interesting, they could not do pictures or sound, which were the things that interested me most at around twelve. The Spectrum delivered sound, pictures, and a usable BASIC (I switched to Beta BASIC quite early on) at a price well below anything else. The VIC-20 was too limited, the C64 had great hardware but a crappy BASIC, the Acorn 8-bits were vastly too expensive, and so on.

Then I got a job and could afford a used Archimedes. Simple, comprehensible OS, *really* good BASIC, wonderful graphics and sound beyond my meagre abilities to exploit, and vast CPU power. As the late gkewney@cix said of the IBM PC-AT: "my first experience of Raw Computer Power". Well, for me it was the Archimedes, and dickp@cix's review of it in Personal Computer World was a clincher.

(You can read that here and I recommend it. It's one of the few computer reviews ever to contain quotable lines: http://acorn.chriswhy.co.uk/docs/Mags/PCW/PCW_Aug87_Archimedes.pdf )

Then I went x86. Horrible Byzantine OSes, a wide choice of programming languages but nothing that delivered the simple benefits of BBC BASIC, and I quickly lost interest in programming as a result.

What followed was twenty years of supporting the things instead.

From the late 1980s to the late 1990s, there was a lovely trend: steadily improving capabilities, significant removal of bottlenecks, and regular addition of new facilities.

For me, as someone who made his living supporting Windows, this basically ended in 2000. Windows 2000 had all the multitasking power of NT /and/ all the UI refinements of Win 9x. Like Win9x, it did plug&play, but better. It did sound and hardware 3D and so on. It was fast - slower than 9x, yes, but it gave you so many more facilities. And unlike NT4, it wasn't fragile. Change the IRQ of its network card and it didn't fall in a heap - it found the new one for itself and kept going. It could even survive changing disk controllers underneath it!

And it throttled the CPU, and it could sleep and wake reliably, so you used it confident that it would come back and your work would all be there.

Then it started to go downhill. XP added ugly cosmetic bloat, bundled crapware you couldn't remove, and so on. This never reverted; it just got worse from then on.

And on the server side, whereas NT4 Server was limited in many ways, it was comprehensible and great for a one- or two-server LAN such as in a small business. It tromped all over NetWare 4, which had all the baggage of NDS, which I never wanted or needed and which got in the way with all its big-multisite-network directory crap.

Then came Windows 2000 Server, which destroyed the simplicity of NT 4 Server by adding this vast unfathomable baggage of big-multisite-network directory crap.

Now I look back from a decade later.

There's a parallel in the transition from the iPhone 3 series and the original iPad to the iPhone 4 series and the iPad 2.

The Gen 3 iPhone (and the iPad, the big version) had all these new facilities - 3rd-party apps, folders, kinda-sorta multitasking, cut&paste - lots of critical stuff missing before was now there, plus reasonably quick CPUs and decent amounts of RAM.

Then the maker, casting around for a way to improve them, added super-hi-res displays - no functional improvement, they just looked nicer, but that made the devices more expensive, more power-hungry, and slightly slower at some things.

But beneath the surface, life got much harder for programmers, who instead of supporting one screen size and one resolution now had to support multiple ones, some very hi-res, and that meant huge new artwork inside the program bundles. "Assets", they're called these days, I believe.

iPhone 4 and iPad 2 apps are many times larger than those for the previous generation, but with identical functionality - because all the buttons and backgrounds and icons have to be included in a whole array of resolutions, some very large. What was a few hundred kB of bitmaps now is tens of megabytes of them, just so it looks nice on a bigger screen. Because the OS isn't smart enough to have smart scaled graphics formats and so on, and it doesn't provide real resolution-independence, just pixel doubling and quadrupling.
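The arithmetic behind that bloat is simple enough to sketch (the icon dimensions below are just illustrative examples, not measurements of real app bundles):

```python
# Rough illustration of why pixel-doubled bitmap assets balloon in size.
# The dimensions are hypothetical examples, not real app-bundle figures.

def bitmap_bytes(width, height, bytes_per_pixel=4):
    """Uncompressed size of a 32-bit RGBA bitmap."""
    return width * height * bytes_per_pixel

base = bitmap_bytes(57, 57)        # an icon at 1x resolution
doubled = bitmap_bytes(114, 114)   # the same icon pixel-doubled to 2x

# Doubling each linear dimension quadruples the pixel count, and hence
# the uncompressed size - and the 1x asset still ships alongside the 2x one.
print(doubled / base)            # 4.0
print((base + doubled) / base)   # 5.0 - total payload vs a 1x-only bundle
```

Multiply that 4x-per-doubling factor across every button, background, and icon in a bundle, and "a few hundred kB of bitmaps" turning into tens of megabytes follows directly.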

Huge increase in size, zero increase in functionality. Now your downloads are huge, your storage gets eaten far quicker, your battery runs down faster - and what do you get in return? Nothing at all, except that it looks a bit prettier.

Well, for me, XP and Vista and 7 and 8 are more of the same. More shiny graphics, more visual effects, tens of gigs of disk eaten, but they can't do anything significant that the older versions couldn't.

It's the same kernel, now in 64-bit. Now, instead of 20 or 30 background tasks, you have 200 or 300. There are little daemons running compositing the screen in 3D, rendering thumbnails of this and previews of that and indexing the other. There are things caching the output of other things.

Stuff is generated in one format, rendered in another and passed to something else for rasterisation, which is then transformed into a texture, saved as a bitmap, passed through 3 levels of driver and composited onto the surface of a 3D object depicting a flat partly-translucent rectangle for display.

This was developed on the Mac as a way to accelerate a Display PostScript-derived rendering language. On Windows and Linux, it's just chrome: they had perfectly good display-acceleration tech already.

On the Mac, the global search tool is lightning-fast and smart. On an overloaded 300MHz RISC chip from 1999 or so, it can instantly find a word and display a list of apps named that, documents containing it and so on, in well under a second.

On Windows 8 or Ubuntu, if I type "motepad" instead of "notepad", it doesn't know what the hell I am talking about: it returns no matching results - and possibly an album I don't want from an online music store I never use.
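The typo-tolerance being asked for here is not exotic - it is a solved problem. A minimal sketch using the Python standard library's `difflib` (the app list is an invented example):

```python
# Minimal fuzzy launcher lookup: tolerate one-letter typos like "motepad".
# Standard library only; the list of app names is a made-up example.
import difflib

APPS = ["notepad", "paint", "calculator", "terminal", "file manager"]

def fuzzy_find(query, names=APPS, cutoff=0.6):
    """Return up to three close-named apps, best match first."""
    return difflib.get_close_matches(query.lower(), names, n=3, cutoff=cutoff)

print(fuzzy_find("motepad"))  # ['notepad']
```

`get_close_matches` scores candidates by a similarity ratio and drops anything below the cutoff, which is exactly the "near miss still finds it" behaviour the built-in search tools fail to deliver.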

They're not even doing /good/ copies of the original. They don't understand what they're copying, so they just imitate the surface appearance.

Result: we have hugely powerful computers, running huge and hugely-complex OSes, but at the end of the day, they don't really do anything that Windows for Workgroups didn't. Disks, directories, data files and binaries. Network client and server built in, big deal. Programs open files in proprietary formats, often over proprietary protocols using reverse-engineered clients for proprietary services.

Same old stuff, but now rendered in true-colour in hardware 3D.

It hasn't progressed. There's been no real material benefit in a decade, and in the decade before that, they just got the stuff they'd already delivered working properly.

It looks nicer now. It's a lot more stable and reliable, too. And there are some nice little functional improvements - even a crappy broken search tool is better than none. When I can get - and did get - an 8GB quad-core PC off Freecycle, I don't mind spending some megs of RAM and some CPU cycles on self-healing stuff in the background.

But I had hoped for more. I'm not talking about jetpacks and flying cars here, but in the 1980s I had powerful capable programming languages that didn't require volumes of documentation - they were designed with the express intention of being easy, even for children.

Now, they're designed to be powerful instead - with vast libraries of code for manipulating text files or building rich web apps or what-have-you.

BBC BASIC was orders of magnitude smaller and simpler, yet it was rich enough and fast enough and expressive enough that Sophie Wilson used it, from choice, to model the ISA of what is now the world's most successful CPU architecture. These were kids' tools, but they weren't toys. But the simplicity, the accessibility, has been lost.

So this led me to go digging. What happened, when and where? What went wrong?

It's easy to go back to the origins of the IBM PC, of the ARM and Acorn, of Sinclair, but there is no insight there. The early tools in this family were simple only because they were built to a price and could not afford anything other than simplicity. The inherent technical restrictions of the time compelled them to be clean and minimal. If we used enhanced Archimedes descendants today, or enhanced Sinclair boxes, they'd be no better than what we have: the really nasty bits of the PC design - conventional memory, segmented memory, register starvation - are all fading memories now.

(Indeed, fascinatingly, in the former Communist countries, the Spectrum /did/ evolve into a 16/32-bit desktop PC with ISA slots. You can see glimpses of that parallel universe. It's not pretty. Nor are the descendants of the Amiga OS and so on.)

So I dug deeper. What did Unix try to grow up into, before a Finnish genius cloned it and freed it forever from the Unix Wars - by trapping it in its own monolithic past?

That leads to Plan 9 and to microkernels and all sorts of exotic fun. Fascinating, but all blind alleys, really. Unix was a low-end, basic answer to perceived 1960s bloat - Multics and so on. It succeeded; it outlived Multics. It also resulted in the extinction of some of the cool tech that it didn't implement from Multics and so on, such as multi-ring security models.

So the Unix line is not rewarding either. Different answers from the 8-bit home-micro stuff, but only a little different. They even tried to converge, repeatedly - Cromix and OS-9 and things. Didn't work. The hardware was too limited then.

So I dug into different directions.

One was GUIs. Linux GUIs are basically knock-offs of Windows. (Got an article out of that.) Windows is a knock-off of the Mac. (That story is *way* too old and familiar, even though most Windows advocates still don't understand the real details of what Apple did.)

Apple's GUI was a knock-off of Xerox's. That's well-known too.

But *here*, here there is a story that is almost untold. The heartbreak of the Xerox engineers - that is occasionally mentioned. But *why* they were heartbroken is not.

Apple only ripped off one-third of the idea. It independently re-invented the other third later on - the hooks were left in.

It ripped off the presentation layer - the GUI. And it massively enhanced it to make it useful on a standalone desktop computer with traditional apps on a traditional single-tasking OS with a traditional filesystem. It changed the world, but it was only the cosmetic angle.

And the other third - the Mac had networking built in. Poor slow serial networking, but much later, they added in the other Xerox tech - Ethernet.

But the missing third was the OS behind the GUI. The GUI was the visible expression of a whole new way of programming, of a live system of interrelated objects. An OS with no binaries, no data files, no fixed static filesystem. It didn't even have "source code" which passed through a "compiler" to generate "object code" that then went through a "linker" to make "executables". All that 1960s guff was superseded.

And Apple's techies didn't even /notice/ that. They took how it /looked/ and built a copy on top of a 1960s-style OS.

/That's/ what broke the Xerox folks' hearts. The core, the most clever part - that's the part that got left behind and ended up a forgotten footnote.

Later, some ex-Apple guys went and tried to rebuild part of this, but on the basis of Unix and an enhanced C. The result was NeXT, and in the end theirs became the most valuable company in the history of the world. And that's based on a pale, broken shadow of the real idea, mixed in with enough legacy stuff to make it not too scary and alien.

(That's one big story. I've not really dug into that one yet - I think the parts that are known are too well-known, they eclipse the important bit.)

And after those guys had left, some other Apple guys tried to reinvent the OS and the GUI again, with a new effort that ran on a new RISC chip and dispensed with a filesystem and so on and ran in a wonderful new readable high-level language - but that got canned. A crippled shadow of the product got released, arguably the most sophisticated commercial GUI and platform the world has seen to date, and it bombed and got canned too.

The product that got out was the Newton. The language that /should/ have powered it was Dylan. Read up on Dylan and see a startling vision of a world that could have been.

OK, so that didn't spring full-formed from its creators' brows. So where did that come from?

Aha. It came from Lisp. And behind that is another story. A story of another very different kind of computing, of a different kind of programming language, of a different kind of programming altogether... One that today is almost forgotten but whose fading memory now erupts as joke "memes":

http://www.catonmat.net/blog/what-would-john-mccarthy-say-if-he-saw-you-programming/

I am not yet sure but it seems to me that maybe the Smalltalk stuff, the Xerox workstations, all that, took part of its genesis from an effort to make a more accessible, human-readable, even kid-friendly interpretation of the same ideas that the Lisp pioneers had.

Smalltalk boxes and Lisp machines seem to be almost brothers in some ways. Compared to them, there is so little difference between Ubuntu and Windows 8 that you'd need a microscope to spot it.

That is the real story here. It's the biggest I have ever even glimpsed in my field. And it is so big that when I try to tell people, try to discuss it, they don't even believe it could be real. They just say "oh well, Python does that today" or "who cares when I have Debian?"

It's like a real-life version of the elephant in the room. Everyone else thinks it's a wall of the room. Actually, they're in a cubbyhole that the elephant is leaning against. The whole world of computing is in that cubbyhole. As far as I can tell, there are about a dozen people writing or speaking today who have noticed this, in the entire world of computing from ENIAC and Turing to Windows 8.1 and Ubuntu.

I want to get people to notice that the wall has wrinkles and hairs on it and that it's breathing.