The Decade of Linux on the Desktop. You're in it.
When it happened, many Unix folks didn't consider it a _real_ Unix, even though just a few years later, and AIUI after spending a _lot_ on the exercise, Apple got the UNIX™ branding.
That is roughly when I entered the computer industry.
Dv/X was remarkable tech, and if it had shipped earlier it could have changed the course of the industry. Sadly, it came too late. Dv/X was rumoured then, but the state of the art was OS/2 1.1, released in late 1988, the first version of OS/2 with a GUI.
Dv/X was not released until about 5 years later, in 1992. That's the same year as Windows 3.1, but critically, Windows 3.0 came in 1990, 2 years earlier.
Windows 3.0 was a result of the flop of OS/2 1.x.
OS/2 1.x was a new 16-bit multitasking networking kernel -- but that meant new drivers.
MS discarded the radical new OS, it discarded networking completely (until later), and moved the multitasking into the GUI layer, allowing Win3 to run on top of the single-tasking MS-DOS kernel. That meant excellent compatibility: it ran on almost anything, and it could run almost all DOS apps, and multitask them. And thanks to a brilliant skunkworks project, mostly by one man, David Weise, assisted by Murray Sargent, it combined 3 separate products (Windows 2, Windows/286 and Windows/386) into a single product that ran on all 3 types of PC and took good advantage of all of them. I wrote about its development here: https://www.theregister.com/2025/01/18/how_windows_got_to_v3...
It also brought in some of the GUI design from OS/2 1.1, and mainly from 1.2 and 1.3 -- the Program Manager and File Manager UI, the proportional fonts, the fake-3D controls, some of the Control Panel, and so on. It kept the best user-facing parts and threw away the fancy invisible stuff underneath, which was the problematic part.
Result: smash hit, redefined the PC market, and when Dv/X arrived it was doomed: too late, same as OS/2 2.0, which came out the same year as Dv/X.
If Dv/X had come out in the late 1980s, before Windows 3, it could have changed the way the PC industry went.
Dv/X combined the good bits of DOS, 386 memory management and multitasking, Unix networking and Unix GUIs into an interesting value proposition: network your DOS PCs with Unix boxes over Unix standards, get remote access to powerful Unix apps, and if vendors wanted, it enabled ports of Unix apps to this new multitasking networked DOS.
In the '80s that could have been a contender. Soon afterwards it was followed by Linux and the BSDs, which made that Unix stuff free and ran on the same kit. That would have been a great combination -- Dv/X PCs talking to BSD or Linux servers, when those Unix boxes didn't really have useful GUIs yet.
Windows 3 offered a different deal: it combined the good bits of DOS, OS/2 1.x's GUI, and Windows 2.x into a whole that ran on anything and could run old DOS apps and new GUI apps, side by side.
Networking didn't follow until Windows for Workgroups, which followed Windows 3.1. Only businesses wanted that, so MS postponed it. Good move.

It is one of the oddest things in computing to me, as a big kid heading for 60 years old who still feels quite young and enjoys learning and exploring, that the early history of Linux – a development that came along mid-career for me – and indeed Unix, which was taking shape when I was a child, is mysterious lost ancient history now to those working in the field.
It’s not that long ago. It’s well within living memory for lots of us who are still working with it in full time employment. Want to know why this command has that weird switch? Then go look up who wrote it and ask him. (And sadly yes there’s a good chance it’s a “him”.)
Want to know why Windows command switches are one symbol and Unix ones another? Go look at the OSes the guys who wrote them ran before. They are a 2min Google away and emulators are FOSS. Just try them and you can see what they learned from.
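A concrete example (mine, not from the original comment): a wide directory listing on DOS or Windows is

DIR /W /P

with slash switches, a habit that came down from the DEC operating systems DOS's authors had used before, while the Unix equivalent is

ls -l

with dash options, the convention Bell Labs settled on. Fire up the emulators and the family resemblance to their ancestors is obvious.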
This stuff isn’t hieroglyphics. It’s not carved on the walls of tombs deep underground.
The reasons we have Snap and Flatpak and AppImage and macOS .app bundles are all things that happened since I started my first job. I was there. So were thousands of others. I watched it take shape.
But now, I write about how and why and I get shouted at by people who weren’t even born yet. It’s very odd.
To me it looks like a lot of people spend thousands of developer-hours flailing away trying to rewrite stuff that I deployed in production in my 30s and they have no idea how it’s supposed to work or what they’re trying to do. They’re failing to copy a bad copy of a poor imitation.
Want to know how KDE 6 should have been? Run Windows 95 in VirtualBox and see how the original worked! But no, instead, the team flops and flails adding 86 more wheels to a bicycle and then they wonder why people choose a poor-quality knock-off of a 2007 iPhone designed by people who don’t know why the iPhone works like that.
I am, for clarity, talking about GNOME >3. And the iPhone runs a cut-down version of Mac OS X Tiger's "Dashboard" as its main UI.

The personal histories involved are highly relevant, and they are one of the things that get forgotten in boring grey corporate histories.
Bill Gates didn't get lucky: he got a leg up from mum & dad, and was nasty and rapacious and fast, and clawed his way to industry dominance. On the way he climbed over Gary Kildall of Digital Research and largely obliterated DR.
Ray Noorda of Novell was the big boss of the flourishing Mormon software industry of Utah. (Another big Utah company was WordPerfect.)
Several of them were in the Canopy Group:
https://en.wikipedia.org/wiki/Canopy_Group
Ray Noorda owned the whole lot, via NFT Ventures Inc., which stood for "Noorda Family Trust".
https://en.wikipedia.org/wiki/Ray_Noorda
Caldera acquired the Unix business from SCO, as my current employers reported a quarter of a century ago:
https://www.theregister.com/2000/08/02/caldera_goes_unix_with_sco/
Noorda managed to surf Gates's and Microsoft's wave. Novell made servers, with their own proprietary OS, and workstations, with their own OS, and the network. As Microsoft s/w on IBM-compatible PCs became dominant, Novell strategically killed off its workstations first and pivoted to cards for PCs and clients for DOS. Then it ported its server OS to PC servers, and killed its server hardware. Then it was strong and secure and safe for a while, growing fat on the booming PC business.
But Noorda knew damned well that Gates resented anyone else making good money off DOS systems. In the late 1980s, when DR no longer mattered, MS screwed IBM because IBM fumbled OS/2. MS got lucky with Windows 3.
MS helped screw DEC, headhunted DEC's head OS man Dave Cutler and his core team, and gave him the leftovers of the IBM divorce: "Portable OS/2", the CPU-independent version. Cutler turned Portable OS/2 into what he had planned to turn DEC's VMS into: a cross-platform Unix killer. It ended up being renamed "OS/2 NT" and then "Windows NT".
Noorda knew it was just a matter of time 'til MS had a Netware-killer. He was right. So, he figured 2 things would help Novell adapt: embrace the TCP/IP network standard, and Unix.
And Novell had cash.
So, Novell bought Unix and did a slightly Netwarified Unix: UnixWare.
He also spied that the free Unix clone Linux would be big and he spun off a side-business to make a Linux-based Windows killer, codenamed "Corsair" -- a fast-moving pirate ship.
Corsair became Caldera and Caldera OpenLinux. The early version was expensive and had a proprietary desktop, but it also had a licensed version of Sun's WABI. Before WINE worked, Caldera OpenLinux could run Windows apps.
Caldera also bought the rump of DR so it also had a good solid DOS as well: DR-DOS.
Then Caldera was the first corporate Linux distro to adopt the new FOSS desktop, KDE. I got a copy of Caldera OpenLinux with KDE from them. Without a commercial desktop it was both cheaper and better than the earlier version. WABI couldn't run much, but it could run the core apps of MS Office, which was what mattered.
So, low end workstation, Novell DOS; high end workstation, Caldera OpenLinux (able to connect to Novell servers, and run DOS and Windows apps); legacy servers, Netware; new open-standards app servers, UnixWare.
At every level of the MS stack, Novell had an alternative: server, network protocol, network client/server, low-end workstation, high-end workstation.
Well, it didn't work out. Commercial Unix was dying; UnixWare flopped; Linux was killing it. So Caldera snapped up the dying PC Unix vendor, SCO, and renamed itself "SCO Group". Now that its corporate ally, the also-Noorda-owned-and-backed Novell, owned the Unix source code, SCO Group tried to kill Linux by trying to show that it was based on stolen Unix code, and later, when that failed, that it contained stolen Unix code.
Caldera decided DOS wasn't worth having and open-sourced it. (I have a physical copy from them.) Lots of people were interested. It realised DOS was still worth money, reversed course, and made the next version non-FOSS again. It also offered me a job. I said no. I like drinking beer. Utah is dry.
The whole sorry saga of the SCO Group and the Unix lawsuits was because Ray Noorda wanted to outdo Bill Gates.
Sadly Noorda got Alzheimer's. The managers who took over tried to back away, but bits of Noorda's extended empire started attacking things which other bits had been trying to exploit. It also shows the danger and power of names.
Now the vague recollection in the industry seems to be "SCO was bad".
No: SCO were good guys and SCO Xenix was great. It wasn't even x86-only: an early version ran on the Apple Lisa, alongside 2 others.
The SCO Group went evil. SCO was fine. SCO != SCO Group.
Caldera was an attempt to bring Linux up to a level where it could compete with Windows, and it was a good product. It was the first desktop Linux I ran as my main desktop OS for a while.
Only one company both owned and sold a UNIX™ and had invested heavily in Linux and had the money to fight the SCO Group: IBM.
IBM set its lawyers on the SCO Group lawsuit and it collapsed.
Xinuos salvaged the tiny residual revenues to be had from the SCO and Novell UnixWare product lines.
Who owns the Unix source code? Micro Focus, because it owns Novell.
Who sells actual Unix? Xinuos.
Who owns the trademark? The Open Group. "POSIX" (a name coined by Richard Stallman) became UNIX™.
Who owns Bell Labs? AT&T spin-off Lucent, later bought by Alcatel, later bought by Nokia.
Was Linux stolen? No.
Does anyone care now? No.
Did anyone ever care? No, only Ray Noorda, with a determined attempt to out-Microsoft Microsoft, which failed.

Haiku is a recreation of a late-1990s OS. News for you: in the 1990s and before then, computers didn't do power management.
The US government had to institute a whole big programme to get companies to add power management.
https://en.wikipedia.org/wiki/Energy_Star
Aggressive power management is only a thing because silicon vendors lie to their customers. Yes, seriously.
From the mid-1970s for about 30 years, adding more transistors meant computers got faster. CPUs went from 4-bit to 8-bit to 16-bit to 32-bit, then there was a pause while they gained onboard memory management (Intel 80386/Motorola 68030 generation) then scalar execution and onboard hardware floating point (80486/68040 generation), then onboard L1 cache (Pentium), then superscalar execution and near-board L2 cache (Pentium II), then onboard L2 (Pentium III), then they ran out of ideas to spend CPU transistors on, so the transistor budget went on RAM instead, meaning we needed 64-bit CPUs to track it.
The Pentium 4 was an attempt to crank this as high as it would go by running as fast as possible and accepting a low IPC (instructions per clock). It was nicknamed the fanheater. So Intel US pivoted to Intel Israel's low-power laptop chip with aggressive power management. Voilà, the Core and then Core 2 series.
Then, circa 2006-2007, big problem. 64-bit chips had loads of cache on board, they were superscalar, decomposing x86 instructions into micro ops, resequencing them for optimal execution with branch prediction, they had media and 3D extensions like MMX2, SSE, SSE2, they were 64-bit with lots of RAM, and there was nowhere to spend the increasing transistor budget.
Result, multicore. Duplicate everything. Tell the punters it's twice as fast. It isn't. Very few things are parallel.
With an SMP-aware OS, like NT or BeOS or Haiku, 2 cores make things a bit more responsive but no faster.
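To put a rough number on that (my illustration, not the original author's): Amdahl's law puts the best-case speedup from n cores at 1 / ((1 - p) + p/n), where p is the fraction of the work that can actually run in parallel. If p is 0.5, two cores give 1 / (0.5 + 0.25), about 1.33x, and even an infinite number of cores tops out at 2x. Hence: a bit more responsive, not twice as fast.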
Then came 3 and 4 cores, and onboard GPUs, and then heterogenous cores, with "efficiency" and "performance" cores... but none of this makes your software run faster. It's marketing.
You can't run all the components of a modern CPU at once. It would burn itself out in seconds. Most of the chip is turned off most of the time, and there's an onboard management core running its own OS, invisible to user code, to handle this.
Silicon vendors are selling us stuff we can't use. If you turned it all on at once, instant self-destruction. We spend money on transistors that must spend 99% of the time turned off. It's called "dark silicon" and it's what we pay for.
In real life, chips stopped getting Moore's Law speed increases 20 years ago. That's when we stopped getting twice the performance every 18 months.
All the aggressive power management and sleep modes are to help inadequate cooling systems stop CPUs instantly incinerating themselves. Hibernation is to disguise how slowly multi-gigabyte OSes boot. You can't see the slow boot if it doesn't boot so often.
For 20 years the CPU and GPU vendors have been selling us transistors we can't use. Power management is the excuse.
Update your firmware early and often. Get a nice fast SSD. Shut it down when you're not using it: it reboots fast.
Enjoy a fast responsive OS that doesn't try to play the Win/Lin/Mac game of "write more code to use the fancy accelerators and hope things go faster".
It had a CF card slot, so you could even remove your storage card and insert a CF Wifi card instead, and have mobile Internet in your pocket, 20 years ago!
But if you did, you got a free extra with a wifi adaptor – a battery life of about 15-20 minutes.
It was clever, but totally useless. With the wifi card in, you couldn’t have external storage any more, so there was very little room left.
I had to check: https://uk.pcmag.com/first-looks/30821/sharp-zaurus-sl-5500
64MB RAM, 16MB flash, and a 320x240 screen. Or rather 240x320 as it was portrait.
The sheer amount of thought and planning that went into the Linux-based Zaurus was shown by the fact that the tiny physical keyboard had no pipe symbol. Bit of a snag on an xNix machine, that.
Both mine were 2nd hand, given to me by techie mates who’d played with them and got bored and moved on. I'm told others got better battery life on Wifi. Maybe their tiny batteries were already on the way out or something.
Fun side-note #1: I do not remember the battery pack looking like this one, though. I feel sure I would have noticed.
https://www.amazon.co.uk/Battery-Zaurus-SL-5500-900mAh-Li-ion/dp/B007K0DRIU
Fun side-note #2: both came with Sharp’s original OS, version 1.0. I had an interesting time experimenting with alternative OS builds, new ROMs etc. Things did get a lot better, or at least less bad, after the first release. But the friend who gave me my first unit swore up and down that he’d updated the ROM. I can’t see any possible mechanism for flash memory to just revert to earlier contents on its own, though.
With replacement OS images you had to decide how to partition the device’s tiny amount of storage: some as read-only for the OS, some as read-write, some as swap, etc. The allocations were fixed and if you got it wrong you had to nuke and reload.
This would have been much easier if the device had some form of logical volume management, and dynamically-changeable volume sizes.
Which is a thought I also had repeatedly around 2023-2024 when experimenting with OpenBSD. It uses an exceptionally complex partitioning layout, and if you forcibly simplify it, you (1) run up against the limitations of its horribly primitive partitioning tool and (2) reduce the OS’s security.
I have got just barely competent enough with OpenBSD that between writing this in early 2022 and writing this in late 2024, two and a half years later, I went from “struggling mightily just to get it running at all in a VM” to “able with only some whimpering and cursing to get it dual-booting on bare metal with XP64, NetBSD, and 2 Linux distros.”
But it’s still a horrible horrible experience and some form of LVM would make matters massively easier.
Which is odd because I avoid Linux LVM as much as possible. I find it a massive pain when you don’t need it. However, you need it for Linux full-disk encryption, and one previous employer of mine insisted upon that.
In other words: I really dislike LVM, and I am annoyed by Linux gratuitously insisting on it in situations where it should not strictly speaking be needed – but in other OSes and other situations, I have really wanted it, but it wasn’t available.
A very brief rundown:
1. If you are using Microsoft tools, you need to load the 386 memory manager, EMM386.EXE, in your CONFIG.SYS file.
2. But, to do that, you need to load the XMS manager, HIMEM.SYS, first.
3. So your CONFIG.SYS should begin with the lines:
DEVICE=C:\WINDOWS\HIMEM.SYS
DEVICE=C:\WINDOWS\EMM386.EXE
DOS=HIGH,UMB
4. That's the easy bit. Now you have to find free Upper Memory Blocks to tell EMM386 to use.
5. Do a clean boot with F5 or F8 -- telling it not to process CONFIG.SYS or run AUTOEXEC.BAT. Alternatively, boot from a DOS floppy that doesn't have them.
6. Run the Microsoft Diagnostics, MSD.EXE, or a similar tool such as Quarterdeck Manifest. Look at the memory usage between 640kB and 1MB. Note: the numbers are in hexadecimal.
7. Look for unused blocks that are not ROM or I/O. Write down the address ranges.
8. An example: if you do not use monochrome VGA you can use the mono VGA memory area: 0xB000-0xB7FF.
9. One by one, tell EMM386 to use these. First choose if you want EMS (Expanded Memory Services) or not. It is useful for DOS apps, but not for Windows apps.
10. If you do, you need to tell it:
DEVICE=C:\WINDOWS\EMM386.EXE RAM
And set aside 64kB for a page frame, for example by putting this on the end of the line:
FRAME=E000
Or, tell it not to use one:
FRAME=none
11. Or disable EMS:
DEVICE=C:\WINDOWS\EMM386.EXE NOEMS
12. Important: add these parameters one at a time, and reboot and test every single time, without exception.
13. Once you've told it which you want, you now need to tell it the RAM blocks to use, e.g.
DEVICE=C:\WINDOWS\EMM386.EXE RAM FRAME=none I=B000-B7FF
Again, reboot every time to check. Any single letter wrong can stop the PC booting. Lots of testing is vital. Every time, run MSD and look at what is in use or is not in use. Make lots of notes, on paper.
14. If you find EMM386 is trying to use a block that it mustn't, you can eXclude it:
DEVICE=C:\WINDOWS\EMM386.EXE RAM X=B000-B7FF
The more blocks you can add, the better.
15. After this -- a few hours' work -- now you can try to populate your new UMBs.
16. Device drivers: do this by prefixing lines in CONFIG.SYS with DEVICEHIGH instead of DEVICE.
Change:
DEVICE=C:\DOS\ANSI.SYS
To:
DEVICEHIGH=C:\DOS\ANSI.SYS
17. Try every driver, one by one, rebooting every time.
18. Now move on to loadable Terminate and Stay Resident (TSR) programs. Prefix lines that run a program in AUTOEXEC.BAT with LH, which is short for LOADHIGH.
Replace:
MOUSE
With:
LH MOUSE
Use MSD and the MEM command -- MEM /c /p -- to identify all your TSRs, note their sizes, and load them all high.
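To show roughly where you should end up -- an illustrative sketch only, since the I= and X= ranges, paths, drivers and TSRs must come from your own machine and your own MSD survey, not from this example -- a finished CONFIG.SYS might read:

DEVICE=C:\WINDOWS\HIMEM.SYS
DEVICE=C:\WINDOWS\EMM386.EXE RAM FRAME=none I=B000-B7FF
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\ANSI.SYS

...and the matching AUTOEXEC.BAT lines:

LH MOUSE
LH C:\DOS\DOSKEY.COM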
This is a day or two's work for a novice. I could do it in only an hour or two and typically get 625kB or more base memory free, and I made good money from this hard-won skill.
https://github.com/lproven/usb-dos
This is very rushed and the instructions are incomplete. Only FAT16 for now; FAT32 coming real soon now.
But there's still a lot of power in that festering ball of 1980s code.
In 6 weeks in 2016, I drafted, wrote, illustrated, laid out and submitted a ~330 page technical maintenance manual for a 3D printer, solo, entirely in MS Word from start to finish. I began in Word 97 & finished it in Word 2003, 95% of the time running under WINE on Linux... and 90% of the time, using it in Outline Mode, which is a *vastly* powerful writer's tool to which the FOSS world has nothing even vaguely comparable.
But as a novice... Yeah, what the tweet said. It's a timeless classic IMHO.
Some Emacs folks told me Org-mode is just as good as an outliner. I've tried it. This was my response.

Org-mode compared to Word 2003 Outline View is roughly MS-DOS Edlin compared to Emacs. It's a tiny fragmentary partial implementation of 1% of the functionality, done badly, with a terrible *terrible* UI.
No exaggeration, no hyperbole, and there's a reason I specifically said 2003 and nothing later.
I've been building and running xNix boxes since 1988. I have often tried both Vi and Emacs over nearly 4 decades. I am unusual in terms of old Unix hands: I cordially detest both of them.
The reason I cite Word 2003 is that that's the last version with the old menu and toolbar UI. Everything later has a "ribbon" and I find it unusable.
Today, the web-app/Android/iOS versions of Word do not have Outline View, no. Only the rich local app versions do.
But no, org-mode is not a better richer alternative; it is vastly inferior, to the point of being almost a parody.
It's really not. I tried it, and I found it a slightly sad crippled little thing that might be OK for managing my to-do list.
Hidden behind Emacs' *awful* 1970s UI which I would personally burn in a fire rather than ever use.
So, no, I don't think it's a very useful or capable outliner from what I have seen. Logseq has a better one.
To extend my earlier comparison:
Org-mode to Word's Outline View is Edlin to Emacs.
Logseq to Outline View is MS-DOS 5 EDIT to Emacs: it's a capable full-screen text editor that I know and like and which works fine. It's not very powerful but what it does, it does fine.
Is Org-mode aimed at something else? Maybe, yes. I don't know who or what it's aimed at, so I can't really say.
Word Outline Mode is the last surviving 1980s outliner, an entire category of app that's disappeared.
http://outliners.com/default.html
It's a good one but it was once one among many. It is, for me, *THE* killer feature of MS Word, and the only thing I keep WINE on my computers for.
It's a prose writer's tool, for writing long-form documents in a human language.
Emacs is a programmer's editor for writing program code in programming languages.
So, no, they are not the same thing, but the superficial similarity confuses people.
I must pick a fairly small example as I'm not very familiar with Emacs.
In Outline Mode, a paragraph's level in the hierarchy is tied with its paragraph style. Most people don't know how to use Word's style sheets, but think of HTML. Word has 9 heading levels, like H1...H9 on the Web, plus Body Text, which is always the lowest level.
As you promote or demote a paragraph, its style automatically changes to match.
(This has the side effect that you can see the level from the style. If that bothered you, in old versions you could turn off showing the formatting.)
As you move a block of hierarchical text around the outline all its levels automatically adopt the correct styles for their current location.
This means that when I wrote a manual in it, I did *no formatting by hand* at all. The text of the entire document is *automatically* formatted according to whether it's a chapter heading, or section, or subsection, or subsubsection, etc.
When you're done Word can automatically generate a table of contents, or an index, or both, that picks up all those section headings. Both assign page numbers "live", so if you move, add or delete any section, the ToC and index update immediately with the new positions and page numbers.
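A rough sketch of how that hangs together (my example labels, not Word's own notation):

Heading 1 -- Chapter
    Heading 2 -- Section
        Heading 3 -- Subsection
            Body Text -- the actual prose

Promote that subsection one level and it becomes a Heading 2 section, anything nested under it moves up with it, the styles change to match, and the ToC picks up the new structure automatically.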
I say a small example as most professional writers don't deal with the formatting at all. That's the job of someone else in a different department.
Or, in technical writing, this is the job of some program. It's the sort of thing that Linux folks get very excited about LaTeX and LyX for, or for which documentarians praise DocBook or DITA, but I've used both of those and they need a *vast* amount of manual labour -- and *very* complex tooling.
XML etc are also *extremely* fragile. One punctuation mark in the wrong place and 50 pages of formatting is broken or goes haywire. I've spent days troubleshooting one misplaced `:`. It's horrible.
Word can do all this automatically, and most people *don't even know the function is there.* It's like driving an articulated lorry as a personal car and never noticing that it can carry 40 tonnes of cargo! Worse still, people attach a trailer and roofrack and load them up with stuff... *because they don't know their vehicle can carry 10 cars already* as a built in feature.
I could take a sub sub section of a chapter and promote it to a chapter in its own right, and adjust the formatting of 100 pages, in about 6 or 8 keystrokes. That will also rebuild the index and redo the table of contents, automatically, for me.
All this can be entirely keyboard driven, or entirely mouse driven, according to the user's preference. Or any mixture of both, of course. I'm a keyboard warrior myself. I can live entirely without a pointing device and it barely slows me down.
You can with a couple of clicks collapse the whole book to just chapter headings, or just those and subheadings, or just all the headings and no body text... Any of 9 levels, as you choose. You can hide all the lower levels, restructure the whole thing, and then show them again. You can adjust formatting by adjusting indents in the overview, and then expand it again to see what happened and if it's what you want.
You could go crazy... zoom out to the top level, add a few new headings, indent under the new headings, and suddenly in a few clicks, your 1 big book is now 2 or 3 or 4 smaller books, each with its own set of chapters, headings, sub headings, sub sub headings etc. Each can have its own table of contents and index, all automatically generated and updated and formatted.
I'm an xNix guy, mainly. I try to avoid Windows as much as possible, but the early years of my career were supporting DOS and then Windows. There is good stuff there, and credit where it's due.
(MS Office on macOS also does this, but the keyboard UI is much clunkier.)
Outliners were just an everyday tool once. MS just built a good one into Word, way back in the DOS era. Word for DOS can do all this stuff too and it did it in like 200kB of RAM in 1988!
Integrating it into a word processor makes sense, but they were standalone apps.
It's not radical tech. This is really old, basic stuff. But somehow in the switch to GUIs on the PC, they got lost in the transition.
And no, LibreOffice, AbiWord and Calligra Words have nothing even resembling this.
There are 2 types of outliner: intrinsic and extrinsic, also known as 1-pane or 2-pane.
https://en.wikipedia.org/wiki/Outliner#Layout
There are multiple 2-pane outliners that are FOSS.
But they are tools for organising info, and are almost totally useless for writers.
There are almost no intrinsic outliners in the FOSS world. I've been looking for years. The only one I know is Logseq, but it is just for note-taking and it does none of the formatting/indexing/ToC stuff I mentioned. It does handle Markdown, but with zero integration with the outline structure.
So it's like going from Emacs to Notepad. All the clever stuff is gone, but you can still edit plain text.
This is Chris's "Some thoughts on Computers" – the final, edited form.
The basic design of computers hasn't changed much since the mechanical one, the Difference Engine, invented by Charles Babbage in 1822 – but not built until 1991.
Ada Lovelace was the mathematical genius who saw the value in Babbage’s work, but it was Alan Turing who invented computer science, and the ENIAC in 1945 was arguably the first electronic general-purpose digital computer. It filled a room. The Micral N was the world's first “personal computer,” in 1973.
Since then, the basic design has changed little, other than to become smaller, faster, and on occasions, less useful.
The current trend to lighter, smaller gadget-style toys – like cell phones, watches, headsets of various types, and other consumer toys – is an indication that the industry has fallen into the clutches of mainstream profiteering, with very little real innovation now at all.
I was recently looking for a new computer for my wife and headed into one of the main laptop suppliers, only to be met with row upon row of identical machines, at various price points arrived at by that mysterious breed known as "marketers". In fact, the only difference in the plastic on display was how much drive space the engineers had fitted in, and how much RAM they had. Was the case a pretty colour that appealed to the latest 10-year-old girl, or rugged for the he-man who was hoping to make the school whatever team? In other words, rows of blah.
Where was the excitement of the early Radio Shack "do-it-yourself" range: the Sinclair ZX80, the Commodore 8-bits (PET and VIC-20), which ran the CP/M operating system (one of my favorites), later followed by the C64? What has happened to all the excitement and innovation? My answer is simple: the great big clobbering machine known as "Big Tech".
Intel released its first 8080 processor in 1972 and later followed up with variations on a theme, eventually leading to the 80286, the 80386, the 80486 (getting useful), and so on. All of these variations needed an operating system, which basically was a variation of MS-DOS, believed to have been based on QDOS, the "Quick and Dirty Operating System", the work of developer Tim Paterson at a company called Seattle Computer Products (SCP). It was later renamed 86-DOS, after the Intel 8086 processor, and this was the version that Microsoft licensed and eventually purchased. Or, alternatively, there is the newer FOSS FreeDOS.
Games started to appear, and some of them were quite good. But the main driver of the computer was software.
In particular, word-processors and spreadsheets.
At the time, my lost computer soul had found a niche in CP/M, which on looking back was a lovely little operating system – but quietly disappeared into the badlands of marketing.
Lost and lonely I wandered the computerverse until I hooked up with Sanyo – itself now long gone the way of the velociraptor and other lost prehistoric species.
The Sanyo brought build quality, the so-called "lotus card" to make it fully compatible with the IBM PC, and later, an RGB colour monitor and a 10 meg hard drive. The basic model was still two 5¼" floppy drives, which they pushed up to 720kB, and later the 3½" 1.25MB floppy drives. Ahead of its time, it too went the way of the dinosaur.
These led to the Sanyo AT-286, which became a mainstay, along with the Commodore 64. A pharmaceutical company had developed a software system for pharmacies that included stock control, ordering, and sales systems. I vaguely remember that machine and software bundle was about NZ$15,000, which was far too rich for most, although I sold many of them over my time.
Then the computer landscape began to level out, as the component manufacturers began to settle on the IBM PC-AT as a compatible, open-market model of computer that met the Intel and DOS standards. Thus, the gradual slide into 10000 versions of mediocrity.
The consumer demand was for bigger and more powerful machines, whereas the industry wanted to make more profits. A conflict to which the basic computer scientists hardly seemed to give a thought.
I was reminded of Carl Jung's dictum that “greed would destroy the West.”
A thousand firms sprang up, all selling the same little boxes, whilst the marketing voices kept trumpeting the bigger/better/greater theme… and the costs kept coming down, as businesses became able to afford these machines, and head offices began to control their outlying branches through the mighty computer.
I headed overseas, to escape the bedlam, and found a spot in New Guinea – only to be overrun by a mainframe which was to be administered from Australia, and was going to run my branch – for which I was responsible, but without having any control.
Which side of the fence was I going to land on? The question was soon answered by the Tropical Diseases Institute in Darwin, which diagnosed dengue fever… and so I returned to NZ.
For months I battled this recurring malady, until I was strong enough to attend a few hardware and programming courses at the local Polytechnic, eventually setting up my own small computer business, building up 386 machines for resale, followed by 486s, and eventually a Texas Instruments laptop agency. That was about 1992, from my now fragile memory. I also dabbled with the Kaypro as a personal beast, and it was fun, but not as flexible as the Sanyo AT I was using.
The Texas Instruments laptop ran well enough, and I remember playing Doom on it, but it had little battery life, and although rechargeable, it needed to be charged every two or three hours. At least the WiFi worked pretty consistently, and for the road warrior, gave a point of distinction.
Then the famous 686 arrived, and by the use of various technologies, RAM began to climb up to 256MB, and in some machines 512MB.
Was innovation happening? No – just more marketing changes. As in, some machines came bundled with software, printers or other peripherals, such as modems, scanners, or even dot matrix printers.
As we ended the 20th century, we bought bigger and more powerful machines. The desktop was being chased by the laptop, until I stood in my favorite computer wholesaler staring at a long row of shiny boxes that were basically all the same, wondering which one my wife would like… knowing that it would have to connect to the so-called "internet", and in doing so, make all sorts of decisions inevitable, such as how to secure a basically insecure system, which would require third-party programs of dubious quality and cost.
Eventually I chose a smaller Asus, with 16GB of main RAM and an NVIDIA card, and retreating to my cottage, collapsed in despair. Fifty years of computing and wasted innovation left her with a black box that, when she opened it, said “HELLO” against a big blue background that promised the world – but only offered more of the same. As in, a constant trickle of hackers, viruses, Trojans and barely anything useful – but now including several new perversions called chatbots, or “AI”.
I retired to my room in defeat.
We have had incremental developments, until we have today's latest chips from Intel and AMD based on the 64-bit architecture first introduced around April 2003.
So where is the 128-bit architecture – or the 256 or the 512-bit?
What would happen if we got really innovative? I still remember Bill Gates saying "Nobody will ever need more than 640k of RAM." And yet, it is common now to buy machines with 8 or 16 or 32GB of RAM, because the poor quality of operating systems fills the memory with badly coded garbage that causes memory leaks, stack-overflow errors and other memory issues.
Then there is Unix, which I started using at my courses at Christchurch Polytechnic – on a DEC 10, from memory – and which also introduced me to the famous, or infamous, BOFH.
I spent many happy hours chuckling over the BOFH’s exploits. Then came awareness of the twin geniuses, Richard Stallman and Linus Torvalds, and of GNU/Linux: a solid, basic series of operating systems, and programs by various vendors, that simply do what they are asked, and do it well.
I wonder where all this could head, if computer manufacturers climbed onboard and developed, for example, a laptop with an HDMI screen, a rugged case with a removable battery, a decent sound system, and a good-quality keyboard, backlit with per-key colour selection. Enough RAM slots to boost the main memory up to say 256GB, and video RAM to 64GB, allowing high-speed draws to the screen output.
Throw away the useless touchpads, and gimmicks like second mini screens built into the chassis. With the advent of Bluetooth mice, they are no longer needed. Instead, include an 8TB NVMe drive, then include a decent set of controllable fans and heat pipes that actually keep the internal temperatures down, so as to not stress the RAM and processors.
I am sure this could be done, given that some manufacturers, such as Tuxedo, are already showing some innovation in this area.
Will it happen? I doubt it. The clobbering machine will strike again.
- - - - -
Having found that I could not purchase a suitable machine for my needs, I wandered throughout the computerverse until I discovered, in a friend's small computer business, an Asus ROG Windows 7 model, in about 2004. It was able to have a RAM upgrade, which I duly carried out, with 2 × 8GB SO-DIMMs plus 4GB of SDDR2 video RAM, and 2 × 500GB WD 7200RPM spinning-rust hard drives. This was beginning to look more like a computer. Over the time I used it, I was able to replace the spinning-rust drives with 500GB Samsung SSDs, and as larger sticks of RAM became available, increased that to the limit as well. I ran that machine, which was Linux-compatible, throwing away the BSOD [Blue Screen Of Death – Ed.] of Microsoft Windows, and putting one of the earliest versions of Ubuntu with GNOME on it. It was computing heaven: everything just worked, and I dragged that poor beast around the world with me.
While in San Diego, I attended Scripps University and lectured on cot death for three months as a guest lecturer.
Scripps at the time was involved with IBM in developing a line-of-sight optical network, which worked brilliantly on campus. It was confined to a couple of experimental computer labs, but you had to keep your fingers off the mouse or keyboard, or your machine would overload with web pages if browsing. I believe it never made it into the world of computers for ordinary users, as the machines of the day could not keep up.
There was also talk around the labs of so-called quantum computing, which had been talked about since the 1960s on and off, but some developments appeared in 1968.
The whole idea sounds great – if it could be made to work at a practicable user level. But in the back of my mind, I had a suspicion that these ideas would just hinder investment and development of what was now a standard of motherboards and BIOS-based systems. Meanwhile, my Tux machine just did what was asked of it.
Thank you, Ian and Debra Murdock, who developed the Debian version of Linux – on which Ubuntu was based.
I dragged that poor Asus around the Americas, both North and South, refurbishing it as I went. I found Fry's, the major technology shop in San Diego, where I could purchase portable hard drives and so on at a fraction of the cost of elsewhere in the world, as well as just about any computer peripheral you could dream of. This shop was a tech's heaven, so to speak, and totally addictive to someone like me.
Eventually, I arrived in Canada, where I had a speaking engagement at Calgary University – which also had a strong Tux club – and I spent some time happily looking at a few other distros. Distrowatch had been founded about 2001, which made it easy to keep up with Linux news, new versions of Tux, and what system they were based on. Gentoo seemed to be the distro for those with the knowledge to compile and tweak every little aspect of their software.
Arch attracted me at times. But eventually, I always went back to Ubuntu – until I learned of Ubuntu MATE. The University had a pre-release copy of Ubuntu MATE 14.10, along with a podcast from Alan Pope and Martin Wimpress, and before I could turn around I had it on my Asus. It was simple, everything worked, and it removed the horrors of GNOME 3.
I flew happily back to New Zealand and my little country cottage.
Late in 2015, my wife became very unwell after a shopping trip. Getting in touch with some medical friends, they were concerned she’d had a heart attack. This was near the mark: she had contracted a virus which had destroyed a third of her heart muscle. It took her a few years to die, and a miserable time it was for her and for us both. After the funeral, I had rented out my house and bought a Toyota motor home, and I began traveling around the country. I ran my Asus through a solar panel hooked up to an inverter, a system which worked well and kept the beast going.
After a couple of years, I decided to have a look around Australia. My grandfather on my father's side was Australian, and had fascinated us with tales of the outback, where he worked as a drover in the 1930s and ’40s.
And so, I moved to Perth, where my brother had been living since the 1950s.
There, I discovered an amazing thing: a configurable laptop based on a Clevo motherboard – and not only that, the factory of manufacturers Metabox was just up the road in Fremantle.
Hastily, I logged on to their website, and in a state of disbelief, browsed happily for hours at all the combinations I could put together. These were all variations on a theme by Windows 7 (to misquote Paganini), and there was no listing of ACPI records or other BIOS information with which to help make a decision.
I looked at my battered old faithful, my many-times-rebuilt Asus, and decided the time had come. I started building. Maximum RAM and video RAM, latest NVIDIA card, two SSDs, their top-of-the-line WiFi and Bluetooth chipsets, sound cards, etc. Then, as my time in Perth was at an end, I got it sent to New Zealand, as I was due to fly back the next day.
That was the first of four Metabox machines I have built, and is still running flawlessly using Ubuntu MATE. I gave it to a friend some years ago and he is delighted with it still.
I had decided to go to the Philippines and South East Asia to help set up clinics for distressed children, something I had already done in South America, and the NZ winter was fast approaching. Hastily I arranged with a church group in North Luzon to be met at Manila airport. I had already contacted an interpreter who was fluent in Bisaya and Tagalog, and was an English teacher, so we arranged to meet at the airport and go on from there.
Packing my trusty Metabox, I flew out of Christchurch into a brand new world.
The so-called job soon showed up as a scam, and after spending a week or so in Manila, I suggested that rather than waste the visa, we have a look over some of the country. Dimp pointed out her home was on the next island over and would make a good base to move from.
So we ended up in Cagayan de Oro – the city of the river of gold! After some months of traveling around, we decided to get married, and so I began the process of getting a visa for Dimp to live in NZ. This was a very difficult process, but with the help of a brilliant immigration lawyer, and many friends, we managed it, and next year Dimp becomes a NZ citizen.
My next Metabox was described as a Windows 10 machine, but I knew that it would run Linux beautifully – and so it did. A few tweaks around the ACPI subsystem and it computed away merrily, with not a BSOD in sight. A friend of mine who had popped in for a visit was so impressed with it that he ordered one too, and that arrived about three months later. After a quick wipe of the hard drive (thank you, GParted!), both these machines are still running happily, with not a cloud on the horizon.
One, I gave to my stepson about three months back: a Win 10 machine, and he has taken it back with him to the Philippines, where he reports it is running fine in the tropical heat.
My new Metabox arrived about six weeks ago, and I decided – just out of curiosity – to leave Windows 11 on it. A most stupid decision, but as my wife was running Windows 11 and had already blown it up once, needing a full reset (which, to my surprise, worked), I proceeded to charge it for the recommended 24 hours, and next day, switched it on. “Hello” it said, in big white letters, and then the nonsense began… a torrent of unwanted software proceeded to fill up one of my 8TB NVMe drives, culminating after many reboots with a Chatbot, an AI “assistant”, and something called “Co-pilot”.
“No!” I cried, “not in a million years!” – and hastily plugging in my Ventoy stick, I rebooted it into Gparted, and partitioned my hard drive as ext4 for Ubuntu MATE.
So far, the beast seems most appreciative, and it hums along with just a gentle puff of warm air out of the ports. I needed to do a little tweaking, as the latest NVIDIA cards don’t seem to like Wayland as a graphics server, plus the addition of acpi=off to GRUB, and another flawless computer is on the road.
Now, if only I could persuade Metabox to move to a 128-bit system, and can get delivery of that on the other side of the great divide, my future will be in computer heaven.
Oh, if you’re wondering what happened to the Asus? It is still on the kitchen table in our house in the Philippines, in pieces, where I have no doubt it is waiting for another rebuild! Maybe my stepson Bimbo will do it and give it to his niece. Old computers never die, they just get recycled.
— Chris Thomas
In Requiem
03/05/1942 — 02/10/2024
This is the second, and I very much fear the last, part of my friend Chris "da Kiwi" Thomas' recollections about PCs, Linux, and more. I shared the first part a few days ago.
Having found that I could not purchase a suitable machine for my needs, I discovered the Asus ROG Windows 7 model, in about 2004. It was able to have a RAM upgrade, which I duly carried out, with 2 × 8GB SO-DIMMs, plus 4GB of SDDR2 video RAM, and 2×500GB WD 7200RPM hard drives. This was beginning to look more like a computer. Over the time I used it, I was able to replace the spinning-rust drives with 500GB Samsung SSDs, and as larger sticks of RAM became available, increased that to the limit as well. I ran that machine, which was Tux-compatible [“Tux” being Chris’s nickname for Linux. – Ed.], throwing away the BSOD [Blue Screen Of Death – that is, Microsoft Windows. – Ed.] and putting one of the earliest versions of Ubuntu with GNOME on it. It was computing heaven: everything just worked, and I dragged that poor beast around the world with me.
While in San Diego, I attended Scripps and lectured on cot death for three months as a guest. Scripps at the time was involved with IBM in developing a line-of-sight optical network, which worked brilliantly on campus. It was confined to a couple of experimental computer labs, but you had to keep your fingers off the mouse or keyboard, or your machine would overload with web pages if browsing. I believe it never made it into the world of computers for ordinary users, as the machines of the day could not keep up.
There was also talk around the labs of so-called quantum computing, which had been talked about since the 1960s on and off, but some developments appeared in 1968.
The whole idea sounds great – if it could be made to work at a practicable user level. But in the back of my mind, I had a suspicion that these ideas would just hinder investment and development of what was now a standard of motherboards and BIOS-based systems. Meanwhile, my Tux machine just did what was asked of it.
Thank you, Ian and Debra Murdock, who developed the Debian version of Tux – on which Ubuntu was based.
I dragged that poor Asus around the Americas, both North and South, refurbishing it as I went. I found Fry's, the major technology shop in San Diego, where I could purchase portable hard drives and so on at a fraction of the cost of elsewhere in the world.
Eventually, I arrived in Canada, where I had a speaking engagement at Calgary University – which also had a strong Tux club – and I spent some time happily looking at a few other distros. Distrowatch had been founded about 2001, which made it easy to keep up with Linux news, new versions of Tux, and what system they were based on. Gentoo seemed to be the distro for those with the knowledge to compile and tweak every little aspect of their software.
Arch attracted me at times. But eventually, I always went back to Ubuntu – until I learned of Ubuntu MATE. The University had a pre-release copy of Ubuntu MATE 14.10, along with a podcast from Alan Pope and Martin Wimpress, and before I could turn around I had it on my Asus. It was simple, everything worked, and it removed the horrors of GNOME 3.
I flew happily back to New Zealand and my little country cottage.
Late in 2015, my wife became very unwell after a shopping trip. Getting in touch with some medical friends, they were concerned she’d had a heart attack. This was near the mark: she had contracted a virus which had destroyed a third of her heart muscle. It took her a few years to die, and a miserable time it was for her and for us both. After the funeral, I had rented out my house and bought a Toyota motorhome, and I began traveling around the country. I ran my Asus through a solar panel hooked up to an inverter, a system which worked well and kept the beast going.
After a couple of years, I decided to have a look around Australia. My grandfather on my father's side was Australian, and had fascinated us with tales of the outback, where he worked as a drover in the 1930s and ’40s.
And so, I moved to Perth, where my brother had been living since the 1950s.
There, I discovered an amazing thing: a configurable laptop based on a Clevo motherboard – and not only that, their factory was just up the road in Fremantle.
Hastily, I logged on to their website, and in a state of disbelief, browsed happily for hours at all the combinations I could put together. These were all variations on a theme by Windows 7, and there was no listing of ACPI records or other BIOS information.
I looked at my battered old faithful, my many-times-rebuilt Asus, and decided the time had come. I started building. Maximum RAM and video RAM, latest nVidia card, two SSDs, their top-of-the-line WiFi and Bluetooth chipsets, sound cards, etc. Then, I got it sent to New Zealand, as I was due back the next day.
That was the first of four Metabox machines I have built, and is still running flawlessly using Ubuntu MATE.
My next Metabox was described as a Windows 10 machine, but I knew that it would run Tux beautifully – and so it did. A few tweaks around the ACPI subsystem and it computed away merrily, with not a BSOD in sight. A friend of mine who had popped in for a visit was so impressed with it that he ordered one too, and that arrived about three months later. After a quick wipe of the hard drive (thank you, GParted!), both these machines are still running happily, with not a cloud on the horizon.
One, I gave to my stepson about three months back, and he has taken it back with him to the Philippines, where he reports it is running fine in the tropical heat.
My new Metabox arrived about six weeks ago, and I decided – just out of curiosity – to leave Windows 11 on it. A most stupid decision, but as my wife was running Windows 11 and had already blown it up once, needing a full reset (which, to my surprise, worked), I proceeded to charge it for the recommended 24 hours, and next day, switched it on. “Hello” it said, in big white letters, and then the nonsense began… a torrent of unwanted software proceeded to fill up one of my 8TB NVMe drives, culminating after many reboots with a Chatbot, an AI “assistant”, and something called “Co-pilot”.
“No!” I cried, “not in a million years!” – and hastily plugging in my Ventoy stick, I rebooted it into Gparted, and partitioned my hard drive for Ubuntu MATE.
So far, the beast seems most appreciative, and it hums along with just a gentle puff of warm air out of the ports. I needed to do a little tweaking, as the latest nVidia cards don’t seem to like Wayland as a graphics server, plus the addition of acpi=off to GRUB, and another flawless computer is on the road.
Now, if only I could persuade Metabox to move to a 128-bit system, and can get delivery of that on the other side of the great divide, my future will be in computer heaven.
Oh, if you’re wondering what happened to the Asus? It is still on the kitchen table in our house in the Philippines, in pieces, where I have no doubt it is waiting for another rebuild!
Chris Thomas
In Requiem
03/05/1942 — 02/10/2024
Some thoughts on Computers
The basic design of computers hasn't changed much since the mechanical one, the Difference Engine, invented by Charles Babbage in 1822 – but not built until 1991. Alan Turing invented computer science, and the ENIAC in 1945 was arguably the first electronic general-purpose digital computer. It filled a room. The Micral N was the world's first “personal computer,” in 1973.
Since then, the basic design has changed little, other than to become smaller, faster, and on occasions, less useful.
The current trend to lighter, smaller gadget-style toys – like cell phones, watches, headsets of various types, and other consumer toys – is an indication that the industry has fallen into the clutches of mainstream profiteering, with very little real innovation now at all.
I was recently looking for a new computer for my wife and headed into one of the main laptop suppliers, only to be met with row upon row of identical machines, at various price points arrived at by that mysterious breed known as "marketers". In fact, the only difference in the plastic on display was how much drive space the engineers had fitted in, and how much RAM they had. Was the case a pretty colour that appealed to the latest 10-year-old girl, or rugged for the he-man who was hoping to make the school whatever team? In other words, rows of blah.
Where was the excitement of the early Radio Shack "do-it-yourself" range: the Sinclair ZX80, the Commodore 8-bits (PET and VIC-20), later followed by the C64? What has happened to all the excitement and innovation? My answer is simple: the great big clobbering machine known as "Big Tech".
Intel released its first 8080 processor in 1972 and later followed up with variations on a theme [PDF], eventually leading to the 80286, the 80386, the 80486 (getting useful), and so on. All of these variations needed an operating system which basically was a variation of MS-DOS, or more flexibly, PC DOS. Games started to appear, and some of them were quite good. But the main driver of the computer was software.
In particular, word-processors and spreadsheets.
At the time, my lost computer soul had found a niche in CP/M, which on looking back was a lovely little operating system – but quietly disappeared into the badlands of marketing.
Lost and lonely I wandered the computerverse until I hooked up with Sanyo – itself now long gone the way of the velociraptor and other lost prehistoric species.
The Sanyo brought build quality, the so-called "lotus card" to make it fully compatible with the IBM PC, and later, an RGB colour monitor and a 10 meg hard drive. The basic model was still two 5¼" floppy drives, which they pushed up to 720kB, and later the 3½" 1.25MB floppy drives. Ahead of its time, it too went the way of the dinosaur.
These led to the Sanyo AT-286, which became a mainstay, along with the Commodore 64. A pharmaceutical company had developed a software system for pharmacies that included stock control, ordering, and sales systems. I vaguely remember that machine and software bundle was about NZ$ 15,000, which was far too rich for most.
Then the computer landscape began to level out, as the component manufacturers began to settle on the IBM PC-AT as a compatible, open-market model of computer that met the Intel and DOS standards. Thus, the gradual slide into 100 versions of mediocrity.
The consumer demand was for bigger and more powerful machines, whereas the industry wanted to make more profits. A conflict to which the basic computer scientists hardly seemed to give a thought.
I was reminded of Carl Jung's dictum: that “greed would destroy the West.”
A thousand firms sprang up, all selling the same little boxes, whilst the marketing voices kept trumpeting the bigger/better/greater theme… and the costs kept coming down, as businesses became able to afford these machines, and head offices began to control their outlying branches through the mighty computer.
I headed overseas, to escape the bedlam, and found a spot in New Guinea – only to be overrun by a mainframe run from Australia, which was going to run my branch – for which I was responsible, but without any control.
Which side of the fence was I going to land on? The question was soon answered by the Tropical Diseases Institute in Darwin, which diagnosed dengue fever… and so I returned to NZ.
For months I battled this recurring malady, until I was strong enough to attend a few hardware and programming courses at the local Polytechnic, eventually setting up my own small computer business, building up 386 machines for resale, followed by 486 and eventually a Texas Instrument laptop agency.
These ran well enough, but had little battery life, and although they were rechargeable, they needed to be charged every two or three hours. At least the WiFi worked pretty consistently, and for the road warrior, gave a point of distinction.
[I think Chris is getting his time periods mixed up here. —Ed.]
Then the famous 686 arrived, and by the use of various technologies, RAM began to climb up to 256MB, and in some machines 512MB.
Was innovation happening? No – just more marketing changes. As in, some machines came bundled with software, printers or other peripherals, such as modems.
As we ended the 20th century, we bought bigger and more powerful machines. The desktop was being chased by the laptop, until I stood at a long row of shiny boxes that were basically all the same, wondering which one my wife would like… knowing that it would have to connect to the so-called "internet", and in doing so, make all sorts of decisions inevitable.
Eventually I chose a smaller Asus, with 16GB of main RAM and an nVidia card, and retreating to my cottage, collapsed in despair. Fifty years of computing and wasted innovation had left her with a black box which, when she opened it, said "HELLO" against a big blue background that promised the world – but only offered more of the same. As in, a constant trickle of hackers, viruses, Trojans and barely anything useful – but now including a new perversion called a chat-bot, or "AI".
I retired to my room in defeat.
We have had incremental developments ever since, up to today's latest chips from Intel and AMD, still based on the 64-bit x86 architecture first introduced around April 2003.
So where is the 128-bit architecture – or the 256 or the 512-bit?
What would happen if we got really innovative? I still remember the line attributed to Bill Gates: "Nobody will ever need more than 640K of RAM." And yet it is common now to buy machines with 8 or 16 or 32GB of RAM, because poor-quality operating systems fill that memory with poorly-written garbage that causes memory leaks, stack-overflow errors and other memory issues.
Then there is Unix – or since the advent of Richard Stallman and Linus Torvalds, GNU/Linux. A solid, basic series of operating systems, by various vendors, that simply do what they are asked.
I wonder where all this could head, if computer manufacturers climbed onboard and developed, for example, a laptop with an HDMI screen, a rugged case with a removable battery, a decent sound system, with a good-quality keyboard, backlit with per-key colour selection. Enough RAM slots to boost the main memory up to say 256GB, and video RAM to 64GB, allowing high speed draws to the screen output.
Throw away the useless touchpads: with the advent of Bluetooth mice, they are no longer needed. Instead, include an 8TB NVMe drive, plus a decent set of controllable fans and heatpipes that actually keep the internal temperatures down, so as not to stress the RAM and processors.
I am sure this could be done, given that some manufacturers, such as Tuxedo, are already showing some innovation in this area.
Will it happen? I doubt it. The clobbering machine will strike again.
Friday September 20th 2024
Windows (2.01) was the 3rd GUI I learned. First was classic MacOS (System 6 and early System 7.0), then Acorn RISC OS on my own home computer, then Windows.
Both MacOS and RISC OS have beautiful, very mouse-centric GUIs where you must use the mouse for most things. Windows was fascinating because it has rich, well-thought-out, rational and consistent keyboard controls, and they work everywhere: in all graphical apps, in the window manager itself, and on the command line. (There's a small illustrative sketch after the list below.)
-- Ctrl + a letter is a discrete action: do this thing now.
-- Alt + a letter opens a menu
-- Shift + movement selects a continuous range: Shift+cursors selects text, or files in a file manager. Shift+mouse selects multiple icons in a block in a file manager.
-- Ctrl + mouse selects discontinuously: pick disconnected icons.
-- These can be combined: shift-select a block, then press ctrl as well to add some discontinuous entries.
-- Ctrl + cursor keys moves a word at a time (discontinuous cursor movement).
-- Shift + Ctrl + cursor keys selects a word at a time.
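To make that concrete, here is a minimal sketch of those conventions in Python with Tkinter. This is my own illustration, not anything shipped with Windows or any desktop mentioned here; the messages are invented for the example, and how reliably Alt-mnemonics work depends on the platform's Tk build.

    import tkinter as tk

    # A rough sketch of the CUA-style conventions described above.
    root = tk.Tk()
    text = tk.Text(root, width=60, height=10)
    text.pack()

    # Alt + a letter opens a menu: underline=0 marks "F" as the mnemonic,
    # so Alt+F opens the File menu (where the platform's Tk supports it).
    menubar = tk.Menu(root)
    filemenu = tk.Menu(menubar, tearoff=0)
    filemenu.add_command(label="Open...", accelerator="Ctrl+O",
                         command=lambda: print("File > Open chosen"))
    menubar.add_cascade(label="File", menu=filemenu, underline=0)
    root.config(menu=menubar)

    # Ctrl + a letter is a discrete action: do this thing now.
    root.bind_all("<Control-o>", lambda e: print("Ctrl+O: open, immediately"))

    # Shift + cursor keys extends a continuous selection, and Ctrl + cursor
    # keys moves a word at a time; the Text widget's default class bindings
    # already implement both, so there is nothing extra to wire up here.

    root.mainloop()

The toolkit doesn't matter; the point is how cleanly the modifiers divide the work: Ctrl acts, Alt navigates menus, Shift extends, and they compose.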
In the mid-'90s Linux made Unix affordable; I got to know it, and I switched to it in the early '00s.
But it lacks that overall cohesive keyboard UI. Some desktops implement most of Windows' keyboard UI (Xfce, LXDE, GNOME 2.x), some invent their own (KDE), many don't have one.
The shell and editors don't have any consistency. Each editor has its own set of keyboard controls, and some environments honour some of them -- but not many because the keyboard controls for an editor make little sense in a window manager. What does "insert mode" mean in a file manager?
They are keyboard-driven windowing environments built by people who live in terminals and only know the extremely limited keyboard controls of the most primitive extant shell environment – one that doesn't honour the GUI keyboard UI because it predates it, and so in which every app invents its own.
Whereas Windows co-evolved with IBM's CUA (Common User Access) and deeply embeds it.
The result is that all the Linux tiling WMs I've tried annoy me, because they don't respect the existing Windows-based keystrokes for manipulating windows. GNOME >=3 mostly doesn't either: keystrokes for menu manipulation make little sense when you've tried to eliminate menus from your UI.
Even the growing-in-trendiness MiracleWM falls into this trap, because the developer doesn't use plain Ubuntu, he uses Kubuntu, and Kubuntu doesn't respect basic Ubuntu keystrokes like Ctrl+Alt+T for a terminal – so neither does MiracleWM.
They are multiple non-overlapping, non-cohesive, non-uniform keyboard UIs, designed by and for people who never knew how to use a keyboard-driven whole-OS UI because they didn't know there was one. So they all built their own, without knowing that there's 30+ years of prior art for this.
All these little half-thought-out attempts to build something that already existed, but whose creators didn't know about it.
To extend the prisoners-escaping-jail theme:
Each one only extends the single prison cell its inmate knew before they got out, where the cell is an app – often a text editor, but sometimes a game.
One environment lets you navigate by only going left or straight. To go right, turn left three times! Simple!
One only lets you navigate in spirals, but you can adjust the size, and toggle clockwise or anticlockwise.
One is like Asteroids: you pivot your cursor and apply thrust.
One uses Doom/Quake-style WASD + mouse, because everyone knows that, right? It's the standard!
One expects you to plug in a joypad controller and use that.
Not any more. In recent years I've tried GNOME, Xfce, MATE, KDE, Cinnamon, and LXQt on Fedora.
They all look different. They may have some wallpaper in common, but that's it. With none of them is there any way you could glance from across a room (meaning, too far away to read any text or see any logos) and go "oh, yeah, that's Fedora."
And on openSUSE, I tried all of them plus LXDE and IceWM. Same thing. Wallpaper at best.
Same on Ubuntu: I regularly try all the main flavours, as I did here, and they all look different. MATE makes an effort, and Unity has some of the wallpapers, but that's about it.
If a vendor or project has one corporate brand and one corporate look, usually, time and money and effort went into it. Into logos, colours, tints, gradients, wallpaper, all that stuff.
It seems to me that the least the maintainers of different desktop flavours or spins could do is adopt the official theme and make their remixes look like they are the same OS from the same vendor.
I like Xfce, but its themes aren't great: many, if not most, make the window borders so thin you can't grab them to resize. Budgie is OK and looks colourful, but Ubuntu Budgie does not look like Ubuntu.
Kubuntu looks like Fedora KDE looks like Debian with KDE looks like anything with KDE, and to my eyes, KDE's themes are horrible, as they have been since KDE 1 -- yes I used 1.0, and liked it -- and only 3rd party distro vendor themes ever made KDE look good.
Only 2 of them ever did, really: Red Hat Linux with Bluecurve, and Corel LinuxOS and its successor Xandros.
Everyone else's KDE skins are horrible. All of them. It's one reason I can't use KDE now. It almost hurts my eyes. (Same goes for TDE BTW.) It is nasty.
Branding matters. Distros all ignore it now. They shouldn't.
And someone somewhere should bring back Bluecurve, or failing that, port GNOME's Adwaita to all the other desktops. I can't stand GNOME, but its theming and appearance are the best of any Western distro. (Some of the Chinese ones, like Deepin and Kylin, are beautiful, but everyone's afraid they're full of spyware for the Chinese Communist Party... and they might be right.)
It looks kind of fun, but once again, it does make me wonder why it’s so constrained. Extremely low-res graphics, for instance. TBH I would have sneered at this for being low-end when I was about 13 years old. (Shortly before I got my first computer, a 48K ZX Spectrum.)
Why isn’t anyone trying to make an easy home-build high-end eight-bit? Something that really pushes the envelope right out there – the sort of dream machine I wanted by about the middle of the 1980s.
In 1987 I owned an Amstrad PCW9512. Later, in 1989, I bought an MGT SAM Coupé.
Both had graphics easily outdone by the MSX 2 and later Z80 machines, but those had a dedicated GPU. That might be a reach for a hobbyist machine, but given the limits of a 64 kB memory map, maybe a worthwhile one.
Another aspirational machine was the BBC Micro: an expandable, modular OS called MOS; an excellent BASIC, BBC BASIC, with structured flow and named procedures with local variables, enabling recursive programming, plus inline assembly language, so if you graduated to machine code you could just enter and edit it in the BASIC line editor. (Which was weird, but powerful – for instance, 2 independent cursors, one source and one destination, eliminating the whole "clipboard" concept.) Resolution-independent graphics, and graphics modes that cheerfully used most of the RAM, leaving their exploitation as an exercise for the developer. Which they rose to magnificently.
The BBC Micro supported dual processors over the Tube interface, so one 6502 could run the OS, the DOS, and the framebuffer, using most of its 64 kB, while Hi-BASIC ran on the second 6502 (or Z80!) processor and therefore had most of that 64 kB to itself.
In a 21st century 8-bit, I want something that comfortably exceeds a 1980s 8-bit, let alone a 1990s 8-bit.
(And yes, there were new 8-bit machines in the 1990s, such as the Amstrad CPC Plus range, or MSX Turbo R.)
So my wish list would include…