The future comes at you fast
Nov. 6th, 2018 01:38 pm
I run Linux, and I work for a Linux vendor. I'm typing on Linux right now.
But I still don't particularly like Linux. I like OS X a bit better, but I miss aspects of classic MacOS. I am playing with Haiku with some interest -- it's getting there. And I'm fiddling with Oberon, mainly for research.
But all this stuff is very mature now, meaning hidebound and inflexible and showing its age badly. Thus all the patching over the cracks with VMs and containers and automated deployment tools and devops and all that.
Basically I think we're just about due for a huge shift in end-user-facing computing. It's time for a generation shift. I've worked through 1 big one of these, and maybe 2 or 3 small ones.
I was actively involved, as a working professional, in the shift from text-based computing (mostly DOS, a little SCO and Concurrent CP/M & Concurrent DOS stuff, and bits and bobs of other older stuff, mostly terminal-based) to GUI-based computing.
The hardware and OS makers survived. Few app vendors did. It basically killed most of the big DOS app vendors: WordStar, Lotus, WordPerfect, Ashton-Tate.
This is prehistory to younger pros these days: it was in the 1990s, while they were children.
Then there were smaller shifts:
[1] from predominantly stand-alone machines to LANs
[2] from proprietary LANs to TCP/IP
(and the switch from DOS-plus-Windows to Win9x and NT here and there)
[3] connecting those LANs to the Internet
(and the partly-16-bit to purely-32-bit switch to a world of mostly NT servers talking to NT workstations.)
Then once we were all on NT, there were two relatively tiny ones:
* multi-processor workstations becoming the default (plus GPUs as standard, leading to compositing desktops everywhere)
* the (remarkably seamless) transition from 32-bit to 64-bit.
But the big one was DOS to Windows (and a few Macs). It's a shame most people have forgotten about it. It was a huge, difficult, painful, gradual, and very expensive shift.
And at the time, most people in the business pooh-poohed GUIs.
For a full decade, there were multiple families of successful, capable, GUI-based computers, with whole ecosystems of apps and 3rd party vendors -- not just the Mac, but the Amiga, and the Atari ST, and the Acorn ARM machines, and at the high end graphical UNIX workstations...
And the DOS world studiously ignored all of it. They were toys. GUIs were toys for children, except very expensive ones for graphic designers, in which case they were too niche to matter. Macs weren't real computers; they were for the rich or the simple-minded.
[Insert discussion about intersectionality and toxic masculinity here.]
Windows 3 was all very well but it was still DOS underneath and ran DOS apps and it was just a pretty face. The more serious you were, the more DOS apps you ran. Accountants didn't use Windows. Well maybe except Excel, but that didn't count. (Har har.)
It was still a sign of Manliness to be able to drive a CLI.
(It still is in the FOSS world, where people take great pride in their skills with horrible 1970s text editors.)
There was real scorn for so-called toy interfaces for toy computers, when people have work to do, etc.
Then, once people showed that you could actually do real work on these alleged toys, the objection switched to inefficiency: it was a waste of computer power, driving all those pixels. Then, when computer power became plentiful enough for that not to be a problem, it was a waste of money equipping everyone with big screens and graphical displays and lots of RAM.
This kind of crap held back real progress for about a decade, I reckon. I mean, MS was singularly slow off the mark too, partly due to wasting (in hindsight) a lot of time and effort on OS/2. Even then, OS/2 1.0 shipped without a GUI at all, in late 1987, because the GUI wasn't finished yet; Presentation Manager only arrived with version 1.1 a year later. (One could argue this showed commendable dedication to getting the underpinnings right first. Possibly. If the underpinnings had been right, which is debatable.)
DR GEM showed that mid-1980s PCs were perfectly able to drive a useful, productive GUI. The early Amstrad 8086 machines shipped with GEM, along with a GEM programming language, a (very basic) GEM word processor, a GEM paint program, etc. It even ran usefully on machines without a hard disk!
Windows 3.0 (1990) was all right. Good enough for some use. Benefited a lot from a 286 and at least 1MB of RAM, though.
Windows 3.1 (1992) was useful. Really wanted a 386 and at least 2MB, ideally 4MB.
NT 3.1 and WfWg were both 1993. WfWg was useful but already looking old-fashioned, whereas NT wanted a £5K+ PC to work well.
It was 1995 before a version came along that ran on an ordinary computer and gave unambiguous, demonstrable benefits to basically all users. That's what OS/2 should have been, on high-end 286s, a decade earlier.
Then, a full decade after the Amiga and the ST, we got Win95, and suddenly everyone wanted one.
Few lessons were learned from this shift.
We haven't had such a big generation shift in 25 years, which means that now there are lots of middle-aged pros who don't really remember the last one. They've never worked through one.
Now we're facing another big shift, and again, although the signs are here, nobody takes them seriously. The writing is on the wall as it was in the late 1980s and early 1990s, and nobody is seeing it.
To spell it out:
* Keyboardless computers are huge. Smartphones, tablets, tills and other touchscreen devices.
* Most of them are all-solid-state: just RAM and flash.
* The first non-volatile RAM is on sale now. I just wrote a whole new manual chapter on it, for a boring enterprise OS. It's becoming mainstream.
* It's ~10× cheaper than DRAM and ~10× faster than Flash. This is the early, v1.0 kit, note.
Soon we will have the first new generation of computers since the 8-bit microcomputer revolution of the late 1970s. All-nonvolatile-RAM machines. They will not have the distinction between RAM and disk storage that all computers since about the 1950s have had. This is a bigger shift than minicomputers were, than the micro was.
It will, of course, be perfectly possible to adapt disk-based OSes to these machines, partitioning the storage into a fake-disk bit and a live-RAM bit. But it will be inefficient and pointless to shuffle all this data around like that -- and yet the RAM/disk split is an assumption so completely implicit in every OS in the world today (except one¹) that it is all but insurmountable.
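To make that halfway house concrete, here's a minimal C sketch -- the file name and the remarks about DAX and MAP_SYNC are illustrative assumptions, not anything from the posts above. A small struct is mapped straight into the process's address space and updated in place, so it survives across runs with no explicit load/save step. On a disk-based OS the mapping is backed by an ordinary file and the msync() is where the shuffling cost lives; on a machine whose storage is NVRAM exposed directly to the CPU, the same loads and stores land on the persistent medium itself.

```c
/* Minimal sketch: persistent state accessed as ordinary memory.
 * On a conventional OS this maps a file through the page cache; on
 * Linux with NVRAM exposed via a DAX filesystem, an equivalent mmap()
 * gives direct load/store access to the non-volatile memory itself.
 * Plain POSIX calls only; error handling kept brief.
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

struct state {
    unsigned magic;         /* marks an initialised region               */
    unsigned long runs;     /* survives across runs without a "save"     */
    char note[64];
};

#define STATE_MAGIC 0x51a7eu

int main(void)
{
    /* Illustrative path; on real pmem it might sit on a DAX mount.      */
    int fd = open("state.bin", O_RDWR | O_CREAT, 0644);
    if (fd < 0 || ftruncate(fd, sizeof(struct state)) < 0) {
        perror("open/ftruncate");
        return 1;
    }

    struct state *st = mmap(NULL, sizeof *st, PROT_READ | PROT_WRITE,
                            MAP_SHARED, fd, 0);
    if (st == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    if (st->magic != STATE_MAGIC) {           /* first ever run */
        st->magic = STATE_MAGIC;
        st->runs = 0;
        strcpy(st->note, "hello, persistent world");
    }
    st->runs++;                               /* ordinary in-memory update */
    printf("%s: run number %lu\n", st->note, st->runs);

    /* On a disk-backed file this flush is the shuffling cost; on true
     * NVRAM (e.g. a MAP_SYNC mapping on DAX) durability needs only CPU
     * cache flushes, and the disk/RAM distinction fades away.           */
    msync(st, sizeof *st, MS_SYNC);
    munmap(st, sizeof *st);
    close(fd);
    return 0;
}
```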
Try to imagine Unix without a filesystem. It doesn't really work. Take away the notion of a "file" and basically all current OSes sort of fall apart.
But there is a certain mindset, one I encounter very often, that finds the concept very hard even to imagine, and is extremely hostile to it.
Which is exactly the sort of thing I saw in the era of the transition from DOS (and text-only Unix) to ubiquitous GUIs.
Both Unix and Windows are cultures now.
One could argue that devotees of any OS or platform have always formed cultures, sure -- but in the 20th century, most platforms came and went relatively quickly. A decade and they had been invented, thrived, flowered -- there was an explosion of apps, peripherals, wide support -- and then they faltered and were gone.
This meant that most enthusiasts of any particular make or series of computer had exposure to quite a few others, too. And TBH while every 1980s computer had strengths and virtues -- OK, almost every -- they all had weaknesses and rivals which were stronger in those particular areas.
Now, much less so.
Now there are only 2 platforms -- Windows or Unix -- and they both mainly run on x86 with a toehold on ARM. They're mature enough that the CPU doesn't make a ton of difference.
There are lots of flavours of Unix, and some are very different to others. However a lot of the old-time 20th-century Linux enthusiasts I know, or know of, have switched to Mac OS X now, basically for an easier life. The rivalries are much smaller-scale: free vs commercial, BSD vs Linux, distro rivalries, desktop rivalries, and of course the eternal editor wars.
Step back far enough and the 2 are very clearly siblings with very similar conceptual models.
Low-level code is in C; stuff layered on top is in slightly higher-level languages (both compiled and interpreted, generally imperative, usually object-oriented). Performance-critical stuff is compiled to static, CPU-specific binaries and libraries, stored as files in a hierarchical filesystem, along with config info kept in text files, in some sort of system-wide database, or both. There's a rigid distinction between "software" and "data", but both are kept in files, which may be visible to the user or hidden -- a purely cosmetic difference. Users switch between different "applications" to accomplish defined tasks; data interchange between these is limited, often difficult, usually involves complex translations, and as a result is often one-way, because round-tripping means data loss.
There are of course dozens of dead OSes which conformed to exactly this model, from all the beloved 1980s home computers (ST, Amiga, Mac, Acorns, QL) to different vendors' Unix boxes, etc.
But the point is that now, this 2-family model has been totally pervasive for about 30 years. Anyone younger than 25 has basically never seen anything else.
Compare these systems to some older ones, and the differences are startling. Compare with Classic MacOS, for something close to home. Not a single config file anywhere, no shell, no CLI, no scripting, nothing like that at all.
Compare with a Xerox Smalltalk box, or a Lisp machine, or Tao Intent, or Vitanuova Inferno, or colorForth, and you will see multiple radically different approaches to OS design... but all of them are basically forgotten now.
I think we lost something really important in the last 25y, and rediscovering it is going to be very painful.
I very much hope that bold, innovative new OSes come along that exploit this new design fully and richly.
But it's almost inconceivable to users of current tools, partly because the filesystem concept is so very powerful: it's almost unimaginable that you would want to throw it away.
Which is very close to what DOS power users said in 1990, and I think is just as valid. (FTAOD, that means it's not valid at all.)
It's an interesting time.
P.S. At least for now, I think the current model suits servers very well, and just as GUI ideas matured in their own market sector of weird 680x0 computers, unconnected from the x86 PC market until it caught up a decade later, I think the server side of things will trundle on for another decade-plus without this stuff having any impact.
I also suspect that as these new machines rise to dominance, as with previous shifts, pretty much no existing vendors, of hardware or OSes or apps, will survive the change. Entirely new companies will come along and get huge.
¹ IBM i. That is, OS/400. I doubt it will suddenly become very relevant.
[Adapted from some list posts, in lieu of real content.]