
[Repurposed from Stack Exchange, here]
The premise in the question is incorrect. There were such chips. The question also fails to allow for the way that the silicon-chip industry developed.
Moore's Law basically said that every 18 months, it was possible to build chips with twice as many transistors for the same amount of money.
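As a rough, purely illustrative sketch of what that doubling rate means in practice (the 1975 starting figure of about 3,500 transistors is roughly the 6502's count; the later numbers are indicative, not actual budgets):

```python
# Back-of-the-envelope sketch of Moore's Law as stated above:
# the transistor budget doubles every 18 months for the same money.
def transistor_budget(year, base_year=1975, base_count=3_500):
    doublings = (year - base_year) / 1.5   # one doubling per 18 months
    return base_count * 2 ** doublings

for year in (1975, 1979, 1984, 1990, 1994):
    print(year, f"~{transistor_budget(year):,.0f} transistors for the same money")
```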
The 6502 (1975) is a mid-1970s design. In the '70s it cost a lot to use even thousands of transistors; the 6502 succeeded partly because it was very small and simple and didn't use many, compared to more complex rivals such as the Z80 and 6809.
The 68000 (1979) was also from the same decade. It became affordable in the early 1980s (e.g. Apple Lisa) and slightly more so by 1984 (Apple Macintosh). However, note that Motorola also offered a version with an 8-bit external bus, the 68008, as used in the Sinclair QL. This reduced performance, but it was worth it for cheaper machines because it was so expensive to have a 16-bit chipset and 16-bit memory.
Note that just 4 years separates the 6502 and 68000. That's how much progress was being made then.
The 65C816 was a (partially) 16-bit successor to the 6502. Note that WDC also designed a 32-bit successor, the 65C832. Here is a datasheet: https://downloads.reactivemicro.com/Electronics/CPU/WDC%2065C832%20Datasheet.pdf
However, this was never produced. As a 16-bit extension to an 8-bit design, the 65C816 was compromised and slower than pure 16-bit designs. A 32-bit design would have been even more compromised.

Note, this is also why Acorn succeeded with the ARM processor: its clean 32-bit-only design was more efficient than Motorola's combination 16/32-bit design, which was partly inspired by the DEC PDP-11 minicomputer. Acorn evaluated the 68000, 65C816 (which it used in the rare Acorn Communicator), NatSemi 32016, Intel 80186 and other chips and found them wanting. Part of the brilliance of the Acorn design was that it effectively used slow DRAM and did not need elaborate caching or expensive high-speed RAM, resulting in affordable home computers that were nearly 10x faster than rival 68000 machines.
The 68000 was 16-bit externally but 32-bit internally: that is why the Atari machine that used it was called the ST, short for "sixteen/thirty-two".
The first fully-32 bit 680x0 chip was the 68020 (1984). It was faster but did not offer a lot of new capabilities, and its successor the 68030 was more successful, partly because it integrated a memory management unit. Compare with the Intel 80386DX (1985), which did much the same: 32-bit bus, integral MMU.
The 80386DX struggled in the market because of the expense of making 32-bit motherboards with 32-bit wide RAM, so was succeeded by the 80386SX (1988), the same 32-bit core but with a half-width (16-bit) external bus. This is the same design principle as the 68008.
Motorola's equivalent was the fairly rare 68EC020.
The reason was that around the end of the 1980s, when these devices came out, 16MB of memory was a huge amount and very expensive. There was no need for mass-market chips to address 4GB of RAM — that would have cost hundreds of thousands of £/$ at the time. Their 32-bit cores were for performance, not capacity.
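To make the arithmetic concrete, here is a quick, purely illustrative sketch of how much memory an N-bit address can reach, alongside chips of the era:

```python
# Memory reachable with an N-bit address, in MB (1 MB = 2**20 bytes).
def addressable_mb(bits):
    return 2 ** bits / 2 ** 20

for bits, era_example in (
    (20, "8088 / original IBM PC: 1 MB"),
    (24, "68000, 80286: 16 MB"),
    (32, "68020, 80386DX: 4 GB"),
):
    print(f"{bits}-bit addressing -> {addressable_mb(bits):,.0f} MB ({era_example})")
```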
The 68030 was followed by the 68040 (1990), just as the 80386 was followed by the 80486 (1989). Both also integrated floating-point coprocessors into the main CPU die. The progress of Moore's Law had now made this affordable.
The line ended with the 68060 (1994), still 32-bit. Like Intel's 80586 family, branded "Pentium" because numbers couldn't be trademarked, it had Level 1 cache on the CPU die.
The reason was that, at this time, fabricating large chips with millions of transistors was still expensive, and these chips could still address more RAM than was remotely affordable to fit into a personal computer.
So the priority at the time was to find ways to spend a limited transistor budget on making faster chips: 8-bit → 16-bit → 32-bit → integrate MMU → integrate FPU → integrate L1 cache.
This line of development somewhat ran out of steam by the mid-1990s. This is why there was no successor to the 68060.
Most of the industry switched to the path Acorn had started a decade earlier: dispensing with backwards compatibility with now-compromised 1970s designs and starting afresh with a stripped-down, simpler, reduced design — Reduced Instruction Set Computing (RISC).
ARM chips supported several OSes: RISC OS, Unix, Psion EPOC (later renamed Symbian), Apple NewtonOS, etc. Motorola's supported more: LisaOS, classic MacOS, Xenix, ST TOS, AmigaDOS, multiple Unixes, etc.
No single one was dominant.
Intel was constrained by the success of Microsoft's MS-DOS/Windows family, which sold far more than all the other x86 OSes put together. So backwards-compatibility was more important for Intel than for Acorn or Motorola.
Intel had tried several other CPU architectures: iAPX-432, i860, i960 and later Itanium. All failed in the general-purpose market.
Thus, Intel was forced to find a way to make x86 quicker. It did this by breaking down x86 instructions into RISC-like "micro-operations", re-sequencing them for faster execution, running them on a RISC-like core, and then reassembling the results into x86 afterwards. This appeared first in the Pentium Pro, which only did this efficiently for x86-32 instructions, at a time when many people were still running Windows 95/98, an OS composed of a lot of x86-16 code and which ran a lot of x86-16 apps.
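To make the idea concrete, here is a toy sketch in Python. It is not Intel's actual microarchitecture: the instruction names (ADDM, MOV) and the micro-op format are invented for illustration, and real CPUs also reorder the micro-ops in flight, which this sketch skips.

```python
# Toy illustration: decode one complex "CISC-style" instruction into
# simpler RISC-like micro-ops, then run those on a minimal core.
# (Real CPUs also re-sequence micro-ops for out-of-order execution.)

def decode(instruction):
    """Split a toy CISC instruction into a list of micro-ops."""
    op, *args = instruction.split()
    if op == "ADDM":                 # ADDM reg, addr: add memory to register
        reg, addr = args
        return [("LOAD", "tmp", addr),        # load the memory operand
                ("ADD", reg, reg, "tmp")]     # then do a pure register add
    if op == "MOV":                  # simple instructions become one micro-op
        dst, src = args
        return [("MOVE", dst, src)]
    raise ValueError(f"unknown instruction: {instruction}")

def execute(program, registers, memory):
    """Run the decoded micro-ops on a minimal 'RISC-like' core."""
    for instruction in program:
        for uop in decode(instruction):
            if uop[0] == "LOAD":
                _, dst, addr = uop
                registers[dst] = memory[addr]
            elif uop[0] == "ADD":
                _, dst, a, b = uop
                registers[dst] = registers[a] + registers[b]
            elif uop[0] == "MOVE":
                _, dst, src = uop
                registers[dst] = registers[src]

regs = {"AX": 5, "tmp": 0}
mem = {"0x10": 7}
execute(["ADDM AX 0x10"], regs, mem)
print(regs["AX"])   # 12: one CISC instruction became two micro-ops
```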
Then came the Pentium II, an improved Pentium Pro with onboard L1 (and soon after L2) cache and improved x86-16 optimisation, arriving as the PC market began moving to fully x86-32 versions of Windows (Windows 2000 and then XP).
In other words, even by the turn of the century, the software was still moving to 32-bit and the limits of 32-bit operation (chiefly, 4GB RAM) were still largely theoretical. So, the effort went into making faster chips with the existing transistor budget.
Only by the middle of the first decade of the 21st century did 4GB become a bottleneck, leading to the conditions for AMD to create a 64-bit extension to x86.
The reasons that 64-bit happened did not apply in the 1990s.
From the 1970s to about 2005, 32 bits were more than enough, and CPU makers worked on spending the transistor budgets on integrating more go-faster parts into CPUs. Eventually, this strategy ran out, when CPUs included the integer core, a floating-point core, a memory management unit, a tiny amount of L1 cache and a larger amount of slower L2 cache.
Then, there was only one way to go: integrate a second CPU onto the chip, first as two separate CPU dies in one package, then as a single die with two cores. Luckily, by this time, NT had replaced Win9x, and NT and Unix could both support symmetric multiprocessing.
So, dual-core chips, then quadruple-core chips. After that, a single user on a desktop or laptop gets little more benefit. There are many CPUs with more cores but they are almost exclusively used in servers.
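A minimal modern-day sketch of why that OS support mattered: the same batch of work run serially on one core, then handed to a pool of worker processes that the OS can schedule across cores (busy_work is arbitrary filler, purely for illustration):

```python
# Same workload, run serially and then spread across cores by the OS.
import time
from multiprocessing import Pool

def busy_work(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 8

    start = time.perf_counter()
    serial = [busy_work(n) for n in jobs]     # one core does everything
    print(f"serial:  {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool(4) as pool:                     # four workers, scheduled by the OS
        parallel = pool.map(busy_work, jobs)
    print(f"4 cores: {time.perf_counter() - start:.2f}s")

    assert serial == parallel                 # same results, less wall-clock time
```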
Secondly, the CPU industry was now reaching limits of how fast silicon chips can run, and how much heat they emit when doing so. The megahertz race ended.
So the emphasis shifted to two new concerns, as the limiting factors became:

  • the amount of system memory

  • the amount of cooling they required

  • the amount of electricity they used to operate

These last two things are two sides of the same coin, which is why I said two not three.
Koomey's Law has replaced Moore's Law.

A friend of mine who is a Commodore enthusiast commented that if the company had handled it better, the Amiga would have killed the Apple Mac off.

But I wonder. I mean, the $10K Lisa ('83) and the $2.5K Mac ('84) may only have been a year or two before the $1.3K Amiga 1000 ('85), but in those years, chip prices were plummeting -- maybe rapidly enough to account for the discrepancy.

The 256kB Amiga 1000 was half the price of the original 128kB Mac a year earlier.

Could Tramiel's Commodore have sold Macs at a profit for much less? I'm not sure. Later, yes, but then, Mac prices fell, and anyway, Apple has long been a premium-products-only sort of company. But the R&D process behind the Lisa & the Mac was long, complex & expensive. (Yes, true, it was behind the Amiga chipset, too, but less so on the OS -- the original CAOS got axed, remember. The TRIPOS thing was a last-minute stand-in, as was Arthur/RISC OS on the Acorn Archimedes.)

The existence of the Amiga also pushed development of the Mac II, the first colour model. (Although I think it probably more directly prompted the Apple ][GS.)

It's much easier to copy something that someone else has already done. Without the precedent of the Lisa, the Mac would have been a much more limited 8-bit machine with a 6809. Without the precedent of the Mac, the Amiga would have been a games console.


I think the contrast between the Atari ST and the Sinclair QL, in terms of business decisions, product focus and so on, is more instructive.
The QL could have been one of the important 2nd-generation home computers. It was launched a couple of weeks before the Mac.
But Sinclair went too far with its hallmark cost-cutting on the project, and the launch date was too ambitious. The result was a 16-bit machine that was barely more capable than an 8-bit one from the previous generation. Most of the later 8-bit machines had better graphics and sound; some (Memotech, Elan Enterprise) as much RAM, and some (e.g. the SAM Coupé) also supported built-in mass storage.
But Sinclair's OS, QDOS, was impressive. An excellent BASIC, front and centre like on an 8-bit machine, but also full multitasking, and modular enough to readily handle new peripherals -- though no GUI by default.
The Mac, similarly RAM-deprived and with even poorer graphics, blew it away. Also, with the Lisa and the Mac, Apple had spotted that the future lay in GUIs, which Sinclair had missed -- the QL didn't get its "pointer environment" until later, and when it did, it was primitive-looking. Even the modern version still looks primitive.



Atari, entering the game a year or so later, had a much better idea where to spend the money. The ST was an excellent demonstration of cost-cutting. Unlike the bespoke custom chipsets of the Mac and the Amiga, or Sinclair's manic focus on cheapness, Atari took off-the-shelf hardware and off-the-shelf software and assembled something that was good enough. A decent GUI, an OS that worked well in 512kB, graphics and sound that were good enough. Marginally faster CPU than an Amiga, and a floppy format interchangeable with PCs.
Yes, the Amiga was a better machine in almost every way, but the ST was good enough, and at first, significantly cheaper. Commodore had to cost-trim the Amiga to match, and the first result, the Amiga 500, was a good games machine but too compromised for much else.

The QL was built down to a price, and suffered for it. Later replacement motherboards and third-party clones such as the Thor fixed much of this, but it was no match for the GUI-based machines.

The Mac was in some ways a sort of cut-down Lisa, trying to get that ten-thousand-dollar machine down to a more affordable quarter of the price. Sadly, this meant losing the hard disk and the innovative multitasking OS, which were added back later in compromised form -- the latter cursed the classic MacOS until it was replaced with Mac OS X at the turn of the century.

The Amiga was a no-compromise games machine, later cleverly shoehorned into the role of a very capable multimedia GUI computer.

The ST was also built down to a price, but learned from the lessons of the Mac. Its spec wasn't as good as the Amiga's, its OS wasn't as elegant as the Mac's, but it was good enough.

The result was that games developers aimed at both, limiting the quality of Amiga games to the capabilities of the ST. The Amiga wasn't differentiated enough -- yes, Commodore did high-end three-box versions, but the basic machines remained too low-spec. The third-generation Amiga 1200 had a faster 68020 chip, which the OS didn't really utilise, and provision for a built-in hard disk, which remained an optional extra. AmigaOS was a pain to use with only floppies, like the Mac -- whereas the ST's ROM-based OS was fairly usable with a single drive. A dual-floppy-drive Amiga was the minimum usable spec, really, and it benefited hugely from a hard disk -- but Commodore didn't fit one.

The ST killed the Amiga, in effect. Because the ST provided an experience that was nearly as good in the important, visible ways, Commodore had to price-cut the Amiga to keep it competitive, hobbling the lower-end models. And as games were written to be portable between them both without too much work, they mostly didn't exploit the Amiga's superior abilities.

Acorn went its own way with the Archimedes -- it shared almost no apps or games with the mainstream machines, and while its OS is still around, it hasn't kept up with the times and is mainly a curiosity. Acorn kept its machines a bit higher-end, having affordable three-box models with hard disks right from the start, and focused on the educational niche where it was strong.

But Acorn's decision to go its own way was entirely vindicated -- its ARM chip is now the world's best-selling CPU. Both Microsoft and Apple OSes run on ARMs now. In a way, it won.

The poor Sinclair QL, of course, failed in the market and Amstrad killed it off when it was still young. But even so, it inspired a whole line of successors -- the CST Thor, the ICL One-Per-Desk (AKA Merlin Tonto, AKA Telecom Australia ComputerPhone), the Qubbesoft Aurora replacement main board and later the Q40 and Q60 QL-compatible PC-style motherboards. It had the first ever multitasking OS for a home computer, QDOS, which evolved into SMSQ/e and moved over to the ST platform instead. It's now open source, too.

And Linus Torvalds owned a QL, giving him a taste for multitasking so that he wrote his own multitasking OS when he got a PC. That, of course, was Linux.

The Amiga OS is still limping along, now running on a CPU line -- PowerPC -- that is also all but dead. The open-source version, AROS, is working on an ARM port, which might make it slightly more relevant, but it's hard to see a future or purpose for the two PowerPC versions, MorphOS and AmigaOS 4.

The ST OS also evolved, into a rich multitasking app environment for PCs and Macs (MagiC) and into a rich multitasking FOSS version, AFROS, running on a PC emulator, Aranym. A great and very clever little project, but one which went nowhere, as did PC GEM, sadly.

All of these clever OSes -- AROS, AFROS, QDOS AKA SMSQ/E -- went FOSS too late and are forgotten. Me, I'd love Raspberry Pi versions of any and all of them to play with!

In its final death throes, a flailing Atari even embraced the Transputer. The Atari ABAQ could run Perihelion's Helios, another interesting long-dead OS. Acorn's machines ran one of the most amazing OSes I've ever seen, TAOS, which nearly became the next-generation Amiga OS. That could have shaken up the industry -- it was truly radical.

And in a funny little side-note, the next next-gen Amiga OS after TAOS was to be QNX. It didn't happen, but QNX added a GUI and rich multimedia support to its embedded microkernel OS for the deal. That OS is now what powers my Blackberry Passport smartphone. Blackberry 10 is now all but dead -- Blackberry has conceded the inevitable and gone Android -- but BB10 is a beautiful piece of work, way better than its rivals.

But all the successful machines that sold well? The ST and Amiga lines are effectively dead. The Motorola 68K processor line they used is all but dead, too. So is its successor, PowerPC.

So it's the two niche machines that left the real legacy. In a way, Sinclair Research did have the right idea after all -- but prematurely. It thought that the justification for 16-bit home/business computers was multitasking. In the end, it was, but only in the later 32-bit era: the defining characteristic of the 16-bit era was bringing the GUI to the masses. True robust multitasking for all followed later. Sinclair picked the wrong feature to emphasise -- even though the QL post-dated the Apple Lisa, so the writing was there on the wall for all to see.

But in the end, the QL inspired Linux and the Archimedes gave us the ARM chip, the most successful RISC chip ever and the one that could still conceivably drive the last great CISC architecture, x86, into extinction.

Funny how things turn out.
