In my head, though it's hard to formulate and certainly not that linear, I feel there are more categories, and there are certainly machines that span boundaries… and possibly a class of machines that fits before the mainframes as described? (For example, unless I'm mis-remembering my computing history, the 'batch schedulers' of early computers were the operators who ran them?)
Delineating 'micros' as a single category is problematic — as some have said already, especially with workstations spanning the era from late minis to higher-powered desktops. Some late minis made use of better silicon integration, using bit-slice chips in their implementation, as did early workstations like the Alto and PERQ. Some mini CPUs found their way into silicon, needing only one or a few chips to implement — the PDP-11, for example, was implemented in silicon as the LSI-11.
As things moved into the desktop age, 'micros' started to require additional chips for what we might now consider basic CPU functions — memory controllers, MMUs, FPUs. How much, during the PC era, were desktops and servers built around true single-chip CPUs in the same way as 8-bit home computers, given that they can't run without a specifically designed chipset?
And then how much does a modern Desktop or Server fall into the category of 'mainframe' above? There are so many processors in a modern PC — not just the overt ones like CPUs and GPUs, but the microcontrollers embedded here, there and everywhere.
And then we start to think about cloud computing's regression to the mainframe/computer-bureau architecture, where the super-computer on our desks merely provides an old mainframe-style terminal to applications running on a big, central, mainframe-style remote computing service.
So personally, I see a computer lineage and categorisation chart at best like a diagram of a complicated railway and at worst like a bowl of spaghetti 😊