Mainframes: the first style of computer. Primarily designed as batch-oriented systems, meaning they are not directly interactive: jobs are queued up, run without interaction, the results stored, and then the next job is processed. Later models added interactivity as a secondary feature, usually -- like most mainframe I/O -- handled by intelligent peripherals which are in effect networked to the main processors. So the terminal, on its own, shows a form and handles all user input as the user completes the form, without communicating with the host at all. Then, when the user signals that the form is complete, the entire contents, maybe many pages, are sent as a single message to the host.

Separate processors in storage, in terminals, in networking controllers, in printers, in everything. Typically the machine cannot drive any input or output directly (mouse movements, keystrokes, anything): peripherals do that, collect and encode the results, and send them over a network. So, as someone else commented, a mainframe isn't really a computer; it's a whole cluster of closely-coupled computers, many with dedicated functionality, quite possibly all implemented with different architectures, instruction sets, programming languages, bit widths -- everything.
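To make that block-mode style of interaction concrete, here is a toy sketch in Python -- all the names are invented, and it has nothing to do with any real terminal protocol -- of a terminal that buffers an entire form locally and only sends one message to the host when the user submits it:

    # Toy sketch of block-mode terminal I/O, as on classic mainframes.
    # Purely illustrative; all names invented, no real protocol implied.

    def block_mode_terminal(fields):
        """The 'smart' terminal: collects every field locally; no host traffic yet."""
        form = {}
        for name in fields:
            form[name] = input(f"{name}: ")   # handled entirely by the terminal
        return form                           # nothing has been sent to the host so far

    def send_to_host(form):
        """A single message to the host, sent only when the user submits the form."""
        message = "&".join(f"{k}={v}" for k, v in form.items())
        print(f"[host receives one message] {message}")

    if __name__ == "__main__":
        completed = block_mode_terminal(["account", "amount", "reference"])
        send_to_host(completed)   # the host sees the whole form at once

The point is simply where the work happens: the host is only involved once per form, not once per keystroke.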

Here's an article I wrote about a decade back on how IBM added the new facility of "time sharing" -- i.e. multiple users on terminals, all interacting with the host at the same time -- by developing the first hypervisor and running 1 OS per user in VMs, because the IBM OSes of the time simply could not handle concepts like "terminal sessions".

Minicomputer: the 2nd main style of computer. Smaller and cheaper, so affordable for a company department to own, while mainframes were and are mostly leased to whole corporations. Typically 1 CPU in early ones, implemented as a whole card cage full of boards: maybe a few boards with registers, one with an adder, etc. The "CPU" is a cabinet with thousands of chips in it.

Hallmarks: large size; inherently multitasking, so that multiple users can share the machine, accessing it via dumb terminals on serial lines. The user presses a key, the keystroke is sent over the wire, and the host echoes it back for display. Little to no networking. One processor, maybe several disk or tape drives, also dumb and controlled by the CPU. No display on the host. No keyboard or other user input on the host. All interaction is via terminals. But because they were multiuser, even the early primitive ones had fairly smart OSes which could handle multiple user accounts and so on.
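By contrast with the mainframe's block mode, here is a toy sketch of that character-at-a-time interaction -- again in Python, with invented names and no real serial handling -- where every keystroke makes a round trip over the wire and the host, not the terminal, decides what gets echoed:

    # Toy sketch of character-at-a-time terminal I/O, as on a mini with dumb terminals.
    # Purely illustrative; invented names, no real serial line.

    def host_echo(ch):
        """The host receives one keystroke and returns what the screen should show."""
        return ch.upper()          # pretend the host upcases everything it echoes

    def dumb_terminal(keystrokes):
        """One round trip to the host per keystroke; the terminal decides nothing itself."""
        screen = ""
        for ch in keystrokes:
            screen += host_echo(ch)
        return screen

    if __name__ == "__main__":
        print(dumb_terminal("ls -l"))   # every single character made a trip to the host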

Gradually acquired networking and so on, but later.

In some classic designs, like some DEC PDPs, adding new hardware actually adds new instructions to the processor's instruction set.

Wildly variable and diverse architectures. Word lengths of 8, 9, 12, 18, 24, 32, 36 bits and others. Manufacturers often had multiple incompatible ranges, and maybe several different OSes per model range depending on the intended task, so a single vendor might offer dozens of totally different and incompatible OSes across half a dozen models.

Microcomputer: the simplest category. The entire processor is implemented on a single silicon chip, a microprocessor. Early machines were very small and simple, driven by 1 user at 1 terminal. No multitasking, no file or other resource sharing, no networking, no communications except typically 1 terminal and maybe a printer. Instead of 1 computer per department, 1 computer per person. Facilities were added by standardised expansion cards.

This is the era of standardisation and commoditisation. Due largely to microcomputers, things like the size of bytes and their encoding were fixed: 8 bits to a byte, ASCII coding, etc.
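Just to make that commodity concrete, a trivial Python snippet (purely illustrative) showing the assumptions that era baked in, which every modern language still exposes:

    # The conventions standardised in the micro era are visible in any modern language.
    text = "A"
    encoded = text.encode("ascii")
    print(len(encoded))   # 1  -- one 8-bit byte per ASCII character
    print(encoded[0])     # 65 -- the fixed ASCII code for "A"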

Micros gradually grew larger: 16-bit, then 32-bit, etc. In the early '80s they gained onboard ROM, typically holding a BASIC interpreter, plus on-board graphics and later sound. In the mid-'80s they went 16-bit, with multicolour graphics (256+ colours) and stereo sound. Lots of incompatible designs, but usually 1 OS per company, used for everything. All single-user boxes.

The micros outperformed most minis, and minis died out. Some minis gained a hi-res graphical display and turned into single-user deskside "workstations", keeping their multitasking OS, usually a UNIX by this point. Prices remained an order of magnitude higher than PCs', and the processors were proprietary, closely-guarded secrets, sometimes still implemented across multiple discrete chips. These gradually got integrated into single-chip devices, but they usually weren't very performance-competitive and were displaced by RISC processors built to run compiled C quickly.

In the '90s, generalising wildly, networking became common, and 32-bit designs became affordable. Most of the 16-bit machines died out and the industry standardised on MS Windows and classic MacOS. As internet connections became common in the late '90s, multitasking and a GUI were expected along with multimedia support.

Apple bought NeXT, abandoned its proprietary OS and switched to a UNIX.

Microsoft headhunted DEC's team from the cancelled MICA project, merged it with the Portable OS/2 project, and got them to finish OS/2 NT -- later Windows NT -- first on the N-Ten CPU, the Intel i860, a RISC chip; then on MIPS; and later on x86-32 and other CPUs. This was the first credible commercial microcomputer OS that could be both a client and a server, and it ultimately killed off all the proprietary dedicated-server OSes, of which the biggest was Novell Netware.

That is a vastly overlong answer but it's late and I'm just braindumping.

Mainframe: a big, tightly-clustered bunch of smart devices flying in close formation. Primary role: batch-driven, non-interactive computing; interactivity bolted on later.

Mini: departmental shared computer with dumb terminals and dumb peripherals, interactive and multitasking from the start. Mostly text-only, with interactive command-line interfaces -- "shells" -- and multiprogramming or multitasking OSes. Few had batch capabilities; no graphics, no direct I/O, maybe rare graphical terminals for niche uses. The origins of the systems that inspired CP/M, MS-DOS, VMS, UNIX, and NT.

Micro: single-chip CPU, single-user machines, often with graphics and sound early on. Later they gained GUIs, then later still networking, and evolved to be servers as well.

If the machine can control and be controlled by a screen and keyboard plugged straight into the CPU, it's a micro. If its CPU family has been a single chip from the start, it's a micro. If it boots into some kind of firmware OS loader, it's probably a micro. The lines between micros and UNIX workstations are a bit blurred.
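Purely as a summary, the same rules of thumb as a throwaway Python sketch; the flags are invented for illustration, and the real-world lines are, as noted, blurrier than this:

    # Throwaway sketch of the rules of thumb above. Invented flags, illustrative only.

    def looks_like_a_micro(drives_own_screen_and_keyboard,
                           cpu_single_chip_from_start,
                           boots_via_firmware_loader):
        """Any one of the hallmarks is a strong hint; the firmware one is the weakest."""
        return (drives_own_screen_and_keyboard
                or cpu_single_chip_from_start
                or boots_via_firmware_loader)

    if __name__ == "__main__":
        print(looks_like_a_micro(True, True, True))     # a classic PC: True on all counts
        print(looks_like_a_micro(False, False, False))  # a classic mini: False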
