[personal profile] liam_on_linux
Chris has been a friend of mine via the Ubuntu mailing list for the last couple of decades. He is bedbound now and tells me he's in his final weeks of life. He shared with me a piece he's written; I've lightly edited it before sharing it, and if he's feeling up to it, there is more he wants to say. We would welcome thoughts and comments on it.

Some thoughts on Computers

 

The basic design of computers hasn't changed much since the first mechanical one, the Difference Engine, designed by Charles Babbage in 1822 – but not built until 1991. Alan Turing invented computer science, and ENIAC, in 1945, was arguably the first electronic general-purpose digital computer. It filled a room. The Micral N of 1973 was the world's first "personal computer".

 

Since then, the basic design has changed little, other than to become smaller, faster, and on occasion, less useful.

 

The current trend to lighter, smaller gadget-style devices – cell phones, watches, headsets of various types, and other consumer toys – is an indication that the industry has fallen into the clutches of mainstream profiteering, with very little real innovation left at all.

 

I was recently looking for a new computer for my wife and headed into one of the main laptop suppliers, only to be met with row upon row of identical machines, at various price points arrived at by that mysterious breed known as "marketers". In fact, the only differences in the plastic on display were how much drive space the engineers had fitted in, and how much RAM they had. Was the case a pretty colour that appealed to the latest ten-year-old girl, or rugged enough for the he-man hoping to make the school whatever team? In other words: rows of blah.

 

Where was the excitement of the early Radio Shack "do-it-yourself" kits, the Sinclair ZX80, or the Commodore 8-bits – the PET and the VIC-20, later followed by the C64? What has happened to all the excitement and innovation? My answer is simple: the great big clobbering machine known as "Big Tech".

 

Intel released the 8080 processor in 1974 (its first 8-bit chip, the 8008, came in 1972) and later followed up with variations on a theme, eventually leading to the 80286, the 80386, the 80486 (getting useful), and so on. All of these variations needed an operating system, which was basically some flavour of MS-DOS – or, in IBM's branding, PC DOS. Games started to appear, and some of them were quite good. But the main driver of the computer was software.


In particular, word-processors and spreadsheets. 


At the time, my lost computer soul had found a niche in CP/M which, looking back, was a lovely little operating system – but it quietly disappeared into the badlands of marketing.


Lost and lonely I wandered the computerverse until I hooked up with Sanyo – itself now long gone the way of the velociraptor and other lost prehistoric species.
 

The Sanyo brought build quality, the so-called "Lotus card" to make it fully compatible with the IBM PC, and later an RGB colour monitor and a 10MB hard drive. The basic model was still two 5¼" floppy drives, which they pushed up to 720kB, and later the 3½" 1.25MB floppy drives. Ahead of its time, it too went the way of the dinosaur.


These led to the Sanyo AT-286, which became a mainstay, along with the Commodore 64. A pharmaceutical company had developed a software system for pharmacies that included stock control, ordering, and sales systems. I vaguely remember that the machine-and-software bundle cost about NZ$15,000, which was far too rich for most.


Then the computer landscape began to level out, as the component manufacturers settled on the IBM PC/AT as the compatible, open-market model of computer that met the Intel and DOS standards. Thus began the gradual slide into 100 versions of mediocrity.


Consumer demand was for bigger and more powerful machines, whereas the industry wanted to make more profit – a conflict to which computer scientists hardly seemed to give a thought.

I was reminded of Carl Jung's dictum that "greed would destroy the West."


A thousand firms sprang up, all selling the same little boxes, whilst the marketing voices kept trumpeting the bigger/better/greater theme… and the costs kept coming down, as businesses became able to afford these machines, and head offices began to control their outlying branches through the mighty computer. 


I headed overseas to escape the bedlam, and found a spot in New Guinea – only to be overrun by a mainframe in Australia that was going to run my branch: the branch for which I was responsible, but over which I had no control.


Which side of the fence was I going to land on? The question was soon answered by the Tropical Diseases Institute in Darwin, which diagnosed dengue fever… and so I returned to NZ.


For months I battled this recurring malady, until I was strong enough to attend a few hardware and programming courses at the local Polytechnic, eventually setting up my own small computer business, building up 386 machines for resale, followed by 486s, and eventually a Texas Instruments laptop agency.


These ran well enough but had little battery life: although they were rechargeable, they needed to be charged every two or three hours. At least the WiFi worked pretty consistently and, for the road warrior, gave a point of distinction.


[I think Chris is getting his time periods mixed up here. —Ed.]


Then the famous 686 arrived and, through various technologies, RAM began to climb to 256MB – and in some machines, 512MB.


Was innovation happening? No – just more marketing changes, as in: some machines came bundled with software, printers, or other peripherals such as modems.

As we ended the 20th century, we bought bigger and more powerful machines. The desktop was being chased by the laptop, until I stood before a long row of shiny boxes that were basically all the same, wondering which one my wife would like… knowing that it would have to connect to the so-called "internet", and in doing so make all sorts of decisions inevitable.


Eventually I chose a smaller Asus, with 16GB of main RAM and an Nvidia card, and, retreating to my cottage, collapsed in despair. Fifty years of computing and wasted innovation had left her with a black box that, when she opened it, said "HELLO" against a big blue background that promised the world – but only offered more of the same: a constant trickle of hackers, viruses, and Trojans, and barely anything useful – though now including a new perversion called a chat-bot, or "AI".


I retired to my room in defeat.

 

We have had incremental developments, until we have today's latest chips from Intel and AMD based on the 64-bit architecture first introduced around April 2003.

 

So where is the 128-bit architecture – or the 256 or the 512-bit?

 

What would happen if we got really innovative? I still remember the line attributed to Bill Gates – "640K ought to be enough for anybody" (he denies ever saying it, but it stuck). And yet it is common now to buy machines with 8 or 16 or 32GB of RAM, because poor-quality operating systems fill that memory with poorly-written garbage that causes memory leaks, stack-overflow errors, and other memory issues.
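
[To illustrate what he means – a minimal sketch in C, not code from any real operating system, just the classic shape of a leak: memory that is allocated, never freed, and piles up until the machine chokes. —Ed.]

    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* Allocate 1MB per pass and never free it. The process's
           footprint grows until malloc() finally fails, or the OS
           starts swapping and then kills the process. */
        for (;;) {
            char *block = malloc(1024 * 1024);
            if (block == NULL)
                break;                      /* out of memory at last */
            memset(block, 1, 1024 * 1024);  /* touch the pages so they
                                               are really committed */
        }
        return 0;
    }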

 

Then there is Unix – or, since the advent of Richard Stallman and Linus Torvalds, GNU/Linux: a solid, basic family of operating systems, from various vendors, that simply do what they are asked.

 

I wonder where all this could head if computer manufacturers climbed onboard and developed, for example, a laptop with an HDMI screen; a rugged case with a removable battery; a decent sound system; a good-quality keyboard, backlit with per-key colour selection; and enough RAM slots to boost the main memory up to, say, 256GB, and video RAM to 64GB, allowing high-speed draws to the screen output.

 

Throw away the useless touch pads – with the advent of Bluetooth mice, they are no longer needed. Instead, include an 8TB NVMe drive, plus a decent set of controllable fans and heatpipes that actually keep the internal temperatures down, so as not to stress the RAM and processors.


I am sure this could be done, given that some manufacturers, such as Tuxedo, are already showing some innovation in this area. 


Will it happen? I doubt it. The clobbering machine will strike again.



Friday September 20th 2024 

Date: 2024-09-21 12:31 am (UTC)
From: [personal profile] paserbyp
AI and quantum computers are next...

Date: 2024-09-21 03:49 am (UTC)
From: [personal profile] symbioid
The Sanyo bought build quality, the so-called "lotus card" to make it fully compatible with the IBM PC, and later, an RGB colour monitor and a 10 gig hard drive.

Did he mean 10 megs? Based on what I'm assuming is the age of the machine, I was thinking megs.
Minor edit/proof: "brought" build quality instead of "bought"?

"I wonder where all this could head, if computer manufacturers climbed onboard and developed, for example, a laptop with an HDMI screen, a rugged case with a removable battery, a decent sound system, with a good-quality keyboard, backlit with per-key colour selection. Enough RAM slots to boost the main memory up to say 256GB, and video RAM to 64GB, allowing high speed draws to the screen output."

Mention of "rugged" makes me think of those rugged laptops, and their heft, and now I'm picturing the boombox of laptops – an old-school 1980s boombox, not the shrunken-down wimpy CD players they call "boomboxes". Especially with a good sound system and a chunky keyboard.

Maybe a hybrid with the old "luggables" (the first portable computers before the laptop design, you know the kind).

Speaking of 8TB NVMe, I am really disappointed by the lack of high-capacity SSDs/NVMe drives these days. The cost is still prohibitively expensive (though we are in the 'after times' and it doesn't seem like things will get any more affordable).

One idea I had is something like a USB thumbdrive, a bit thicker, that would basically act like a mouse without being a full-blown mouse: it could plug into a socket/port for charging etc., but spring-loaded so it pops out when you use the computer, and you use it like a wireless mouse that simply plugs back in, without needing extra storage for a separate mouse. Though the ergonomics might suck (unless you could somehow build it as a collapsible frame, so that once you pull out the slightly-larger-than-a-thumbstick plug, you could unfold it into a more ergonomic shape, IDK).

It seems to me a lot of the issue in buying a computer these days is about looking for general quality and support, not so much the technical end of things. They're all the same beige boxes with minor differences in weight and keyboard build quality; it's all the same tech underneath, mostly, so what's left is: does this thing have durability, longevity, and battery life, and what sort of support is there if it breaks? Not particularly exciting stuff.

Then again, I think it comes down to diminishing returns.

Sometimes I think the OS is the most frustrating part: we're kinda locked into the standard WIMP (windows, icons, menus, pointer) paradigm from Xerox PARC, and we've carved ourselves into a niche (like QWERTY). I am interested in alternative OSes like Bluebottle/A2 from ETH Zurich, or Plan 9-style UIs.

At this point, like you say, we're even bogged down in "the phone" – and it seems to me "they" would just love it if these things turned into TVs with minimal actual keyboard interaction. Dumbed-down scrollmobiles.

"Creation" doesn't mean programming or even typing... it means "recording videos". I don't think it was a conscious effort per se, but the modern phone design (post iPhone/google android), combined with Twitter style UI (what I call "the stream", or I guess you can call it "feed" now... We don't think of the web as "pages" anymore, certainly hyperlinks are also a really dumbed down idea of what they could be; Ted Nelson was slightly right in that regards) plus "The Algorithm" deciding what you read. Finally "The cloud" - they want all the shit up there with their grubby hands on it, not local, not yours. And if you want it you have to ask for it back. Add in the AI bullshit now and "the cloud" gets all the free data you fed it.

It's like we went back to a mainframe oriented device with a feedback loop to us. The local autonomy and control the original PERSONAL computers promised seems to be slowly drifting back to a centralized domain.

I remember in the 90s when "livecasting" and "push" were becoming a big thing (about 1995-1997): Active Desktop, etc. At the time nobody wanted it; it was yet more dumbed-down broadcast "feeds", not interaction. But I guess they finally got what they wanted. And if you think about what Sun was pushing with JavaStations, this is really where they wanted us to get – it just took them a roundabout way to get here.

Sorry this isn't necessarily about the topic at hand by the end. Just some random thoughts on the evolution of tech away from the consumer.

Also sending peace for the weeks ahead and the journey into the beyond whatever it means for you.

Date: 2024-09-21 09:56 am (UTC)
From: [personal profile] history_monk
On 128-bit, 256-bit, and so on: there doesn't seem to be any need for such machines yet.

A bit of history: the first widespread architecture with 32-bit addressing was the IBM System/360, announced in 1964. That could address 4GB of RAM; the first models that could have a thousandth of that, 4MB, shipped in 1967.

By the early 1990s, 32-bit addressing was starting to be a limitation. The first processors with 64-bit addressing shipped in 1991-92, and x86 got there in 2003-04. That kind of architecture can address 16EB (exabytes) of RAM. A thousandth of that would be 16PB (petabytes), or 16384TB. It's now 20-30 years since 64-bit addressing was introduced, and nobody builds systems with memories remotely that big: single-figure TB is reasonably common in servers. HP built a 160TB machine in 2017, but it was a one-off, part of a project that didn't work out.
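
(The arithmetic is easy to check with a throwaway C snippet – nothing but the standard library, just powers of two:)

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* A 32-bit address reaches 2^32 bytes = 4 GiB. */
        uint64_t bytes32 = (uint64_t)1 << 32;
        printf("32-bit: %llu bytes = %llu GiB\n",
               (unsigned long long)bytes32,
               (unsigned long long)(bytes32 >> 30));

        /* A 64-bit address reaches 2^64 bytes = 16 EiB. 2^64 itself
           overflows a 64-bit integer, so express it in GiB instead:
           2^64 / 2^30 = 2^34 GiB. */
        printf("64-bit: %llu GiB = 16 EiB\n",
               (unsigned long long)((uint64_t)1 << 34));
        return 0;
    }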

Nobody needs a petabyte machine enough for it to be worth the cost. If someone invented a way to organise a computer to be much faster, or more resistant to breakdowns or security vulnerabilities, by having huge RAM, people would build them. But those inventions haven't happened yet.

RISC-V, the newest architecture with claims to be general-purpose, has reserved space in its instruction set for 128-bit addressing. However, nobody has seriously tried to design those instructions, because we need to learn practical lessons from petabyte machines before we design 128-bit ones (and those quantities don't have names in everyday use yet, because nobody uses them).
