
I can’t speak for anyone else but I can tell you why I did it.

I was broke, and I knew PCs and Macs and Mac OS X – I had run OS X 10.0, 10.1 and 10.2 on a PowerMac 7600 using XPostFacto.

I got the carcase of a Core 2 Extreme PC on my local Freecycle group in 2012.

https://twitter.com/lproven/status/257060672825851904

It had RAM, but no hard disks and no graphics card – the case, motherboard, CPU and PSU were all there.

I took the nVidia card and hard disks from my old Athlon XP machine and got the new box running, and thought a Hackintosh was worth a try, since it was mostly Intel: Intel chipset, Intel CPU, etc.

I joined some fora, did some reading, used Clover and some tools from TonyMacX86 and so on.

After two days’ work it booted. I got no sound from my SoundBlaster card, so I pulled it, turned the motherboard sound back on, and reinstalled.

It was a learning experience but it worked very well. I ran Snow Leopard on it, as it was old enough to get no new updates that would break my Hack, but new enough that all the modern browsers and things worked fine. (2012 was the year Mountain Lion came out, so I was 2 versions behind, which suited me fine – and it ran PowerPC apps, and I preferred the UI of the PowerPC version of MS Word, my only non-freeware app.)

I had 4 CPU cores, it was maxed out with 8GB RAM, and it was nice and quick. As it was a desktop, I disabled all support for sleep and hibernation: I turn my desktops off at night to save power. It drove a matched pair of 21” CRT monitors perfectly smoothly. I had an Apple Extended keyboard on an ADB-to-USB convertor since my PS/2 ports weren’t supported.
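
(For the curious: the standard OS X pmset tool controls all of this, on a Hack just as on a real Mac. A rough sketch – the exact values here are illustrative, not a record of what I actually typed:)

    # run as an admin; -a applies the setting to all power sources
    sudo pmset -a sleep 0          # 0 = never put the machine to sleep
    sudo pmset -a disksleep 0      # never spin the disks down
    sudo pmset -a hibernatemode 0  # don't write a hibernation image
    pmset -g                       # show the current settings to check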

It wasn’t totally reliable – occasionally it failed to boot, but a power cycle usually brought it back. It was fast and pretty stable, it ran all the OS X FOSS apps I usually used, it was much quicker than my various elderly PowerMacs and the hardware cost was essentially £0.

It was more pleasant to use than Linux – my other machines back then ran the still-somewhat-new Ubuntu, using GNOME 2 because Unity hadn’t gone mainstream yet.

Summary: why not? It worked, it gave me a very nice and perfectly usable desktop PC for next to no cost except some time, it was quite educational, and the machine served me well for years. I still have it in a basement. Sadly its main HDD is not readable any more.

It was fun, interesting, and the end result was very usable. At that time there was no way I could have afforded to buy an Intel Mac, but a few years, one emigration and 2 new jobs later, I did so: a 2011 i5 Mac mini which is now my TV-streaming box, but which I used as my main machine until 2017 when I bought a 27” Retina iMac from a friend.

Cost, curiosity, learning. All good reasons in my book.

This year I Hacked an old Dell Latitude E7270, a Core i7 machine maxed out with 16GB RAM – with Big Sur, because its Intel GPU isn't supported in the Monterey I tried at first. It works, but its wifi doesn't, so I needed to buy a USB wifi dongle. But performance wasn't great: it took an age to boot, with a lot of scary text going past, and it didn't feel like a smooth machine. So I pulled its SSD, put a smaller one in, put ChromeOS Flex on it, and it's now my wife's main computer. Fast, simple, totally reliable – and now I have a spare wifi dongle. :-/ I may try it on one of my old Thinkpads next.

It is much easier to Hackintosh a PC today than it was 10-12 years ago, but Apple is making the experience less rewarding, as is their right. They are a hardware company.

(Repurposed from a Lobsters comment.)

The story of why A/UX existed is simple but also strangely sad, IMHO.

Apple wanted to sell to the US military, which is a huge purchaser of computers. At that time, the US military had a policy that it would not purchase any computers which were not POSIX-compliant – i.e. they had to run some form of UNIX.

So, Apple did a UNIX for Macs. But Apple being what they are, they did it right – meaning they integrated MacOS into their Unix: it had a Mac GUI, making it by far the most visually appealing UNIX of its time, and it could network with MacOS machines and run (some) MacOS apps.

It was a superb piece of work, technically, but it was a box-ticking exercise: it allowed the military to buy Macs, but in fact, most of them ran MacOS and Mac apps.

For a while, the US Army hosted its web presence on classic MacOS. It wasn't super stable, but it was virtually unhackable: there was no shell to access remotely, however good your 'sploit. There was nothing there.

The irony, and the sad thing, is that A/UX never got ported to PowerPC. This was at least partly because of the way PowerPC MacOS was done: MacOS was still mostly 68K code, and the whole OS ran under a 68K emulator on top of a nanokernel. This would have made A/UX-style interoperability between a PowerPC-native A/UX and 68K MacOS basically impossible without entirely rewriting MacOS in PowerPC code.

But around the same time that the last release of A/UX came out (3.1.1, in 1995), Apple was frantically scrabbling around for a new, next-gen OS to compete with Win95. If A/UX had run on then-modern – i.e. PowerPC- and PCI-based – Macs by that time, it would have been an obvious candidate. But it didn't and it couldn't.

So Apple spent a lot of time flailing around with Copland and Gershwin and Taligent and OpenDoc, wasted a lot of money, and in the end merged with NeXT.

The irony is that in today's world, spoiled with excellent development tools, everyone has forgotten that late-1980s and early-to-mid 1990s dev tools were awful: 1970s text-mode tools for writing graphical apps.

Apple acquired NeXT because it needed an OS, but what clinched the deal was the development tools (and the return of Jobs, of course). NeXT had industry-leading dev tools. Doom was written on NeXTs. The WWW was written on NeXTs.

Apple had OS choices – modernise A/UX, or buy BeOS, or buy NeXT, or get bought and move to Solaris or something – but nobody else had Objective-C and Interface Builder, or the NeXT/Sun foundation classes, or anything like them.

The meta-irony being that if Apple had adapted A/UX, or failing that, had acquired Be for BeOS, Apple would be long dead by now, just a fading memory for middle-aged graphic designers. Without the dev tools, they'd never have got all the existing Mac developers on board, and never got all the cool new apps – no matter how snazzy the OS.

And we'd all be using Vista point 3 by now, and discussing how bad it was on Blackberries and clones...

Acorn pulled out of making desktop computers in 1998, when it cancelled the Risc PC 2, the Acorn Phoebe.

The hardware was complete, but the software wasn't. The OS was later finished by RISC OS Ltd and released as RISC OS 4, an upgrade for existing Acorn machines.

By that era, ARM had lost the desktop performance battle. If Acorn had switched to laptops by then, I think it could have remained competitive for some years longer -- 486-era PC laptops were pretty dreadful. But the Phoebe shows that what Acorn was actually trying to build was a next-generation powerful desktop workstation.

Tragically, I must concede that they were right to cancel it. If there had been a default version with two CPUs, upgradable to four, followed by 6- and 8-core models, it might have made it -- but RISC OS couldn't do that, and Acorn didn't have the resources to rewrite RISC OS so that it could. A dedicated Linux machine in 1998 would have been suicidal -- Linux didn't even have a FOSS desktop in those days. If you wanted a desktop Unix workstation, you still bought a Sun or the like.

(I wish I'd bought one of the ATX cases when they were on the market.)

A friend of mine who is a Commodore enthusiast commented that if the company had handled it better, the Amiga would have killed the Apple Mac off.

But I wonder. I mean, the $10K Lisa ('83) and the $2.5K Mac ('84) may only have been a year or two before the $1.3K Amiga 1000 ('85), but in those years, chip prices were plummeting -- maybe rapidly enough to account for the discrepancy.

The 256kB Amiga 1000 was half the price of the original 128kB Mac a year earlier.

Could Tramiel's Commodore have sold Macs at a profit for much less? I'm not sure. Later, yes, but by then Mac prices had fallen, and anyway, Apple has long been a premium-products-only sort of company. But the R&D process behind the Lisa & the Mac was long, complex & expensive. (Yes, true, there was long R&D behind the Amiga chipset, too, but less so behind the OS -- the original CAOS got axed, remember. The TRIPOS thing was a last-minute stand-in, as was Arthur/RISC OS on the Acorn Archimedes.)

The existence of the Amiga also pushed development of the Mac II, the first colour model. (Although I think it probably more directly prompted the Apple ][GS.)

It's much easier to copy something that someone else has already done. Without the precedent of the Lisa, the Mac would have been a much more limited 8-bit machine with a 6809. Without the precedent of the Mac, the Amiga would have been a games console.


I think the contrast between the Atari ST and the Sinclair QL, in terms of business decisions, product focus and so on, is more instructive.

The QL could have been one of the important second-generation home computers. It was launched a couple of weeks before the Mac.

But Sinclair went too far with its hallmark cost-cutting on the project, and the launch date was too ambitious. The result was a 16-bit machine that was barely more capable than an 8-bit one from the previous generation. Most of the later 8-bit machines had better graphics and sound; some (Memotech, Elan Enterprise) had as much RAM, and some (e.g. the SAM Coupé) also supported built-in mass storage.

But Sinclair's OS, QDOS, was impressive: an excellent BASIC, front & centre like an 8-bit machine's, but also full multitasking, and enough modularity to readily handle new peripherals -- though no GUI by default.

The Mac, similarly RAM-deprived and with even poorer graphics, blew it away. Also, with the Lisa and the Mac, Apple had spotted that the future lay in GUIs, which Sinclair had missed -- the QL didn't get its "pointer environment" until later, and when it did, it was primitive-looking. Even the modern version still is.



Atari, entering the game a year or so later, had a much better idea where to spend the money. The ST was an excellent demonstration of cost-cutting. Unlike the bespoke custom chipsets of the Mac and the Amiga, or Sinclair's manic focus on cheapness, Atari took off-the-shelf hardware and off-the-shelf software and assembled something that was good enough. A decent GUI, an OS that worked well in 512kB, graphics and sound that were good enough. Marginally faster CPU than an Amiga, and a floppy format interchangeable with PCs.
Yes, the Amiga was a better machine in almost every way, but the ST was good enough, and at first, significantly cheaper. Commodore had to cost-trim the Amiga to match, and the first result, the Amiga 500, was a good games machine but too compromised for much else.

The QL was built down to a price, and suffered for it. Later replacement motherboards and third-party clones such as the Thor fixed much of this, but it was no match for the GUI-based machines.

The Mac was in some ways a sort of cut-down Lisa, trying to get that ten-thousand-dollar machine down to a more affordable quarter of the price. Sadly, this meant losing the hard disk and the innovative multitasking OS, which were added back later in compromised form -- the latter cursed the classic MacOS until it was replaced with Mac OS X at the turn of the century.

The Amiga was a no-compromise games machine, later cleverly shoehorned into the role of a very capable multimedia GUI computer.

The ST was also built down to a price, but learned from the lessons of the Mac. Its spec wasn't as good as the Amiga's, and its OS wasn't as elegant as the Mac's, but it was good enough.

The result was that games developers aimed at both, limiting the quality of Amiga games to the capabilities of the ST. The Amiga wasn't differentiated enough -- yes, Commodore did high-end three-box versions, but the basic machines remained too low-spec. The third-generation Amiga 1200 had a faster 68020 chip which the OS didn't really utilise, and provision for a built-in hard disk -- but the disk itself was an optional extra. AmigaOS was a pain to use with only floppies, like the Mac -- whereas the ST's ROM-based OS was fairly usable with a single drive. A dual-floppy-drive Amiga was the minimum usable spec, really, and it benefited hugely from a hard disk -- but Commodore didn't fit one.

The ST killed the Amiga, in effect. Because it provided an experience that was nearly as good in the important, visible ways, Commodore had to price-cut the Amiga to keep it competitive, hobbling the lower-end models. And as games were written to be portable between the two without too much work, they mostly didn't exploit the Amiga's superior abilities.

Acorn went its own way with the Archimedes -- it shared almost no apps or games with the mainstream machines, and while its OS is still around, it hasn't kept up with the times and is mainly a curiosity. Acorn kept its machines a bit higher-end, having affordable three-box models with hard disks right from the start, and focused on the educational niche where it was strong.

But Acorn's decision to go its own way was entirely vindicated -- its ARM chip is now the world's best-selling CPU. Both Microsoft and Apple OSes run on ARMs now. In a way, it won.

The poor Sinclair QL, of course, failed in the market and Amstrad killed it off when it was still young. But even so, it inspired a whole line of successors -- the CST Thor, the ICL One-Per-Desk (AKA Merlin Tonto, AKA Telecom Australia ComputerPhone), the Qubbesoft Aurora replacement main board and later the Q40 and Q60 QL-compatible PC-style motherboards. It had the first ever multitasking OS for a home computer, QDOS, which evolved into SMSQ/E and moved over to the ST platform instead. It's now open source, too.

And Linus Torvalds owned a QL, giving him a taste for multitasking so that he wrote his own multitasking OS when he got a PC. That, of course, was Linux.

The Amiga OS is still limping along, now running on a CPU line -- PowerPC -- that is also all but dead. The open-source version, AROS, is working on an ARM port, which might make it slightly more relevant, but it's hard to see a future or purpose for the two PowerPC versions, MorphOS and AmigaOS 4.

The ST OS also evolved: into a rich multitasking app environment for PCs and Macs (MagiC), and into a multitasking FOSS version, AFROS, running on a PC emulator, Aranym. A great and very clever little project, but it went nowhere, as did PC GEM, sadly.

All of these clever OSes -- AROS, AFROS, QDOS AKA SMSQ/E -- went FOSS too late and are forgotten. Me, I'd love Raspberry Pi versions of any and all of them to play with!

In its final death throes, a flailing Atari even embraced the Transputer. The Atari ABAQ could run Perihelion's HELIOS, another interesting long-dead OS. Acorn's machines ran one of the most amazing OSes I've ever seen, TAOS, which nearly became the next-generation Amiga OS. That could have shaken up the industry -- it was truly radical.

And in a funny little side-note, the next next-gen Amiga OS after TAOS was to be QNX. It didn't happen, but QNX added a GUI and rich multimedia support to its embedded microkernel OS for the deal. That OS is now what powers my Blackberry Passport smartphone. Blackberry 10 is now all but dead -- Blackberry has conceded the inevitable and gone Android -- but BB10 is a beautiful piece of work, way better than its rivals.

But all the successful machines that sold well? The ST and Amiga lines are effectively dead. The Motorola 68K processor line they used is all but dead, too. So is its successor, PowerPC.

So it's the two niche machines that left the real legacy. In a way, Sinclair Research did have the right idea after all -- but prematurely. It thought that the justification for 16-bit home/business computers was multitasking. In the end, it was, but only in the later 32-bit era: the defining characteristic of the 16-bit era was bringing the GUI to the masses. True robust multitasking for all followed later. Sinclair picked the wrong feature to emphasise -- even though the QL post-dated the Apple Lisa, so the writing was there on the wall for all to see.

But in the end, the QL inspired Linux and the Archimedes gave us the ARM chip, the most successful RISC chip ever and the one that could still conceivably drive the last great CISC architecture, x86, into extinction.

Funny how things turn out.

[From a discussion about whether Macs are better than Windows PCs.]

They're a bit better in some ways. It's somewhat marginal now.

OK. Position statement up front.

Anyone who works in computers and only knows one platform is clueless. You need cross-platform knowledge and experience to actually be able to assess strengths, weaknesses, etc.

Most people in IT this century only know Windows and have only known Windows. This means that the majority of the IT trade are, by definition, clueless.

There is little real cross-platform experience any more, because so few platforms are left. Today, it's Windows NT or Unix, running on x86 or ARM. 2 families of OS, 2 families of processor. That is not diversity.

So, only olde phartes, yeah like me, who remember the 1970s and 1980s when diversity in computing meant something, have any really useful insight. But the snag with asking olde phartes is we're jaded & curmudgeonly & hate everything.

So, this being so...

The Mac's OS design is better and cleaner, but that's only to the extent of saying New York City's design is better and cleaner than London's. Neither is good, but one is marginally more logical and systematic than the other.

The desktop is much simpler and cleaner and prettier.

App installation and removal is easier and doesn't involve running untrusted binaries from 3rd parties, which is such a hallmark of Windows that Windows-only types think it is normal and natural and do not see it for the howling screaming horror abomination that it actually is. Indeed, put Windows types in front of Linux and they try to download and run binaries and whinge when it doesn't work. See comment about cluelessness above.

(One of the few places where Linux is genuinely ahead -- far ahead -- today is software installation and removal.)
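
(For instance, on a Debian- or Ubuntu-family system -- a minimal sketch; one signed, trusted repository handles the whole app lifecycle, with no hunting for binaries on random websites:)

    sudo apt update       # refresh the package index from the distro's signed repos
    sudo apt install vlc  # install an app plus all its dependencies
    sudo apt remove vlc   # cleanly uninstall it again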

Mac apps are fewer in number but higher in quality.

The Mac tradition of relative simplicity has been merged with the Unix philosophy of "no news is good news". Macs don't tell you when things work. They only warn you when things don't work. This is a huge conceptual difference from the VMS/Windows philosophy, and so, typically, this goes totally unnoticed by Windows types.
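
(The Unix half of that convention, in a nutshell -- any Unix-ish shell will do; the filenames are made up:)

    $ cp notes.txt backup.txt       # success: no output at all
    $ echo $?                       # the exit status is the only "news"
    0
    $ cp missing.txt backup.txt     # only failure makes it speak up
    cp: missing.txt: No such file or directory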

Go from a Mac to Windows and what you see is that Windows is constantly nagging you. Update this. Update that. Ooh you've plugged a device in. Ooh, you removed it. Hey it's back but on a different port, I need a new driver. Oh the network's gone. No hang on it's back. Hey, where's the printer? You have a printer! Did you know you have an HP printer? Would you like to buy HP ink?

Macs don't do this. Occasionally one coughs discreetly and asks if you know that something bad happened.

PC users are used to it and filter it out.

Also, PC OSes and apps are all licensed and copy-protected. Everything has to be verified and approved. Macs just trust you, mostly.

Both are reliable, mostly. Both just work now, mostly. Both rarely fail, try to recover fairly gracefully and don't throw cryptic blue-screens at you. That difference is gone.

But because of Windows' terrible design and the mistakes that the marketing lizards made the engineers put in, it's howlingly insecure, and vastly prone to malware. This is because it was implemented badly.

Windows apologists -- see cluelessness -- think it's fine and it's just because it dominates the market. This is because they are clueless and don't know how things should be done. Ignore them. They are loud; some will whine about this. They are wrong but not bright enough to know it. Ignore them.

You need antimalware on Windows. You don't on anything else. Antimalware makes computers slower. So, Windows is slower. Take a Windows PC, nuke it, put Linux on it and it feels a bit quicker.

Only a bit, 'cos Linux too is a vile mess of 1970s crap. If it still worked, you could put BeOS on it and discover -- holy shit, wow, lookit that -- this thing is really fsckin' fast and powerful; but no modern OS lets you feel it. It's buried under 5GB of layered legacy crap.

(Another good example was RISC OS. Today, millions of people are playing with Raspberry Pis, a really crappy underpowered £25 tiny computer that runs Linux very poorly. Raspberry Pis have ARM processors. The ARM processor's original native OS, RISC OS, still exists. Put RISC OS on a Raspberry Pi and suddenly it's a very fast, powerful, responsive computer. Swap the memory card for Linux and it crawls like a one-legged dog again. This is the difference between an efficient OS and an inefficient one. The snag is that RISC OS is horribly obsolete now so it's not much use, but it does demonstrate the efficiency of 1980s OSes compared to 1960s/1970s ones with a few decades of crap layered on top.)

Windows can be sort of all right, if you don't expect much, are savvy, careful and smart, and really need some proprietary apps.

If you just want the Interwebs and a bit of fun, it's a waste of time and effort, but Windows people think that there's nothing else (see clueless) and so it survives.

Meanwhile, people are buying smartphones and Chromebooks, which are good enough if you haven't drunk the Kool-Aid.

But really, they're all a bit shit, it's just that Windows is a bit shittier but 99% of computers run it and 99% of computer fettlers don't know anything else.

Once, before Windows NT, but after Unix killed the Real Computers, Unix was the only real game in town for serious workstation users.

Back then, a smart man wrote:

“I liken starting one’s computing career with Unix, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. But, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act.” — Ken Pier, Xerox PARC
That was 30y ago. Now, Windows is like that. Unix is the same but you have air-conditioning and some shots and all the Big Macs you can eat.

It's a horrid vile shitty mess, but basically there's no choice any more. You just get to choose the flavour of shit you will roll in. Some stink slightly less.

[A chap on a mailing list I'm on talked about being unable to find the "Shutdown" option on Windows 8, and how while he and a friend couldn't work out how to "use Twitter" in over half an hour, his mother worked it out in five minutes.]

I've fallen victim to the "trying to be too clever" PEBCAK error myself, a good few times.

(E.g. I spent ages trying to work out the command to tell my first Apple Newton to shut down. Eventually I consulted the manual. Press the on/off button, it said. I think I actually blushed.)

I tried to learn from it. I don't always win.

Shutdown options are like a "sleep" option on a notebook. You don't need one. Just close the lid.
