liam_on_linux: (Default)
I really hate it whenever I see someone calling Apple fans fanboys or attacking Apple products as useless junk that only sells because it's fashionable.

Every hater is 100% as ignorant and wrong as any fanatically-loyal fanboy who won't consider anything else.

Let me try to explain why it's toxic.

If someone, or some group, is not willing to make the effort to see why a very successful product, family, or brand is successful, then they cannot learn any lessons from that success. That means they are unlikely ever to challenge it.

In life it is always good to ask why. If this thing is so big, why? If people love it so much, why?

I use a cheap Chinese Android phone. It's my 3rd. I also have a cheap Chinese Android tablet that I almost never use. But last time I bought a phone, I had a Planet Computers Gemini on order, and I didn't want two new ChiPhones, so I bought a used iPhone. This was a calculated decision: the new iPhone models were out, and they dropped features I wanted, which meant the previous model was now quite cheap.

I still have that iPhone. It's a 6S+, the last model I'd want: the last one with a headphone socket and a physical home button. I like those. It's still updated, and last week I put the latest iOS on it.

It allowed me to judge the 2020s iOS ecosystem. It's good. Most of the things I disliked about iOS 6 (which my previous iPhone ran) have been fixed now. Most of the apps can be replaced or customised. It's much more open than it was. The performance is good, the form factor is good: way better than my iPhone 4 was.

I don't use iPhones because I value things like expansion slots, multiple SIMs, standard ports and standard charging cables, and a customisable OS. I don't really use tablets at all.

But my main home desktop computer is an iMac. I am an expert Windows user and maintainer with 35 years' experience of the platform. I am also a fairly expert Linux user and maintainer with 27 years' experience. I am a full-time Linux professional, and have been for nearly a decade, precisely because I am a long-term Windows expert: that is why I choose not to use Windows any more.

My iMac (2015 Retina 27") is the most gorgeous computer I've ever owned. It looks good, it's a joy to use, and it is near-silent and trouble-free to a degree that any Windows computer can only aspire to. I don't need expansion slots and so on: I want the vendor to make good choices, integrate them well, and for the machine to just work and keep on working. It does.

It is slim, unobtrusive for a large machine, silent, and the picture (and sound) quality is astounding.

I chose it because I have extensive knowledge of building, specifying, benchmarking, reviewing, fixing, supporting, networking, deploying, and recycling old PCs. It is over three decades of expert knowledge of PCs and Windows that led me to spend my own money on a Mac.

So every time someone calls Mac owners fanboys, I know they know less than me and therefore I feel entirely entitled to dump on their ignorance from a great height.

I do not use iDevices. I also do not use Apple laptops. I don't like their keyboards, I don't like their pointing devices, I don't like their hard-to-repair designs. I use old Thinkpads, like most experienced geeks.

But I know why people love them, and if you wish to pronounce edicts about Apple kit, you had better bloody well know your stuff.

I do not recommend them for everyone. Each person has their own needs and should learn and judge appropriately. But I also do not condemn them out of hand.

I have put in an awful lot of Windows boxes over the years. I have lost large potential jobs when I recommended Windows solutions to Mac houses, because it was the best tool for the job. I have also refused large jobs from people who wanted, say, Windows Server or Exchange Server when it *wasn't* the right tool for the job.

It was my job to assess this stuff.

Which equips me well to know that every single time someone decries Apple stuff, it means they haven't done the work I have. They don't know, and they can't be bothered to learn.
liam_on_linux: (Default)
The story of why A/UX existed is simple but also strangely sad, IMHO.

Apple wanted to sell to the US military, a huge purchaser of computers. At that time, the US military had a policy that it would not purchase any computers which were not POSIX-compliant – i.e. they had to run some form of UNIX.

So, Apple did a UNIX for Macs. But Apple being what they are, they did it right – meaning they integrated MacOS into their Unix: it had a Mac GUI, making it by far the most visually appealing UNIX of its time, and it could network with classic MacOS machines and run (some) MacOS apps.

It was a superb piece of work, technically, but it was a box-ticking exercise: it allowed the military to buy Macs, but in fact, most of them ran MacOS and Mac apps.

For a while, the US Army hosted its web presence on classic MacOS. It wasn't super-stable, but it was virtually unhackable: there was no shell to access remotely, however good your 'sploit. There was nothing there.

The irony and the sad thing is that A/UX never got ported to PowerPC. This is at least partly because of the way PowerPC MacOS was done: MacOS was still mostly 68K code and the whole OS ran under an emulator in a nanokernel running underneath it. This would have made A/UX-style interoperability, between a PowerPC-native A/UX and 68K-native MacOS, basically impossible without entirely rewriting MacOS in PowerPC code.

But around the same time that the last release of A/UX came out (3.1.1 in 1995), Apple was frantically scrabbling around for a new, next-gen OS to compete with Win95. If A/UX had run on then-modern – i.e. PowerPC- and PCI-based – Macs by that time, it would have been an obvious candidate. But it didn't and it couldn't.

So Apple spent a lot of time flailing around with Copland and Gershwin and Taligent and OpenDoc, wasted a lot of money, and in the end merged with NeXT.

The irony is that in today's world, spoiled as we are with excellent development tools, everyone has forgotten that late-1980s and early-to-mid-1990s dev tools were awful: 1970s text-mode tools for writing graphical apps.

Apple acquired NeXT because it needed an OS, but what clinched the deal was the development tools (and the return of Jobs, of course.) NeXT had industry-leading dev tools. Doom was written on NeXTs. The WWW was written on NeXTs.

Apple had OS choices – modernise A/UX, or buy BeOS, or buy NeXT, or get bought and move to Solaris or something – but nobody else had Objective-C and Interface Builder, or the NeXT/Sun foundation classes, or anything like them.

The meta-irony being that if Apple had adapted A/UX, or failing that, had acquired Be for BeOS, Apple would be long dead by now, just a fading memory for middle-aged graphic designers. Without the dev tools, they'd never have got all the existing Mac developers on board, and never got all the cool new apps – no matter how snazzy the OS.

And we'd all be using Vista point 3 by now, and discussing how bad it was on Blackberries and clones...
liam_on_linux: (Default)

I was a huge Archimedes fan and still have an A310, an A5000, a RiscPC and a RasPi running RISC OS.

But no, I have to disagree. RISC OS was a hastily-done rescue effort after Acorn PARC failed to make ARX work well enough. I helped to arrange a talk by the project lead on this, a few years ago.

RISC OS is a lovely little OS and a joy to use, but it's not very stable. It has no worthwhile memory protection, no virtual memory, no multi-processor support, and true pre-emptive multitasking is a sort of bolted-on extra (the Task Window). When someone tried to add pre-emption, it broke a lot of existing apps.

It was not some industry-changing work of excellence that would have disrupted everything. It was just barely good enough. Even after 33 years, it doesn't have wifi or bluetooth support, for instance, and although efforts are going on to add multi-processor support, it's a huge amount of work for little gain. There are a whole bunch of memory-size limits in RISC OS as it is -- apps using more than 512MB of RAM are very difficult, and require hackery.

IMHO what Acorn should have done is refocus on laptops for a while -- they could have made world-beating thin, light, long-life, passively-cooled laptops in the late 1990s. Meanwhile, they could have worked with Be on BeOS for a multiprocessor Risc PC 2. I elaborated on that here on this blog.

But RISC OS was already a limitation by 1996 when NT4 came out.

I've learned from Reddit that David Braben (author of Elite and the Archimedes' stunning "Lander" demo and Zarch game) offered to add enhancements to BBC BASIC to make it easier to write games. Acorn declined. Apparently, Sony was also interested in licensing the ARM and RISC OS for a games console -- probably the PS1 -- but Acorn declined. I had no idea. I thought the only 3rd party uses of RISC OS were NCs and STBs. Acorn's platform was, at the time, almost uniquely suitable for this -- a useful Internet client on a diskless machine.

The interesting question, perhaps, is the balance between pragmatic minimalism as opposed to wilful small-mindedness.

I really recommend the Chaos Computer Congress Ultimate Archimedes talk on this subject.

There's a bunch of stuff in the original ARM2/IOC/VIDC/MEMC design (e.g. no DMA, e.g. the 26-bit Program Counter register) that looks odd but reflects pragmatic decisions about simplicity and cost above all else... but a bit like the Amiga design, one year's inspired design decision may turn out, a few years later, to be a horrible millstone around the team's neck. Even the cacheless design, which was carefully tuned to the access speeds of mid-1980s fast-page-mode DRAM.
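To make the 26-bit PC oddity concrete: ARM2 kept the program counter, the processor mode and all the status flags packed into a single register, R15, which is what capped the address space at 64MB. Here's a minimal sketch in Python of how those fields fitted together (the field layout is the documented 26-bit ARM one; the code itself is purely illustrative, not anything of Acorn's):

```python
# ARM2 packed the entire visible CPU status into R15:
#   bits 0-1    processor mode (user / FIQ / IRQ / supervisor)
#   bits 2-25   program counter (word-aligned, so a 26-bit / 64MB address space)
#   bit 26      F (FIQ disable),  bit 27  I (IRQ disable)
#   bits 28-31  V, C, Z, N condition flags

def decode_r15(r15: int) -> dict:
    """Split an ARM2-style R15 value into its mode, PC and flag fields."""
    return {
        "mode":    r15 & 0b11,
        "pc":      r15 & 0x03FF_FFFC,   # bits 2-25: the 26-bit byte address
        "fiq_off": bool(r15 & (1 << 26)),
        "irq_off": bool(r15 & (1 << 27)),
        "V":       bool(r15 & (1 << 28)),
        "C":       bool(r15 & (1 << 29)),
        "Z":       bool(r15 & (1 << 30)),
        "N":       bool(r15 & (1 << 31)),
    }

print(decode_r15(0xE000_0008))  # N, Z and C set; PC = 0x8
```

One register held the complete visible CPU state, so preserving it across an interrupt meant saving a single register: exactly the kind of simplicity-and-cost decision described above, and exactly the kind that later became a millstone, since there were no bits left over to widen the PC.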

They achieved greatness by leaving a lot out -- but not just from some sense of conceptual purity. Acorn's Steve Furber said it best: "Acorn gave us two things that nobody else had. No people and no money."

Acorn implemented their new computer on four small, super-simple chips and a minimalist design, not because they wanted to, but because theirs was a design team of about a dozen people with almost no budget. They found elegant workarounds and came up with a clever design because that's all they could do.

I think it may not be a coincidence that a design based on COTS parts and components, assembled into an expensive, limited whole, eventually evolved into the backbone of the entire computer industry. It was poorly integrated, but that meant that parts could be removed and replaced without breaking the whole: the CPU, the display, the storage subsystems, the memory subsystem, and in the end the entire motherboard logic and expansion bus.

I refer, of course, to the IBM PC design. It was poor then, but now it's the state of the art. All the better-integrated designs with better CPUs are gone, all the tiny OSes with amazing performance and abilities in a tiny space are gone.

When someone added proper pre-emptive multitasking to RISC OS, it could no longer run most existing apps. If CBM had added 68030 memory management to AmigaOS, it would have broken inter-app communication.

Actually, the much-maligned Atari ST's TOS got further, with each module re-implemented by different teams in order to give it better display support, multitasking etc. while remaining compatible. TOS begat MiNT -- "MiNT Is Not TOS" -- which Atari then adopted as the basis of the official multitasking MultiTOS. TOS also became the proprietary MagiC OS-in-a-VM for Mac and PC, and later, volunteers integrated 3rd-party modules to create a fully GPL edition, AFROS.

But it doesn't take full advantage of later CPUs and so on -- partly because Atari itself never did.

Apple famously tried to improve MacOS into something with proper multitasking, nearly went bankrupt doing so, bought their co-founder's company NeXT, and ended up totally dumping their own OS, frameworks, APIs and tooling -- and most of the developers -- and switching to a UNIX.

Sony could doubtless have done wonderful stuff with RISC OS on a games console -- but note that the PlayStation 4 runs Orbis OS, which is based on FreeBSD 9, and none of Sony's improvements have made it back to FreeBSD.

Apple's macOS is also in part based on FreeBSD, and none of its improvements have made it back upstream either. macOS has a better init system, launchd, a networked metadata directory, NetInfo, and a fantastic PDF-based display server, Quartz, as well as some radical filesystem tech.
You won't find any of that in FreeBSD. It may have gained some driver stuff, but the PC version is the same ugly old UNIX OS.

If Acorn had made its BASIC into a games engine, that would have reduced its legitimacy in the sciences market. Gamers don't buy expensive kit; universities and laboratories do. Games consoles sell at a loss, like inkjet printers -- the makers earn their profit on the games or the ink cartridges. It's called the Gillette razor-and-blades model.
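To put rough numbers on that razor-and-blades point, here's a tiny sketch; the figures are entirely invented, purely to illustrate the mechanics:

```python
# Hypothetical razor-and-blades arithmetic -- the figures are invented,
# not real console economics.
hardware_loss_per_console = 50.0   # sold below manufacturing cost (assumed)
royalty_per_game = 8.0             # platform owner's cut per game sold (assumed)

# Break-even: how many games must the average owner buy before the
# platform owner recoups the subsidy on the hardware?
break_even = hardware_loss_per_console / royalty_per_game
print(f"Break-even after {break_even:.2f} games per console")
# Every game sold beyond that point is profit, which is why the console
# itself can be sold cheap -- and why a vendor selling to schools and
# labs rather than gamers has no such revenue stream to fall back on.
```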

As a keen user, it greatly saddened me when Acorn closed down its workstations division, but the OS was by then a huge handicap, and there simply wasn't an available replacement by then. As I noted in that blog post I linked to, they could have done attractive laptops, but it wouldn't have helped workstation sales, not back then.

The Phoebe, the cancelled Risc PC 2, had PCI and dual-processor support. Acorn could have sold SMP PCs far cheaper than any x86 vendor, for most of whom the CPU was the single most expensive component. But it wasn't an option, because RISC OS couldn't use two CPUs, and still can't. If they'd licensed BeOS, and maybe saved Be, who knows -- a decade as the world's leading vendor of inexpensive multiprocessor workstations doesn't sound so bad. The resultant machines would have been very nice, but they wouldn't have been Risc PCs, because they wouldn't run Archimedes apps, and in 1998 the overheads of running RISC OS in a VM would have been prohibitive. Apple made that approach work, but some five years later, when it was normal for a desktop Mac to come with 128MB or 256MB of RAM and a few gigs of disk, so it was doable to load a 32-64MB VM with another few hundred megs of legacy OS in it. That was rather less true in 1997 or 1998, when a high-end PC had 32 or 64MB of RAM, a gig of disk, and could only take a single CPU running at a couple of hundred megahertz.

I reckon Acorn and Be could have done it -- BeOS was tiny and fast, RISC OS was positively minute and blisteringly fast -- but whether they could have done it in time to save them both is much more doubtful.

I'd love to have seen it. I think there was a niche there. I'm a huge admirer of Neal Stephenson, and his seminal essay In the Beginning Was the Command Line is essential reading. It dissects some of the reasons Unix is the way it is, and accurately depicts Linux as the marvel it was around the turn of the century. He lauds BeOS, and rightly so. Few ever saw it, but it was breathtaking at the time.

Amiga fans loved their machine, not only for its graphics and sound, but multitasking too. This rather cheesy 1987 video does show why...


Just a couple of years later, the Archimedes did pretty much all that and more, and it did it with raw CPU grunt, not fancy chips. There are reasons its OS is still alive and still in use. Now it runs on a mass-market £25 computer. AmigaOS is still around too, but all the old apps only run under emulation, and it runs on niche kit that costs 5-10x more than a PC of comparable spec.

A decade later, PCs had taken over and were stale and boring. Sluggish and unresponsive despite their immense power. Acorn computers weren't, but x86 PCs were by then significantly more powerful, had true preemptive multitasking, built-in networking and WWW capabilities and so on. But no pizazz. They chugged. They were boring office kit, and they felt like it.

But take a vanilla PC and put BeOS on it, and suddenly, it booted in seconds, ran dozens of apps with ease without flicker or hesitation, played back multiple video streams while rendering them onto OpenGL 3D solids. And, like the Archimedes did a decade before, all in software, without hardware acceleration. All the Amiga's "wow factor" long after we'd given up ever seeing it again.

And this at a time when Linux hadn't even got a free desktop GUI yet, required hand-tuning thousands of lines of config files like OS/2 at its worst, and had no productivity apps.

But would this have been enough to keep Acorn and Be going until mass-market multi-core x86 chips came along and stomped them? Honestly, I really doubt it. If Apple had bought Be, it would have got a lovely next-gen OS, but it wouldn't have got Steve Jobs, and it wouldn't have been able to tempt classic MacOS devs to the new OS with amazing next-gen dev tools. I reckon it would have died not long after.

If Acorn and Be had done a deal, or merged or whatever, would there have been enough appeal in the cheapest dual-processor RISC workstation in the industry, with amazing media abilities? (Presumably followed, soon after, by quad-CPU and even 6- or 8-CPU boxes.)

I hate to admit it, but I really doubt it.
liam_on_linux: (Default)
So in a thread on CIX, someone was saying that the Sinclair computers were irritating and annoying, cut down too far, cheap and slow and unreliable.

That sort of comment still kinda burns after all these decades.

I was a Sinclair owner. I loved my Spectrums, spent a lot of time and money on them, and still have 2 working ones today.

Yes, they had their faults, but for all those who sneered and snarked at their cheapness and perceived nastiness, *that was their selling point*.

They were working, usable, useful home computers that were affordable.

They were transformative machines, transforming people, lives, economies.

I had a Spectrum not because I massively wanted a Spectrum -- I would have rather had a BBC Micro, for instance -- but because I could afford a Spectrum. Well, my parents could, just barely. A used one.

My 2nd, 3rd and 4th ones were used, as well, because I could just about afford them.

If all that had been available were proper, serious, real computers -- Apples, Acorns, even early Commodores -- I might never have got one. My entire career would never have happened.

A BBC Micro was pushing £350. My used 48K Spectrum was £80.

One of those was doable for what parents probably worried was a kid's toy that might never be used for anything productive. The other was the cost of a car.
liam_on_linux: (Default)
A friend of mine recently claimed that "Both the iPhone and iPod were copied from other manufacturers, to a large extent."

This is a risible claim, AFAICS.

There were pocket MP3 jukeboxes before the iPod. I still own one. They were fairly tragic efforts.

There were smartphones before the iPhone. I still have at least one of them, too. Again, really tragic from a human-computer interaction point of view.


AIUI, the iPhone originated internally as a shrunk-down tablet. The tablet originated from a personal comment from Bill Gates to Steve Jobs: tablets were a great idea, but people simply didn’t want them, because Microsoft had made them and they didn’t sell.
Jobs’ response was that the Microsoft ones didn’t sell because they were no good, not because people didn’t want tablets. In particular, Jobs stated that using a stylus was a bad idea. (This is also a pointer to why he cancelled the Newton. And guess what? I've got one of them, too.)

Gates, naturally, contested this, and Jobs started an internal project to prove him wrong: a stylus-free finger-operated slim light tablet. However, when it was getting to prototype form, he allegedly realised, with remarkable prescience, that the market wasn’t ready yet, and that people needed a first step — a smaller, lighter, simpler, pocketable device, based on the finger-operated tablet.

Looking for a role or function for such a device, the company came up with the idea of a smartphone.

Smartphones certainly existed, but they were a geek toy, nothing more.

Apple was bold enough to make a move that would kill its most profitable line — the iPod — with a new product. Few would be so bold.

I can’t think of any other company that would have been bold enough to invent the iPhone. We might have got to devices as capable as modern smartphones and tablets, but I suspect they’d have still been festooned in buttons and a lot clumsier to use.

It’s the GUI story again. Xerox sponsored the invention and original development but didn’t know WTF to do with it. Contrary to the popular history, it did productise it, but as a vastly expensive specialist tool. It took Apple to make it the standard method of HCI, and it took Apple two goes and many years. The Lisa was still too fancy and expensive, and the original Mac too cut-down and too small and compromised.

The many rivals’ efforts were, in hindsight, almost embarrassingly bad. IBM’s TopView was a pioneering windowing environment and it was rubbish. Windows 1 and 2 were rubbish. OS/2 1.x was rubbish, and to be honest, OS/2 2.x was the pre-iPhone smartphone of GUI OSes: very capable, but horribly complex and fiddly.

Actually, arguably — and demonstrably, from the Atari ST market — DR GEM was a far better GUI than Windows 1 or 2. GEM was a rip-off of the Mac; the PC version got sued and crippled as a result, so blatant was it. It took MS over a decade to learn from the Mac (and GEM) and produce the first version of Windows with a GUI good enough to rival the Mac’s, while being different enough not to get sued: Windows 95.

Now, 2 decades later, everyone’s GUI borrows from Win95. Linux is still struggling to move on from Win95-like desktops, and even Mac OS X, based on a product which inspired Win95, borrows some elements from the Win95 GUI.

Everyone copies MS, and MS copies Apple. Apple takes bleeding-edge tech and turns geek toys into products that the masses actually want to buy.

Microsoft’s success is founded on the IBM PC, and that was IBM’s response to the Apple ][.

Apple has been doing this consistently for about 40 years. It often takes it 2 or 3 goes, but it gets there.

  • First time: 8-bit home micros (the Apple ][, an improved version of a DIY kit.)

  • Second time: GUIs (first the Lisa, then the Mac).

  • Third time: USB (on the iMac, arguably the first general-purpose PC designed and sold for Internet access as its primary function).

  • Fourth time: digital music players (the iPod wasn’t even the first with a hard disk).

  • Fifth time: desktop Unix (OS X, based on NeXTstep).

  • Sixth time: smartphones (based on what became the iPad, remember).

  • Seventh time: tablets (the iPad, actually progenitor of the iPhone rather than the other way round).

Yes, there are too many Mac fans, and they’re often under-informed. But there are also far too many Microsoft apologists, and too many Linux ones, too.

I use an Apple desktop, partly because with a desktop, I can choose my own keyboard and pointing device. I hate modern Apple ones.

I don’t use Apple laptops or phones. I’ve owned multiple examples of both. I prefer the rivals.

My whole career has been largely propelled by Microsoft products. I still use some, although my laptops run Linux, which I much prefer.

I am not a fanboy of any of them, but sadly, anyone who expresses fondness or admiration for anything Apple will inevitably be branded as one by the anti-Apple fanboys, whose ardent advocacy is just as strong and just as irrational.

As will this.
liam_on_linux: (Default)
A friend of mine who is a Commodore enthusiast commented that if the company had handled it better, the Amiga would have killed the Apple Mac off.

But I wonder. I mean, the $10K Lisa ('83) and the $2.5K Mac ('84) may only have been a year or two before the $1.3K Amiga 1000 ('85), but in those years, chip prices were plummeting -- maybe rapidly enough to account for the discrepancy.

The 256kB Amiga 1000 was half the price of the original 128kB Mac a year earlier.

Could Tramiel's Commodore have sold Macs at a profit for much less? I'm not sure. Later, yes, but then, Mac prices fell, and anyway, Apple has long been a premium-products-only sort of company. But the R&D process behind the Lisa & the Mac was long, complex & expensive. (Yes, true, it was behind the Amiga chipset, too, but less so on the OS -- the original CAOS got axed, remember. The TRIPOS thing was a last-minute stand-in, as was Arthur/RISC OS on the Acorn Archimedes.)

The existence of the Amiga also pushed development of the Mac II, the first colour model. (Although I think it probably more directly prompted the Apple ][GS.)

It's much easier to copy something that someone else has already done. Without the precedent of the Lisa, the Mac would have been a much more limited 8-bit machine with a 6809. Without the precedent of the Mac, the Amiga would have been a games console.


I think the contrast between the Atari ST and the Sinclair QL, in terms of business decisions, product focus and so on, is more instructive.
The QL could have been one of the important 2nd-generation home computers. It was launched a couple of weeks before the Mac.

But Sinclair went too far with its hallmark cost-cutting on the project, and the launch date was too ambitious. The result was a 16-bit machine that was barely more capable than an 8-bit one from the previous generation. Most of the later 8-bit machines had better graphics and sound; some (Memotech, Elan Enterprise) had as much RAM, and some (e.g. the SAM Coupé) also supported built-in mass storage.

But Sinclair's OS, QDOS, was impressive: an excellent BASIC, front and centre like an 8-bit machine's, but also full multitasking, and modularity, so it readily handled new peripherals -- but no GUI by default.

The Mac, similarly RAM-deprived and with even poorer graphics, blew it away. Also, with the Lisa and the Mac, Apple had spotted that the future lay in GUIs, which Sinclair had missed -- the QL didn't get its "pointer environment" until later, and when it did, it was primitive-looking. Even the modern version still is.



Atari, entering the game a year or so later, had a much better idea of where to spend the money. The ST was an excellent demonstration of cost-cutting. Unlike the bespoke custom chipsets of the Mac and the Amiga, or Sinclair's manic focus on cheapness, Atari took off-the-shelf hardware and off-the-shelf software and assembled something that was good enough. A decent GUI, an OS that worked well in 512kB, graphics and sound that were good enough, a marginally faster CPU than the Amiga's, and a floppy format interchangeable with PCs.

Yes, the Amiga was a better machine in almost every way, but the ST was good enough, and at first, significantly cheaper. Commodore had to cost-trim the Amiga to match, and the first result, the Amiga 500, was a good games machine but too compromised for much else.

The QL was built down to a price, and suffered for it. Later replacement motherboards and third-party clones such as the Thor fixed much of this, but it was no match for the GUI-based machines.

The Mac was in some ways a sort of cut-down Lisa, trying to get that ten-thousand-dollar machine down to a more affordable quarter of the price. Sadly, this meant losing the hard disk and the innovative multitasking OS, which were added back later in compromised form -- the latter cursed the classic MacOS until it was replaced with Mac OS X at the turn of the century.

The Amiga was a no-compromise games machine, later cleverly shoehorned into the role of a very capable multimedia GUI computer.

The ST was also built down to a price, but learned from the lessons of the Mac. Its spec wasn't as good as the Amiga's, its OS wasn't as elegant as the Mac's, but it was good enough.

The result was that games developers aimed at both, limiting the quality of Amiga games to the capabilities of the ST. The Amiga wasn't differentiated enough -- yes, Commodore did high-end three-box versions, but the basic machines remained too low-spec. The third-generation Amiga 1200 had a faster 68020 chip, which the OS didn't really utilise, and provision for a built-in hard disk, which remained an optional extra. AmigaOS was a pain to use with only floppies, like the Mac -- whereas the ST's ROM-based OS was fairly usable with a single drive. A dual-floppy-drive Amiga was the minimum usable spec, really, and it benefited hugely from a hard disk -- but Commodore didn't fit one.

The ST killed the Amiga, in effect. By providing an experience that was nearly as good in the important, visible ways, it forced Commodore to price-cut the Amiga to keep it competitive, hobbling the lower-end models. And as games were written to be portable between the two without too much work, they mostly didn't exploit the Amiga's superior abilities.

Acorn went its own way with the Archimedes -- it shared almost no apps or games with the mainstream machines, and while its OS is still around, it hasn't kept up with the times and is mainly a curiosity. Acorn kept its machines a bit higher-end, having affordable three-box models with hard disks right from the start, and focused on the educational niche where it was strong.

But Acorn's decision to go its own way was entirely vindicated -- its ARM chip is now the world's best-selling CPU. Both Microsoft and Apple OSes run on ARMs now. In a way, it won.

The poor Sinclair QL, of course, failed in the market and Amstrad killed it off when it was still young. But even so, it inspired a whole line of successors -- the CST Thor, the ICL One-Per-Desk (AKA Merlin Tonto, AKA Telecom Australia ComputerPhone), the Qubbesoft Aurora replacement main board and later the Q40 and Q60 QL-compatible PC-style motherboards. It had the first ever multitasking OS for a home computer, QDOS, which evolved into SMSQ/e and moved over to the ST platform instead. It's now open source, too.

And Linus Torvalds owned a QL, giving him a taste for multitasking so that he wrote his own multitasking OS when he got a PC. That, of course, was Linux.

The Amiga OS is still limping along, now running on a CPU line -- PowerPC -- that is also all but dead. The open-source version, AROS, is working on an ARM port, which might make it slightly more relevant, but it's hard to see a future or purpose for the two PowerPC versions, MorphOS and AmigaOS 4.

The ST OS also evolved: into a rich multitasking app environment for PCs and Macs (MagiC), and into a rich multitasking FOSS version, AFROS, running on a PC emulator, ARAnyM. A great and very clever little project, but one which went nowhere, as did PC GEM, sadly.

All of these clever OSes -- AROS, AFROS, QDOS AKA SMSQ/E -- went FOSS too late and are forgotten. Me, I'd love Raspberry Pi versions of any and all of them to play with!

In its final death throes, a flailing Atari even embraced the Transputer. The Atari ABAQ could run Perihelion's HeliOS, another interesting long-dead OS. Acorn's machines ran one of the most amazing OSes I've ever seen, TAOS, which nearly became the next-generation Amiga OS. That could have shaken up the industry -- it was truly radical.

And in a funny little side-note, the next next-gen Amiga OS after TAOS was to be QNX. It didn't happen, but QNX added a GUI and rich multimedia support to its embedded microkernel OS for the deal. That OS is now what powers my BlackBerry Passport smartphone. BlackBerry 10 is now all but dead -- BlackBerry has conceded the inevitable and gone Android -- but BB10 is a beautiful piece of work, way better than its rivals.

But all the successful machines that sold well? The ST and Amiga lines are effectively dead. The Motorola 68K processor line they used is all but dead, too. So is its successor, PowerPC.

So it's the two niche machines that left the real legacy. In a way, Sinclair Research did have the right idea after all -- but prematurely. It thought that the justification for 16-bit home/business computers was multitasking. In the end, it was, but only in the later 32-bit era: the defining characteristic of the 16-bit era was bringing the GUI to the masses. True robust multitasking for all followed later. Sinclair picked the wrong feature to emphasise -- even though the QL post-dated the Apple Lisa, so the writing was there on the wall for all to see.

But in the end, the QL inspired Linux and the Archimedes gave us the ARM chip, the most successful RISC chip ever and the one that could still conceivably drive the last great CISC architecture, x86, into extinction.

Funny how things turn out.
liam_on_linux: (Default)
(Repurposed CIX post.)

Don’t get me wrong. I like Apple kit. I am typing right now on an original 1990 Apple Extended II keyboard, attached via an ADB-USB convertor to a Core i5 Mac mini from 2011, running Mac OS X 10.10. It’s a very pleasant computer to work on.

But, to give an example of the issues — I also have an iPhone. It’s my spare smartphone with my old UK SIM in it.

But it’s an iPhone 4. Not a lot of RAM, an underclocked CPU, and of course not upgradable.

So I’ve kept it on iOS 6, because I already find it annoyingly slow, and iOS 7 would reportedly cause a 15-25% or greater slowdown. And that’s the latest version it will run.

Which means that [a] I can’t use lots of iPhone apps as they no longer support iOS 6.x and [b] it doesn’t do any of the cool integration with my Mac, because my Mac needs a phone running iOS 8 to do clever CTI stuff.

My old 3GS I upgraded from iOS 4 to 5 to 6, and regretted it. It got slower & slower and Apple being Apple, *you can’t go back*.

Apple kit is computers simplified for non-computery people. Stuff you take for granted with COTS PC kit just can’t be done. Not everything — since the G3 era, they take ordinary generic RAM, hard disks, optical drives, etc. Graphics cards etc. can often be made to work; you can, with work, replace CPUs and run OSes too modern to be supported.

But it takes work. If you don’t want that, if you just max out the RAM, put a big disk in and live with it, then it’s fine. I’m old enough that I want a main computer that Just Works and gives me no grief and the Mac is all that and it cost me under £150, used. The OS is of course freeware and so are almost all the apps I run — mostly FOSS.

I like FOSS software. I use Firefox, Adium, Thunderbird, LibreOffice, Calibre, VirtualBox and BOINC. I also have some closed-source freeware like Chrome, Dropbox, TextWrangler and Skype. I don’t use Apple’s browser, email client, chat client, text editor, productivity apps or anything. More or less only iTunes, really.

What this means is that I can use pretty much the same suite of apps on Linux, Mac and Windows, making switching between them seamless and painless. My main phone runs Android, my travelling laptop is a 2nd-hand Thinkpad with the latest Ubuntu LTS on it.

As such, many of the benefits of an all-Apple solution are not available to me — texting and making phone calls from the desktop, seamless handover of file editing from desktop to laptop to tablet, wireless transparent media sync between computers and phone, etc.

I choose not to use any of this stuff because I don’t trust closed file formats and dislike vendor lock-in.

Additionally, I don’t like Apple’s modern keyboards and trackpads, and I like portable devices where I can change the battery or upgrade the storage. So I don’t use Apple laptops and phones and don’t own a tablet. iPads are just big iPhones and I don’t like iPhones much anyway. The apps are too constrained, I hate typing on a touchscreen “keyboard” and I don’t like reading book-length texts from a brightly-glowing screen — I have a large-screen (A4) Kindle for ebooks. (Used off eBay, natch.) TBH I’d quite like a backlight on it but the big-screen model doesn’t offer one.

But I don’t get any of that with Ubuntu. I never used Ubuntu One; I don’t buy digital content at all, from anyone; my Apple account is around 20 years old and has no payment method set up on it. I have no lock-in to Apple, and Ubuntu doesn’t try to foist any on me.

With Ubuntu, *I* choose the laptop and I can (and did) build my own desktops, or more often, use salvaged freebies. My choice of keyboard and mouse, etc. I mean, sure, the Retina iMac is lovely, but it costs more than I’m willing to spend on a computer.

Android is… all right. It’s flakey but it’s cheap, customisable (I’ve replaced web browser, keyboard, launcher and email app, something Apple does not readily permit without drastic limitations) and it works well enough.

But it’s got bloatware, tons of vendor-specific extensions and it’s not quick.

Ubuntu is sleek as Linuxes go. I like the desktop. I turn off the web ads and choose my own default apps and it’s perfectly happy to let me. I can remove the built-in ones if I want and it doesn’t break anything.

If I could get a phone that ran Ubuntu, I’d be very interested. And it might tempt me into buying a tablet.

I’ve tried all the leading Linuxes (and most of the minor ones) and so long as you’re happy with its desktop, Ubuntu is the best by a country mile. It’s the most polished, best-integrated, it works well out of the box. I more or less trust them, as much as I trust any software vendor.

The Ubuntu touch offerings look good — the UI works well, the apps look promising, and they have a very good case for the same apps working well on phone and tablet, and the tablet becoming a usable desktop if you just plug a mouse in.

Here’s a rather nice little 3min demo:
https://www.youtube.com/watch?v=c3PUYoa1c9M

Wireless mouse turned on: desktop mode, windows, title bars, menus, etc.
Turn it off, mid-session: it’s a tablet, with touch controls. *With all the same apps and docs still open.*
Mouse back on: it’s in desktop mode again.

And there’s integration — e.g. phone apps run full-size in a sidebar on a tablet screen, visible side-by-side with tablet apps.

Microsoft doesn’t have this, Apple doesn’t, Google doesn’t.

It looks promising, it runs on COTS hardware and it’s FOSS. What’s not to like?

I suspect, when the whole plan comes together, that they will have a compelling desktop OS, a compelling phone OS and a compelling tablet OS, all working very well together but without any lock-in. That sounds good to me and far preferable to shelling out thousands on new kit to achieve the same on Apple’s platform. Because C21 Apple is all about selling you hardware — new, and regularly replaced, too — and then selling you digital content to consume on it.

Ubuntu isn’t. Ubuntu’s original mission was to bring Linux up to the levels of ease and polish of commercial OSes.

It’s done that.

Sadly, the world failed to beat a path to its door. It’s the leading Linux and it’s expanded the Linux market a little, but Apple beat it to market with a Unix that is easier, prettier and friendlier than Windows — and if you’re willing to pay for it, Apple makes nicer hardware too.

But now we’re hurtling into the post-desktop era. Apple is leading the way; Steve Jobs finally proved his point that he knew how to make a tablet that people wanted and Bill Gates didn’t. Gates’ company still doesn’t, even when it tries to embrace and extend the iPad type of device: millions of the original Surface tablets are destined for landfill, like the Atari E.T. game and the Apple Lisa. (N.B. *not* the totally different Surface Pro, which people use as a lightweight laptop.)

But Apple isn’t trying to make its touch devices replace desktops and laptops — it wants to sell both.

Ubuntu doesn’t sell hardware at all. So it’s trying to drag proper all-FOSS Linux kicking and screaming into the twenty-twenties: touch-driven *and* driven by desk-bound hardware I/O, equally happy on ARM or x86-64, very shiny but still FOSS underneath.

The other big Linux vendors don’t even understand what it’s trying to do. SUSE does Linux servers for Microsoft shops; Red Hat sells millions of support contracts for VMs in expensive private clouds. Both are happy doing what they’re doing.

Whereas Shuttleworth is spending his millions trying to bring FOSS to the masses.

OK, what Elon Musk is doing is much much cooler, but Shuttleworth’s efforts are not trivial.
