liam_on_linux: (Default)
A friend of mine who is a Commodore enthusiast commented that if the company had handled it better, the Amiga would have killed the Apple Mac off.

But I wonder. I mean, the $10K Lisa ('83) and the $2.5K Mac ('84) may only have been a year or two before the $1.3K Amiga 1000 ('85), but in those years, chip prices were plummeting -- maybe rapidly enough to account for the discrepancy.

The 256kB Amiga 1000 was half the price of the original 128kB Mac a year earlier.

Could Tramiel's Commodore have sold Macs at a profit for much less? I'm not sure. Later, yes, but then, Mac prices fell, and anyway, Apple has long been a premium-products-only sort of company. But the R&D process behind the Lisa & the Mac was long, complex & expensive. (Yes, true, it was behind the Amiga chipset, too, but less so on the OS -- the original CAOS got axed, remember. The TRIPOS thing was a last-minute stand-in, as was Arthur/RISC OS on the Acorn Archimedes.)

The existence of the Amiga also pushed development of the Mac II, the first colour model. (Although I think it probably more directly prompted the Apple ][GS.)

It's much easier to copy something that someone else has already done. Without the precedent of the Lisa, the Mac would have been a much more limited 8-bit machine with a 6809. Without the precedent of the Mac, the Amiga would have been a games console.


I think the contrast between the Atari ST and the Sinclair QL, in terms of business decisions, product focus and so on, is more instructive.
The QL could have been one of the important 2nd-generation home computers. It was launched a couple of weeks before the Mac.
But Sinclair went too far with its hallmark cost-cutting on the project, and the launch date was too ambitious. The result was a 16-bit machine that was barely more capable than an 8-bit one from the previous generation. Most of the later 8-bit machines had better graphics and sound; some (Memotech, Elan Enterprise) as much RAM, and some (e.g. the SAM Coupé) also supported built-in mass storage.
But Sinclair's OS, QDOS, was impressive: an excellent BASIC, front & centre as on an 8-bit machine, plus full multitasking and enough modularity to handle new peripherals readily -- but no GUI by default.
The Mac, similarly RAM-deprived and with even poorer graphics, blew it away. Also, with the Lisa and the Mac, Apple had spotted that the future lay in GUIs, which Sinclair had missed -- the QL didn't get its "pointer environment" until later, and when it did, it was primitive-looking. Even the modern version still looks primitive.



Atari, entering the game a year or so later, had a much better idea where to spend the money. The ST was an excellent demonstration of cost-cutting. Unlike the bespoke custom chipsets of the Mac and the Amiga, or Sinclair's manic focus on cheapness, Atari took off-the-shelf hardware and off-the-shelf software and assembled something that was good enough. A decent GUI, an OS that worked well in 512kB, graphics and sound that were good enough. Marginally faster CPU than an Amiga, and a floppy format interchangeable with PCs.
Yes, the Amiga was a better machine in almost every way, but the ST was good enough, and at first, significantly cheaper. Commodore had to cost-trim the Amiga to match, and the first result, the Amiga 500, was a good games machine but too compromised for much else.

The QL was built down to a price, and suffered for it. Later replacement motherboards and third-party clones such as the Thor fixed much of this, but it was no match for the GUI-based machines.

The Mac was in some ways a sort of cut-down Lisa, trying to get that ten-thousand-dollar machine down to a more affordable quarter of the price. Sadly, this meant losing the hard disk and the innovative multitasking OS, which were added back later in compromised form -- the latter cursed the classic MacOS until it was replaced with Mac OS X at the turn of the century.

The Amiga was a no-compromise games machine, later cleverly shoehorned into the role of a very capable multimedia GUI computer.

The ST was also built down to a price, but learned from the lessons of the Mac. Its spec wasn't as good as the Amiga's, its OS wasn't as elegant as the Mac's, but it was good enough.

The result was that games developers aimed at both, limiting the quality of Amiga games to the capabilities of the ST. The Amiga wasn't differentiated enough -- yes, Commodore did high-end three-box versions, but the basic machines remained too low-spec. The third-generation Amiga 1200 had a faster 68020 chip which the OS didn't really utilise, and provision for a built-in hard disk which remained an optional extra. AmigaOS was a pain to use with only floppies, like the Mac -- whereas the ST's ROM-based OS was fairly usable with a single drive. A dual-floppy-drive Amiga was the minimum usable spec, really, and it benefited hugely from a hard disk -- but Commodore didn't fit one.

The ST killed the Amiga, in effect. Because the ST provided an experience that was nearly as good in the important, visible ways, Commodore had to price-cut the Amiga to keep it competitive, hobbling the lower-end models. And as games were written to be portable between them both without too much work, they mostly didn't exploit the Amiga's superior abilities.

Acorn went its own way with the Archimedes -- it shared almost no apps or games with the mainstream machines, and while its OS is still around, it hasn't kept up with the times and is mainly a curiosity. Acorn kept its machines a bit higher-end, having affordable three-box models with hard disks right from the start, and focused on the educational niche where it was strong.

But Acorn's decision to go its own way was entirely vindicated -- its ARM chip is now the world's best-selling CPU. Both Microsoft and Apple OSes run on ARMs now. In a way, it won.

The poor Sinclair QL, of course, failed in the market and Amstrad killed it off when it was still young. But even so, it inspired a whole line of successors -- the CST Thor, the ICL One-Per-Desk (AKA Merlin Tonto, AKA Telecom Australia ComputerPhone), the Qubbesoft Aurora replacement main board and later the Q40 and Q60 QL-compatible PC-style motherboards. It had the first ever multitasking OS for a home computer, QDOS, which evolved into SMSQ/e and moved over to the ST platform instead. It's now open source, too.

And Linus Torvalds owned a QL, giving him a taste for multitasking so that he wrote his own multitasking OS when he got a PC. That, of course, was Linux.

The Amiga OS is still limping along, now running on a CPU line -- PowerPC -- that is also all but dead. The open-source version, AROS, is working on an ARM port, which might make it slightly more relevant, but it's hard to see a future or purpose for the two PowerPC versions, MorphOS and AmigaOS 4.

The ST OS also evolved, into a rich multitasking app environment for PCs and Macs (MagiC) and into a rich multitasking FOSS version, AFROS, running on an emulator on the PC, Aranym. A great and very clever little project, but one that went nowhere, as, sadly, did PC GEM.

All of these clever OSes -- AROS, AFROS, QDOS AKA SMSQ/E -- went FOSS too late and are forgotten. Me, I'd love Raspberry Pi versions of any and all of them to play with!

In its final death throes, a flailing Atari even embraced the Transputer. The Atari ABAQ could run Perihelion's HELIOS, another interesting long-dead OS. Acorn's machines ran one of the most amazing OSes I've ever seen, TAOS, which nearly became the next-generation Amiga OS. That could have shaken up the industry -- it was truly radical.

And in a funny little side-note, the next next-gen Amiga OS after TAOS was to be QNX. It didn't happen, but QNX added a GUI and rich multimedia support to its embedded microkernel OS for the deal. That OS is now what powers my Blackberry Passport smartphone. Blackberry 10 is now all but dead -- Blackberry has conceded the inevitable and gone Android -- but BB10 is a beautiful piece of work, way better than its rivals.

But all the successful machines that sold well? The ST and Amiga lines are effectively dead. The Motorola 68K processor line they used is all but dead, too. So is its successor, PowerPC.

So it's the two niche machines that left the real legacy. In a way, Sinclair Research did have the right idea after all -- but prematurely. It thought that the justification for 16-bit home/business computers was multitasking. In the end, it was, but only in the later 32-bit era: the defining characteristic of the 16-bit era was bringing the GUI to the masses. True robust multitasking for all followed later. Sinclair picked the wrong feature to emphasise -- even though the QL post-dated the Apple Lisa, so the writing was there on the wall for all to see.

But in the end, the QL inspired Linux and the Archimedes gave us the ARM chip, the most successful RISC chip ever and the one that could still conceivably drive the last great CISC architecture, x86, into extinction.

Funny how things turn out.
liam_on_linux: (Default)
I recently received an email from a reader -- a rare event in itself -- following my recent Reg article about educational OSes.

They asked for more info about the OS. So, since there's not a lot of this about, here is some more info about the Oberon programming language, the Oberon operating system written in it, and the modern GUI version, Bluebottle.

It is the final act in the life's work of Professor Niklaus Wirth, inventor of Pascal and later Modula-2. Oberon is what Pascal evolved into; probably, he should have called them all Pascal:

  1. Pascal 1 (i.e. Pascal & Delphi)

  2. Modula

  3. Modula-2 (basis of the original Acorn Archimedes OS, among others)

  4. Oberon

IgnoreTheCode has a good overview. This is perhaps the best place to start for a high-level quick read.

The homepage for the FPGA OberonStation went down for a while. Perhaps it was the interest driven by my article. ;-)

It is back up again now, though.

Perhaps the seminal academic paper is Oberon - the Overlooked Jewel by Michael Franz of the University of California at Irvine.

A PDF is here: https://pdfs.semanticscholar.org/d48b/ecdaf5c3d962e2778f804e...

This is essential reading to understand its relevance in computer science.

There are 2 software projects called "Oberon", a programming language and an operating system, or family of OSes, written in the language.

There's some basic info on Wikipedia about both the OS and the programming language.

Professor Wirth worked at ETH Zurich, which has a microsite about the Oberon project. However, this has many broken links and is unmaintained.

And the Oberon Book, the official bible of the project, is online.

Development did not stop on the OS after Prof Wirth retired. It continued and became AOS, which has a rather different type of GUI called a Zooming UI. The AOS zooming UI is called "Bluebottle", and newer versions of the OS are thus referred to as "A2" or "Bluebottle" (or both, since "AOS" is already a widely-used name).

There is a sort of fan page dedicated to A2/Bluebottle.

Here's the OS project on GitHub.

There is a native port for x86 PCs. I have this running under VirtualBox, as an app under 64-bit Linux, and natively on the metal of a Thinkpad X200.
liam_on_linux: (Default)
So a regular long-term member of one of the Ubuntu lists is saying that they don't trust Google to respect their privacy. This from someone who runs Opera 12 (on Ubuntu with Unity) because they had not noticed that it hadn't been updated... for three years.

I realise that I could have put this better, but...

As is my wont, I offered one of my favourite quotes:

Scott McNealy, CEO and co-founder of Sun Microsystems, said it best.

He was put on a panel on internet security and privacy, about 20y ago.

Eventually, they asked the silent McNealy to say something.

He replied:

"You have no privacy on the Internet. Get over it."

He was right then and he's right now. It's a public place. It's what it's for. Communication, sharing. Deal with it.

Run current software, follow best-practice guidelines from the likes of SwiftOnSecurity on Twitter, but don't be obsessive about it, because it is totally pointless.

You CANNOT keep everything you do private and secure and also use the 21st century's greatest communications tool.

So you choose. Use the Internet, and stop panicking, or get off it and stay off it.

Your choice.

Modern OSes and apps do "phone home" about what you're doing, yes, sure.

This does not make them spyware.

http://www.zdnet.com/article/revealed-the-crucial-detail-that-windows-10-privacy-critics-are-missing/?tag=nl.e539&s_cid=e539&ttag=e539&ftag=TRE17cfd61

You want better software? You want things that are more reliable, more helpful, more informative?

Yes?

Then stop complaining and get on with life.

No? You want something secure, private, that you can trust, that you know will not report anything to anyone?

Then go flash some open-source firmware onto an old Thinkpad and run OpenBSD on it.

There are ways of doing this, but they are hard, they are a lot more work, and you will have a significantly degraded experience with a lot of very handy facilities lost.

That is the price of privacy.

And, listen, I am sorry if this is not what you want to hear, but if you are not technically literate enough to notice that you're running a browser that has been out of date for 3 years, then I think that you are not currently capable of running a really secure environment. I am not being gratuitously rude here! I am merely pointing out facts that others will be too nervous to mention.

You cannot run a mass-market OS like Windows 10, Mac OS X or Ubuntu with Unity and have a totally secure private computer.

You can't. End of. It's over. These are not privacy-oriented platforms.

They do exist. Look at OpenBSD. Look at Qubes OS.

But they are hard work and need immense technical skill -- more than I have, for instance, after doing this stuff for a living for nearly 30y. And even then, you get a much poorer experience, like a faster 1980s computer or something.

As it is, after being on my CIX address for 25 years and my Gmail address for 12, all my email goes through Gmail now -- the old address, the Hotmail and Yahoo spamtraps, all of them. I get all my email, contacts and diary, all in one place, on my Mac and on both my Linux laptops and on both my Android and Blackberry smartphones. It's wonderful. Convenient, friendly, powerful, free, cross-platform and based on FOSS and compatible with FOSS tools.

But it means I must trust Google to store everything.

I am willing to pay that price, for such powerful tools for no money.

I am a trained Microsoft Exchange admin. I could do similar with Office 365, but I've used it, and it's less cross-platform, it's less reliable, it's slower, the native client tools are vastly inferior and it costs money.

Nothing much else could do this unless I hosted my own, which I am technically competent to do but would involve a huge amount of work, spending money and still trusting my hosting provider.

You have a simple choice. Power and convenience and ease, or, learning a lot more tech skills and privacy but also inconvenience, loss of flexibility and capability and simplicity.

You run a closed-source commercial browser on what [another poster] correctly points out is the least-private Linux distro that there is.

You have already made the choice.

So please, stop complaining about it. You chose. You are free to change your mind, but if you do, off to OpenBSD you go. Better start learning shell script and building from source.
liam_on_linux: (Default)
Both are niche today. Conceded, yes, but… and it’s a big “but”…

It depends on 2 things: how you look at it, & possible changes in circumstances.

Linux *on the desktop* is niche, sure. But that’s because of the kind of desktop/laptop usage roles techies see.

In other niches:

http://www.cnbc.com/2015/12/03/googles-chromebooks-make-up-half-of-us-classroom-devices.html

The URL explains the main story: 51% of American classroom computers are Chromebooks now. That’s a lot, and that’s 100% Linux.

And it’s happened quite quickly (in under 3y), without noise or fuss, without anyone paying a lot of attention. That’s how these changes often happen: under the radar, unnoticed, until suddenly you wake up & it's all different.

In servers, it utterly dominates. On pocket smart devices, it utterly dominates.

But look at conventional adults’ desktops and laptops, no, it’s nowhere, it’s niche.

So, for now, on the road as private vehicles, e-cars are a small niche, yes.

But, in some role we’re not thinking about — public transport, or taxis, or something other than private cars — they might quietly gain the edge and take over without us noticing, as Chromebooks are doing in some niches.

The result, of course, is that they’re suddenly “legitimised” — there’s widespread knowledge, support, tooling, whatever and suddenly changes in some other niche mean that they’re a lot more viable for private cars.

For years, I ran the fastest computer I could afford. Often that was for very little money, because in the UK I was poor for a long time. I built and fixed and bodged. My last box was a honking big quad-core with 8GB of RAM (from Freecycle) with a dual-head 3D card (a friend’s cast-off) and lots of extras.

Then I sold, gave or threw away or boxed up most of my stuff, came over here, and had money but less space and less need to bodge. So I bought a friend’s old Mac mini. I’m typing on it now, on a 25y old Apple keyboard via a converter.

It’s tiny, silent except when running video or doing SETI, and being a Mac takes no setup or maintenance. So much less work than my Hackintosh was.

Things change, and suddenly an inconceivable solution is the sensible or obvious one. I don’t game much — very occasional bit of Portal - so I don’t need a GPU. I don’t need massive speed so a Core i5 is plenty. I don’t need removable media any more, or upgradability, or expandability.

Currently, people buy cars like my monster Hackintosh: used, cheap, but big, spacious, powerful, with lots of space in ‘em, equally capable of going to the shops or taking them to the other end of the country — or a few countries away. Why? Well that’s because most cars are just like that. It’s normal. It doesn’t cost anything significant.

But in PCs, that’s going away. People seem to like laptops and NUCs and net-tops and Chromebooks and so on: tiny, no expansion slots, often no optical media, not even ExpressCard slots or the like any more — which were standard a decade or 2 ago. With fast external serial buses, we don’t need them any more.

Big bulky PCs are being replaced by small, quiet, almost-unexpandable ones. Apple is as ever ahead of the trade: it doesn’t offer any machines with expansion slots at all any more. You get notebooks, iMacs, Mac minis or the slotless, built-around-its-cooling Mac Pro, incapable of even housing a spinning hard disk.

Why? When they’re this bloody fast anyway, only hobbyist dabblers change CPUs or GPUs. Everyone else uses it ’till it dies then replaces it.

Cars may well follow. Most only do urban cycle motoring: work, shops, occasional trip to the seaside or something. Contemporary electric cars do that fine and they’re vastly cheaper to run. And many don’t need ‘em daily so use car clubs such as Zipcar etc.

Perhaps the occasional longer trips will be taken up by some kind of cheap rentals, or pooling, or something unforeseen.

But it’s a profound error of thinking to write them off as being not ready yet, or lacking infrastructure, or not viable. They are, right now, and they are creeping in.

We are not so very far from the decline and fall of Windows and the PC. It might not happen, but with Mac OS X and Chromebooks and smarter tablets and convertibles and so on, the enemies are closing in. Not at the gate yet, but camped all around.

Electric vehicles aren’t quite there yet but they’re closer than the comments in this thread — entirely typically for the CIX community — seem to think.
liam_on_linux: (Default)
Not to blow my own horn, but, er, well, OK, yes I am. Tootle-toot.

I'm back writing for the Register again, in case you hadn't noticed. A piece a month for the last 2 months.

You might enjoy these:

Old, not obsolete: IBM takes Linux mainframes back to the future
Your KVMs, give them to me
http://www.theregister.co.uk/2015/11/02/ibm_linux_mainframes/

From Zero to hero: Why mini 'puter Oberon should grab Pi's crown
It's more kid-friendly... No really
http://www.theregister.co.uk/2015/12/02/pi_versus_oberton/
liam_on_linux: (Default)
As far as I can recall, I didn't plug this at the time, as this series of 5 articles for the Register was later collated into my first-ever book - a Kindle ebook of the same title: http://bit.ly/trabhov

However, that was about 4 years ago now and as it's one of the few times in my tech career that I have accurately predicted a future technology trend -- i.e., containers -- I think it's time.

You can buy the ebook here if you would like to support my work -- no, I don't get royalties, but it will endear me to the Reg:

http://www.theregister.co.uk/2011/12/19/short_history_of_virtualisation/

Or, if you're a cheapskate and just want to read the content for free, then here are the component articles:

http://www.theregister.co.uk/Print/2011/07/11/a_brief_history_of_virtualisation_part_one/

http://www.theregister.co.uk/Print/2011/07/14/brief_history_of_virtualisation_part_2/

http://www.theregister.co.uk/Print/2011/07/18/brief_history_of_virtualisation_part_3/

http://www.theregister.co.uk/Print/2011/07/21/brief_history_of_virtualisation_part_4/

http://www.theregister.co.uk/Print/2011/07/25/brief_history_of_virtualisation_part_5/

(Part 3 is the one about containers)

Enjoy. Buy a copy for all your friends -- it's the ideal holiday gift!
liam_on_linux: (Default)
(The title is a parody of http://www.dreamsongs.com/WIB.html )

Even today, people still rail against the horrors of BASIC, as per Edsger Dijkstra's famous comment about it brain-damaging beginner programmers beyond any hope of redemption:

https://reprog.wordpress.com/2010/03/09/where-dijkstra-went-wrong-the-value-of-basic-as-a-first-programming-language/

I rather feel that this is due to perceptions of some of the really crap early 8-bit BASICs, and wouldn't have applied if students learned, say, BBC BASIC or one of the other better dialects.

For example, Commodore's pathetically-limited BASIC as supplied on the most successful home computer ever, the Commodore 64, in 1982. Despite its horrors, it's remembered fondly by many. There's even a modern FOSS re-implementation of it!

https://github.com/mist64/cbmbasic

I've long been puzzled as to exactly why the Commodore 64 shipped with such a terrible, limited, primitive BASIC in its ROM: CBM BASIC 2.0, essentially the 6502 version of Microsoft's MS-BASIC. It wasn't done for space reasons -- the original Microsoft BASIC fitted into 4kB of ROM and a later version into 8kB:

http://www.emsps.com/oldtools/msbasv.htm

Acorn's BBC BASIC (first released a year earlier, in 1981) was a vastly better dialect.

AFAIK all the ROMable versions of BBC BASIC (BASIC I to BASIC 4.62) fitted into a 16kB ROM, so in terms of space, it was doable.

http://mdfs.net/Software/BBCBasic/Versions

IOW, CBM had enough room; the C64 kernal+BASIC were essentially those of the original PET, and fitted into an 8kB ROM, I think. And the C64 shipped after the B and P series machines, the CBM-II. OK, CBM BASIC 4 wasn’t much of an improvement, but it was better.

Looking back years later, and reading stuff like Cameron Kaiser’s “Secret Weapons of Commodore” site:

http://www.floodgap.com/retrobits/ckb/secret/

… it seems to me that Commodore management never really had much of an idea of what they were doing. Unlike companies such as Sinclair or Acorn, labouring for years over tiny numbers of finely-honed models, in the 8-bit era Commodore had multiple teams designing dozens of models of all sorts of kit, often conflicting with one another, and just occasionally chose to ship certain products and kill others — sometimes early in development, sometimes when a machine was nearly ready and the packaging was being designed.

(Apple was similar, but at a smaller scale — e.g. the Apple /// competing with the later Apple ][ machines, and the Mac competing with the Lisa, and then the Apple ][GS competing with the Mac.)

There were lovely devices that might have thrived, such as the C65, which were killed.

There were weird, mostly inexplicable hacked-together things, such as the C128, a bastard of a C64, plus a slightly-upgraded C64, plus, of all things, a CP/M micro based around an entirely different and totally incompatible processor, so the C128 had two: a 6502 derivative and a Z80. Bizarre.

There were determined efforts to enhance product lines whose times were past, such as the CBM-II machines, an enhanced PET when the IBM PC was already taking over.

There were odd half-assed efforts to fix problems with released products, such as the C16 and Plus-4, which clearly showed that management didn’t understand their own successes: the C64 was a wildly-successful upgrade of the popular VIC-20, but rather than learn from that and do it again, Commodore did something totally different and incompatible, launched it with some fanfare, and appeared mystified when it bombed.

It’s a very strange story of a very schizophrenic company.

And of course, rather than develop their own successor for the 16-bit era, they bought it in — the Lorraine, later the Amiga, a spiritual successor to the Atari 8-bit machines, which themselves were inspired kit for their time.

This left Atari in the lurch, but the company responded in an inspired way with the ST: a clever mixture of off-the-shelf parts -- PC-type where that was good enough (e.g. the graphics controller), or from the previous generation of 8-bits (e.g. the sound chip) -- plus a bought-in, adapted OS (Digital Research's GEMDOS plus GEM, never crippled like the PC version was by Apple's lawsuit), which also meant PC disk formats and file compatibility. And of course there was the brilliant inclusion of MIDI ports, foreseeing an entire industry that was just around the corner.

The ST is what the Sinclair QL should have been: a cheap, affordable, usable 16-bit computer. Whereas the poor doomed QL was Sinclair doing its trademark thing too far: a 16-bit machine cut down to the point that it was no better than a decent 8-bit machine.

Interesting times.

Whereas now, almost all the diversity is gone. Today, we just have generic x86 boxes and occasional weird little ARM things, and apart from some research or hobbyist toys, just 2 OS families -- Windows NT or some flavour of Unix.
liam_on_linux: (Default)
There are moves afoot to implement desktop apps inside containers on Linux -- e.g.

https://wiki.gnome.org/Projects/SandboxedApps/Sandbox

This is connected with the current uptake of Docker. There seems to be a lot of misunderstanding about Docker, exemplified by a mailing list post I just read which proposes running different apps in different user accounts instead and accessing them via VNC. This is an adaptation of my reply.

Corrections welcomed!

Docker is a kind of standardized container for Linux.

Containers are a sort of virtual machine.

Current VMs are PC emulators for the PC: they virtualise the PC's hardware, so you can run multiple OSes at once on one PC.

This is useful if you want to run, say, 3 different Linux distros, Windows and Solaris on the same machine at once.

If you run lots of copies of the same OS, it is very inefficient, as you duplicate lots of code.

Containers virtualise the OS instead of the computer. 1 OS instance, 1 kernel, but to the apps running on that OS, each app has its own OS. Apps cannot see other apps at all. The virtualisation means that each app thinks it is running standalone on the OS, with nothing else installed.

This means that you can, say, run 200 instances of Apache on 1 instance of Linux, and they are all isolated. If one crashes, the others don't. You can mix versions, have custom modules in one that the others don't have, etc.

All without the overhead of running 200 copies of the OS.
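To make that concrete, here is a minimal sketch of my own (not part of the GNOME sandboxing work or the list post), using the third-party Python "docker" SDK to start a few isolated Apache instances on one kernel. It assumes Docker and the SDK are installed; the image tag, container names and port numbers are just illustrative.

    import docker

    # Talk to the local Docker daemon (assumes Docker is installed and running).
    client = docker.from_env()

    # Start three independent Apache containers from the same image.
    # Each one sees only its own filesystem and processes, but all of them
    # share the single host kernel -- no three extra copies of the OS.
    apaches = []
    for i in range(3):
        c = client.containers.run(
            "httpd:2.4",                  # could just as easily be mixed versions
            detach=True,
            name="apache-%d" % i,         # hypothetical names
            ports={"80/tcp": 8080 + i},   # each instance gets its own host port
        )
        apaches.append(c)

    # Stopping one leaves the others untouched -- the isolation described above.
    apaches[0].stop()
    apaches[0].remove()

Afterwards, client.containers.list() (or docker ps from a terminal) shows the survivors still happily serving.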

Containerising apps is a security measure. It means that if, say, you have a compromised version of LibreOffice that contains an exploit allowing an attacker to get root, they get root in the container, and as far as they can see, the copy of LibreOffice is the only thing on the computer. No browser, no email, no stored passwords, nothing.

All within 1 user account, so that this can be done for multiple users, side-by-side, even concurrently on a multiuser host.

It is nothing to do with user accounts; these are irrelevant to it.

Gobo's approach to bundling apps mainly just brings benefits to the user: an easier-to-understand filesystem hierarchy, and apps that are self-contained not spread out all over the filesystem. Nice, but not a killer advantage. There's no big technical advantage and it breaks lots of things, which is why Gobo needs the gobohide kernel extension and so on. It's also why Gobo has not really caught on.

But now, containers are becoming popular on servers. It's relatively easy to isolate server apps: they have no GUI and often don't interact much with other apps on the server.

Desktop apps are much harder to containerise. However, containerising them brings lots of other advantages -- it could effectively eliminate the differences between Linux distributions, forever ending the APT-vs-RPM wars by making the packaging irrelevant, while delivering much improved security, granularity, simplicity and more.

In theory all Gobo's benefits at the app level (the OS underneath is the same old mess) plus many more.

It looks like it might be something that will happen. It will have some side-effects -- reducing the ease of interapp communication, for instance. It might break sound mixing, or inter-app copy-and-paste, system browser/email/calendar integration and some other things.

And systems will need a lot more hard disk space.

But possibly worth it overall.

One snag at present is that current efforts look to require btrfs, and btrfs is neither mature nor popular at the moment. This might mean that we get new filesystems with the features such sandboxing would need -- maybe there'll be a new ext5 FS, or maybe Bcachefs will fit the bill. It's early days, but the promise looks good.
liam_on_linux: (Default)
I was just prodded by someone when suggesting that some friends try Linux. I forgot to mention that you can try it without risking your existing PC setup. It prompted me to write this...

I forget that non-techies don't _know_ stuff like that.

Download a program called VirtualBox. It's free and it lets you run a whole other operating system - e.g. Linux - under Windows as a program. So you can try it out without affecting your real computer.

https://www.virtualbox.org/

If all you know is Windows, I'd suggest Linux Mint: http://www.linuxmint.com/

It has a desktop that looks and works similarly to Windows' classic pre-Win8 look & feel.

Google for the steps, but here are the basic instructions:

[1] Download and install VirtualBox

[2] Then download the VirtualBox Extension Pack from the same site. Double-click the file to install it into VBox. (They have to distribute it separately for licensing reasons.)

[3] Download Mint. It comes as an ISO file, an image of a DVD.

[4] Make a new VM in VBox. Give it 2-3 gig of RAM. Enable display 3D acceleration in the settings. (Remember, anything you don't know how to do, Google it.) Leave all the other settings as they are.

[5] Start your new VM. It will ask for an ISO file. Point it at the ISO file of Mint you downloaded.

[6] It will boot and run. Install it onto the virtual hard disk inside Vbox. Just accept all the defaults.

[7] Reboot your new Mint VM.

[8] Install the VBox Guest Additions. On the VBox Devices menu, choose “Insert Guest Additions CD image”. Google for instructions on how to install them.

[9] When it’s finished, reboot the VM.

[10] Update your new copy of Linux Mint. (Remember, Google for instructions.)

That’s it. Play with it. See if you can do the stuff you normally do on Windows all right. If you can’t, Google for what program to use and how to install it. It’s not as quick as a real PC but it works.
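(For anyone who is comfortable with a terminal, here is a rough Python sketch of steps [4] and [5] done by calling VirtualBox's own VBoxManage command-line tool instead of the GUI. The VM name, memory, disk size, OS type and ISO filename are just example values — substitute the Mint ISO you actually downloaded. The point-and-click route above does exactly the same thing.)

    import subprocess

    VM = "Mint"
    ISO = "linuxmint.iso"   # the ISO you downloaded in step [3]

    def vbox(*args):
        # Run one VBoxManage subcommand; stop if it reports an error.
        subprocess.run(["VBoxManage", *args], check=True)

    vbox("createvm", "--name", VM, "--ostype", "Ubuntu_64", "--register")
    vbox("modifyvm", VM, "--memory", "3072", "--vram", "128", "--accelerate3d", "on")
    vbox("createmedium", "disk", "--filename", VM + ".vdi", "--size", "20000")  # ~20 GB virtual disk
    vbox("storagectl", VM, "--name", "SATA", "--add", "sata")
    vbox("storageattach", VM, "--storagectl", "SATA", "--port", "0", "--device", "0",
         "--type", "hdd", "--medium", VM + ".vdi")
    vbox("storageattach", VM, "--storagectl", "SATA", "--port", "1", "--device", "0",
         "--type", "dvddrive", "--medium", ISO)
    vbox("startvm", VM)     # boots the Mint installer, as in step [5]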

Don’t assume that because you know how to do something on Windows, it works that way on Linux. E.g. you should never download programs from a website and install them into Linux — it has a better way. Be prepared to learn some stuff.

If you can work it, then you can install it on your PC alongside Windows. This is called Dual Booting. It’s quite easy really and then you choose whether you want Windows or Linux when you turn it on.

All my PCs do it, but I use Windows about once or twice a year, when I absolutely need it. Which is almost never. I only use Windows if someone is paying me to — it is a massive pain to maintain and keep running properly compared to more grown-up equivalents. (Linux and Mac OS X are based on a late-1960s project; they are very mature and polished. The first version of the current Windows family is from 1993. It’s still got a lot of growing up to do — it’s only half the age.)

It’s genuinely better. No, you don’t get all the Windows programs. There aren’t many games for it, for instance. But it can do anything Windows can do, it’s faster, it’s immune to all the Windows viruses and nasties so you don’t need antivirus or a firewall or anything. That means it’s faster, too — antivirus slows computers down, but you need it on Windows.

All the apps are free. All the updates are free, forever. There are thousands of people on web fora who will help you if you have problems, you just have to ask. It’s educational — you will learn more about computers from learning a different way to use them, but that means you won’t be so helpless. You don’t need to be a white-coated genius scientist, but what it means is you take control back from some faceless corporation. Remember, the world’s richest man got that way by selling people stuff they could have had for free if they just knew how.
liam_on_linux: (Default)
I have ruffled many feathers with my position that the touch-driven computing sector is growing so fast that it's going to subsume the old WIMP model completely. I don't mean that iPads will replace Windows PCs, but that the descendants of the PC will look and act more like tablets than today's desktops and laptops.

But where is it leading, beyond that point? I have absolutely no concrete idea. But the end point? I've read one brilliant model.

It's in one of the later Foundation books by Isaac Asimov, IIRC. (Not a series I'm that enamoured of, actually.)

A guy gets (steals?) a space yacht: a small, 1-man starship. (Set aside the plausibility of this.)

He searches the ship's crew quarters. In its few luxury rooms, there is no cockpit. No controls, no instruments, nothing. He is bemused.

He returns to the comfiest room, the main stateroom, i.e. cabin/bedroom. In it there is a large, bare dressing table with a comfy seat in front of it. He sits.

Two handprints appear, projected on the surface of the desk, shaped in light.

He studies them. They're just hand-shaped spots of light. He puts his hands on them.

And suddenly, he is much smarter. He knows the ship's position and speed in space. He knows where all the nearby planetary bodies are, their gravity wells, the speeds needed to reach them and enter orbit.

Thinking of the greater galaxy, he knows where all the nearby stars are, their masses, their luminosities, their planetary systems. Merely thinking of a planet, he knows its cities, ports, where to orbit it, etc.

All this knowledge is there in his mind if he wants it; if he allows his attention to move elsewhere, it's gone.

He sits back, shocked. His hands lift from the prints on the desk, and it all disappears.

That is the ultimate UI. One you don't know is there.

Any UI where there are metaphors and abstractions and controls you must operate is inferior; direct interaction is better. We've moved from text views of marked-up files with arcane names in folder hierarchies to today: hi-res, full-colour, moving images of fully-formatted documents and images. That's great.

Some people are happily directly manipulating these — drawing and stroking screens with all their fingers, interacting naturally. Push up to see the bottom of a document, tap on items of interest. It's so natural pre-toddlers can do it.

But many old hands still like their pointing hardware and little icons on screen that they can twiddle with their special pointing devices, and they shout angrily that it's more precise and it's tried and tested and it works.

Show them something better, no, it's a toy. OK for idly surfing the web, or reading, or watching movies, but no substitute for the "real thing".

It's a toy and the mere idea that these early versions could in time grow into something that could replace their 4-box Real Computer of System Unit, Monitor, Mouse and Keyboard is a nonsensical piece of idiocy.

Which is exactly what their former bosses and their tutors said about the Mac's UI 30y ago. It's doubtless what they said about the tinker-toy CP/M boxes a decade before that, and so on.

I'm guilty too. I am using a 25y old keyboard on my tiny silent near-unexpandable 2011 Mac mini, attached via a convertor that cost more than the keyboard and about a third as much as the Mac itself. I don't have a tablet; I don't personally like them much. I like my phablet, though. I gave away my Magic Trackpad - I didn't like it.

(And boy did my friends in the FOSS community curse me out for buying a Mac. I'm a traitor and a coward, apparently.)

But although I personally don't want this stuff, nonetheless, I think it's where we're going.

If adding more layers of abstraction to the system means we can remove layers of abstraction from the human-computer interface, then I'm all for it. The more we can remove, the simpler and easier and clearer the computers we can make, the better. And if we can make them really small and cheap and thus give one to every child in the poorer countries of the world — I'd be delighted.

If the price were putting Microsoft and Apple out of business, destroying the career of everyone working with Windows, and replacing it all with that nasty cancerous GPL and Big-Brother-like services like Google — it would still be worth it.

liam_on_linux: (Default)
(Repurposed CIX post.)

Don’t get me wrong. I like Apple kit. I am typing right now on an original 1990 Apple Extended II keyboard, attached via an ADB-USB convertor to a Core i5 Mac mini from 2011, running Mac OS X 10.10. It’s a very pleasant computer to work on.

But, to give an example of the issues — I also have an iPhone. It’s my spare smartphone with my old UK SIM in it.

But it’s an iPhone 4. Not a lot of RAM, an underclocked CPU, and of course not upgradable.

So I’ve kept it on iOS 6, because I already find it annoyingly slow and iOS 7 would cause a reported 15-25% or more slowdown. And that’s the latest it will run.

Which means that [a] I can’t use lots of iPhone apps as they no longer support iOS 6.x and [b] it doesn’t do any of the cool integration with my Mac, because my Mac needs a phone running iOS 8 to do clever CTI stuff.

My old 3GS I upgraded from iOS 4 to 5 to 6, and regretted it. It got slower & slower and Apple being Apple, *you can’t go back*.

Apple kit is computers simplified for non-computery people. Stuff you take for granted with COTS PC kit just can’t be done. Not everything — since the G3 era, they take ordinary generic RAM, hard disks, optical drives, etc. Graphics cards etc. can often be made to work; you can, with work, replace CPUs and run OSes too modern to be supported.

But it takes work. If you don’t want that, if you just max out the RAM, put a big disk in and live with it, then it’s fine. I’m old enough that I want a main computer that Just Works and gives me no grief and the Mac is all that and it cost me under £150, used. The OS is of course freeware and so are almost all the apps I run — mostly FOSS.

I like FOSS software. I use Firefox, Adium, Thunderbird, LibreOffice, Calibre, VirtualBox and BOINC. I also have some closed-source freeware like Chrome, Dropbox, TextWrangler and Skype. I don’t use Apple’s browser, email client, chat client, text editor, productivity apps or anything. More or less only iTunes, really.

What this means is that I can use pretty much the same suite of apps on Linux, Mac and Windows, making switching between them seamless and painless. My main phone runs Android, my travelling laptop is a 2nd-hand Thinkpad with the latest Ubuntu LTS on it.

As such, many of the benefits of an all-Apple solution are not available to me — texting and making phone calls from the desktop, seamless handover of file editing from desktop to laptop to tablet, wireless transparent media sync between computers and phone, etc.

I choose not to use any of this stuff because I don’t trust closed file formats and dislike vendor lock-in.

Additionally, I don’t like Apple’s modern keyboards and trackpads, and I like portable devices where I can change the battery or upgrade the storage. So I don’t use Apple laptops and phones and don’t own a tablet. iPads are just big iPhones and I don’t like iPhones much anyway. The apps are too constrained, I hate typing on a touchscreen “keyboard” and I don’t like reading book-length texts from a brightly-glowing screen — I have a large-screen (A4) Kindle for ebooks. (Used off eBay, natch.) TBH I’d quite like a backlight on it but the big-screen model doesn’t offer one.

But I don’t get that with Ubuntu. I never used UbuntuOne; I don’t buy digital content at all, from anyone; my Apple account is around 20 years old and has no payment method set up on it. I have no lock-in to Apple and Ubuntu doesn’t try to foist it on me.

With Ubuntu, *I* choose the laptop and I can (and did) build my own desktops, or more often, use salvaged freebies. My choice of keyboard and mouse, etc. I mean, sure, the Retina iMac is lovely, but it costs more than I’m willing to spend on a computer.

Android is… all right. It’s flakey but it’s cheap, customisable (I’ve replaced web browser, keyboard, launcher and email app, something Apple does not readily permit without drastic limitations) and it works well enough.

But it’s got bloatware, tons of vendor-specific extensions and it’s not quick.

Ubuntu is sleek as Linuxes go. I like the desktop. I turn off the web ads and choose my own default apps and it’s perfectly happy to let me. I can remove the built-in ones if I want and it doesn’t break anything.

If I could get a phone that ran Ubuntu, I’d be very interested. And it might tempt me into buying a tablet.

I’ve tried all the leading Linuxes (and most of the minor ones) and so long as you’re happy with its desktop, Ubuntu is the best by a country mile. It’s the most polished, best-integrated, it works well out of the box. I more or less trust them, as much as I trust any software vendor.

The Ubuntu touch offerings look good — the UI works well, the apps look promising, and they have a very good case for the same apps working well on phone and tablet, and the tablet becoming a usable desktop if you just plug a mouse in.

Here’s a rather nice little 3min demo:
https://www.youtube.com/watch?v=c3PUYoa1c9M

Wireless mouse turned on: desktop mode, windows, title bars, menus, etc.
Turn it off, mid-session: it’s a tablet, with touch controls. *With all the same apps and docs still open.*
Mouse back on: it’s in desktop mode again.

And there’s integration — e.g. phone apps run full-size in a sidebar on a tablet screen, visible side-by-side with tablet apps.

Microsoft doesn’t have this, Apple doesn’t, Google doesn’t.

It looks promising, it runs on COTS hardware and it’s FOSS. What’s not to like?

I suspect, when the whole plan comes together, that they will have a compelling desktop OS, a compelling phone OS and a compelling tablet OS, all working very well together but without any lock-in. That sounds good to me and far preferable to shelling out thousands on new kit to achieve the same on Apple’s platform. Because C21 Apple is all about selling you hardware — new, and regularly replaced, too — and then selling you digital content to consume on it.

Ubuntu isn’t. Ubuntu’s original mission was to bring Linux up to the levels of ease and polish of commercial OSes.

It’s done that.

Sadly, the world failed to beat a path to its door. It’s the leading Linux and it’s expanded the Linux market a little, but Apple beat it to market with a Unix that is easier, prettier and friendlier than Windows — and if you’re willing to pay for it, Apple makes nicer hardware too.

But now we’re hurtling into the post-desktop era. Apple is leading the way; Steve Jobs finally proved his point that he knew how to make a tablet that people wanted and Bill Gates didn’t. Gates’ company still doesn’t, even when it tries to embrace and extend the iPad type of device: millions of the original Surface tablets are destined for landfill like the Atari ET game and the Apple Lisa. (N.B. *not* the totally different Surface Pro, which people use as a lightweight laptop.)

But Apple isn’t trying to make its touch devices replace desktops and laptops — it wants to sell both.

Ubuntu doesn’t sell hardware at all. So it’s trying to drag proper all-FOSS Linux kicking and screaming into the twenty-twenties: touch-driven *and* driveable by desk-bound hardware I/O, equally happy on ARM or x86-64, very shiny but still FOSS underneath.

The other big Linux vendors don’t even understand what it’s trying to do. SUSE does Linux servers for Microsoft shops; Red Hat sells millions of support contracts for VMs in expensive private clouds. Both are happy doing what they’re doing.

Whereas Shuttleworth is spending his millions trying to bring FOSS to the masses.

OK, what Elon Musk is doing is much much cooler, but Shuttleworth’s efforts are not trivial.
liam_on_linux: (Default)
They're a bit better in some ways. It's somewhat marginal now.

OK. Position statement up front.

Anyone who works in computers and only knows one platform is clueless. You need cross-platform knowledge and experience to actually be able to assess strengths, weaknesses, etc.

Most people in IT this century only know Windows and have only known Windows. This means that the majority of the IT trade are, by definition, clueless.

There is little real cross-platform experience any more, because so few platforms are left. Today, it's Windows NT or Unix, running on x86 or ARM. 2 families of OS, 2 families of processor. That is not diversity.

So, only olde phartes, yeah like me, who remember the 1970s and 1980s when diversity in computing meant something, have any really useful insight. But the snag with asking olde phartes is we're jaded & curmudgeonly & hate everything.

So, this being so...

The Mac's OS design is better and cleaner, but that's only to the extent of saying New York City's design is better and cleaner than London's. Neither is good, but one is marginally more logical and systematic than the other.

The desktop is much simpler and cleaner and prettier.

App installation and removal is easier and doesn't involve running untrusted binaries from 3rd parties, which is such a hallmark of Windows that Windows-only types think it is normal and natural and do not see it for the howling screaming horror abomination that it actually is. Indeed, put Windows types in front of Linux and they try to download and run binaries and whinge when it doesn't work. See comment about cluelessness above.

(One of the few places where Linux is genuinely ahead -- far ahead -- today is software installation and removal.)

Mac apps are fewer in number but higher in quality.

The Mac tradition of relative simplicity has been merged with the Unix philosophy of "no news is good news". Macs don't tell you when things work. They only warn you when things don't work. This is a huge conceptual difference from the VMS/Windows philosophy, and so, typically, this goes totally unnoticed by Windows types.

Go from a Mac to Windows and what you see is that Windows is constantly nagging you. Update this. Update that. Ooh you've plugged a device in. Ooh, you removed it. Hey it's back but on a different port, I need a new driver. Oh the network's gone. No hang on it's back. Hey, where's the printer? You have a printer! Did you know you have an HP printer? Would you like to buy HP ink?

Macs don't do this. Occasionally one coughs discreetly and asks if you know that something bad happened.

PC users are used to it and filter it out.

Also, PC OSes and apps are all licensed and copy-protected. Everything has to be verified and approved. Macs just trust you, mostly.

Both are reliable, mostly. Both just work now, mostly. Both rarely fail, try to recover fairly gracefully and don't throw cryptic blue-screens at you. That difference is gone.

But because of Windows' terrible design and the mistakes that the marketing lizards made the engineers put in, it's howlingly insecure, and vastly prone to malware. This is because it was implemented badly.

Windows apologists -- see cluelessness -- think it's fine and it's just because it dominates the market. This is because they are clueless and don't know how things should be done. Ignore them. They are loud; some will whine about this. They are wrong but not bright enough to know it. Ignore them.

You need antimalware on Windows. You don't on anything else. Antimalware makes computers slower. So, Windows is slower. Take a Windows PC, nuke it, put Linux on it and it feels a bit quicker.

Only a bit, 'cos Linux too is a vile mess of 1970s crap. If it still worked, you could put BeOS on it and discover, holy shit wow lookit that, this thing is really fsckin' fast and powerful -- but no modern OS lets you feel it. It's buried under 5GB of layered legacy crap.

(Another good example was RISC OS. Today, millions of people are playing with Raspberry Pis, a really crappy underpowered £25 tiny computer that runs Linux very poorly. Raspberry Pis have ARM processors. The ARM processor's original native OS, RISC OS, still exists. Put RISC OS on a Raspberry Pi and suddenly it's a very fast, powerful, responsive computer. Swap the memory card for Linux and it crawls like a one-legged dog again. This is the difference between an efficient OS and an inefficient one. The snag is that RISC OS is horribly obsolete now so it's not much use, but it does demonstrate the efficiency of 1980s OSes compared to 1960s/1970s ones with a few decades of crap layered on top.)

Windows can be sort of all right, if you don't expect much, are savvy, careful and smart, and really need some proprietary apps.

If you just want the Interwebs and a bit of fun, it's a waste of time and effort, but Windows people think that there's nothing else (see clueless) and so it survives.

Meanwhile, people are buying smartphones and Chromebooks which are good enough if you haven't drunk the cool-aid.

But really, they're all a bit shit, it's just that Windows is a bit shittier but 99% of computers run it and 99% of computer fettlers don't know anything else.

Once, before Windows NT, but after Unix killed the Real Computers, Unix was the only real game in town for serious workstation users.

Back then, a smart man wrote:

“I liken starting one’s computing career with Unix, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. But, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act.” — Ken Pier, Xerox PARC
That was 30y ago. Now, Windows is like that. Unix is the same but you have air-conditioning and some shots and all the Big Macs you can eat.

It's a horrid vile shitty mess, but basically there's no choice any more. You just get to choose the flavour of shit you will roll in. Some stink slightly less.
liam_on_linux: (Default)
I had a brief play with one on my last trip through Stansted Airport back to Czechia. I disliked the feel of the keyboard, then I realised how very fast & accurate it had been on the few test lines that I had typed.

As the only sensible-sized smartphone on the market today with an actual hardware keyboard, I'm very tempted. I'm also kinda fed up with Android.

With a little luck, my Note 2 might still have some resale value, too.

Unfortunately, all the reviews I can find are dreck like this:

BlackBerry Passport review: Getting stuff done or getting in the way?
By Dan Seifert  on September 24, 2014 10:00 am  Email @dcseifert

It contains a lot of the typical bollocks that normally makes me denigrate smartphone reviews.

Whinge whinge it's too big whinge no Instagram whinge no Snapchat whinge no $shitty_proprietary_bullshit_toy_chat_app whinge videos don't look nice whinge.

No archiving in Gmail is a slight snag, but unlike Dan, I understand folders and filters and they do 99% of my archiving for me, so I don't care that much.

Well I am not a hormonal teenager who wants to give or get cock-shots. I don't give a flying fuck about Snapchat, Instagram or any of that puerile drivel.

I don't watch videos on my phone, because it's a tool not a toy, but I type on it all the time. I detest virtual keyboards. I'm a middle-aged bloke with proper big man-sized hands; I can use a Galaxy Note 2 one-handed, no problem, and if one of these many little nappy-wearing pseudo-journos with the hands of a 12 year old girl can't grip it, that's a good thing because I can't use tiny crappy toys like normal iPhones. The 6+ is the first ever iPhone that is remotely big enough to be usable to me, and it's too thin and its battery too weedy. I want an inch-thick phone with circa 5 amp-hours in it, like I had 6 or 7y ago, please, not some svelte buttonless hairdressers' phone.

So, not very helpful review, directly, inasmuch as the man-child who wrote it clearly wants something I'd perceive as a teen's plaything. I am the kind of boring old pharte with a job to do that he tries & utterly fails to imagine being.

But they're all like that, the Passport reviews. They're by bloody children who regard Flappy Bird as a mission-critical app.

But, OTOH, while Mr Still-Spattered-With-Spit-From-School there can't swap images of his small, soft and as-yet hairless genitals with his other playmates on it, he does manage to tell me that it's big, boring, solid and wide. These are good things.

My Note 2 is if anything too small. It doesn't reach from ear to mouth, as a proper phone should, it has no physical buttons, and at 2y old its battery lasts about 4-6h.

(So does its 1y old replacement battery.) But it's too wide, because it's made for watching videos on, and it wastes space on a pointless stylus when really I want it 1cm thicker with a QWERTY keyboard and in an ideal world 2 SIM slots and 2 batteries.

Really, I want a big bricklike Nokia Communicator (or at a push an HTC Universal; mine had an inch-thick 4800 mAh battery, weighed 450g & was the last smartphone I owned with a good battery life)... but with a modern OS.

Sadly, though, all the phone companies are too busy wanking over leaked pictures of Apple products and making shitty compromised me-too toys to produce something for aging adults with dimming eyesight and big hands.

I was just wondering if the last bastion of vaguely sensible boring phones had made something worth buying.
liam_on_linux: (Default)
I have been meaning to try Arch Linux for years.

As a former RPM user, once I finally made the switch to Ubuntu, more or less exactly 10y ago, well, since then, I have become so wedded to APT that I hesitate with non-APT distros.

My spare system on this machine is Crunchbang, which I like a lot, but is a bit too Spartan in its simplicity for me. Crunchbang is based on the stable version of Debian, which gives it one big advantage on my 2007-era built-for-Windows-Vista hardware: it uses a version of X.org so old that the ATI fglrx drivers for my Radeon HD 3470 GPU still work, which they haven't done on Ubuntu for 2 years now.

But there was a spare partition or 2 waiting. I tried Elementary -- very pretty, but the Mac OS X-ness is just skin-deep; it's GNOME 3, very simplified. No ta. Deepin is too slow and doesn't really offer anything I want -- again, it's a modification of GNOME 3, albeit an interesting one. Same goes for Zorin-OS. I've tried Bodhi before -- it's interesting, but not really pretty to my eyes. (Its Enlightenment desktop is all about eye-candy; as a desktop, it's just another Windows Explorer rip-off. If it shipped with a theme that made it look like one of those shiny floaty spinny movie-computer UIs, I might go for it, but it doesn't, it's all lairy glare that only a teenage metalhead could love.) Fedora won't even install; my partitioning is too complex for its installer to understand. SUSE is a bit bloaty for my tastes, and I don't like KDE (or GNOME 3), which also rules out PCLinuxOS and Deepin.

So Arch was the next logical candidate...

I've been a bit sheepish since an Imaginary Internet Friend, Ric Moore, tried it with considerable success a month or two ago. (As I write, he's in hospital having a foot amputated. I've been thinking of him tonight & I hope he's doing well.)

So I have finally done it. Downloaded it, burned it to a CD -- yes, it's that small -- installed it on one of my spare partitions and I am in business.

After a bit of effort and Googling, I found a simple walkthrough, used it, got installed -- and then discovered that the Muktware walkthrough only tells you about KDE, and assumes you'll use that and nothing else. I don't care for KDE in its modern versions, so I went with Xfce.

Getting a display manager working was non-trivial, but now I have LXDM -- the third I tried -- and it works. I have an Xfce desktop with the "goodies" extras, Firefox, a working Internet connection via Ethernet, and not much else.
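For what it's worth, the desktop end of it boiled down to roughly the following -- a sketch from memory, wrapped up as a little Python script rather than typed at the shell, and the package and service names (xfce4, xfce4-goodies, lxdm) are the obvious ones, so check the Arch wiki before trusting me:

    #!/usr/bin/env python3
    # Rough sketch (from memory) of the post-install steps to go from a bare
    # Arch base system to an Xfce desktop with the LXDM login manager.
    # Run as root; package and service names should be checked against the wiki.
    import subprocess

    def run(cmd):
        """Echo and run one command, stopping the script if it fails."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # X, the desktop, the "goodies" extras, a browser and a display manager.
    run(["pacman", "-S", "--needed", "xorg-server", "xfce4", "xfce4-goodies",
         "firefox", "lxdm"])

    # Have LXDM start at boot; Arch uses systemd.
    run(["systemctl", "enable", "lxdm.service"])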

It does feel very quick, though, I must give it that. Very snappy. I guess now begins the process of hunting down all the other apps that I use until I've replicated all my basic toolset.

The install was a bit fiddly, much more manual than anything I've done since the mid-1990s, but it all went on very smoothly, considering that it's a lot of hand-entered commands, which mostly don't depend on your particular config.
liam_on_linux: (Default)
[Recycled (part of) a mailing list post: another crack at trying to explain what was significant about LispMs.]

One of the much-ignored differences between computer architectures is the machine language, the Instruction Set Architecture (ISA). It's a key difference. And the reason it doesn't get much attention is that these days, there's really only one type left: the C machine.

There used to be quite a diversity -- there were various widely-divergent CISC architectures, multiple RISC ones, Harvard versus von Neumann designs, stack machines versus register machines, and so on.
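If you've never met the distinction, here's a toy of my own to show just one of those splits -- the same little sum, (2 + 3) * 4, once as stack-machine code and once as register-machine code, each run by a throwaway interpreter:

    # Toy sketch, purely illustrative: stack machine vs register machine.
    def run_stack(program):
        """Operands live on a stack; instructions name no registers."""
        stack = []
        for op, *args in program:
            if op == "push":
                stack.append(args[0])
            elif op == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "mul":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack.pop()

    def run_regs(program, nregs=4):
        """Every instruction names explicit registers, C-machine style."""
        regs = [0] * nregs
        for op, dst, a, b in program:
            if op == "li":                    # load immediate value a into dst
                regs[dst] = a
            elif op == "add":
                regs[dst] = regs[a] + regs[b]
            elif op == "mul":
                regs[dst] = regs[a] * regs[b]
        return regs[0]

    print(run_stack([("push", 2), ("push", 3), ("add",),
                     ("push", 4), ("mul",)]))                      # 20
    print(run_regs([("li", 1, 2, 0), ("li", 2, 3, 0), ("add", 1, 1, 2),
                    ("li", 2, 4, 0), ("mul", 0, 1, 2)]))           # 20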

Most of that has gone now -- either completely disappeared, or shrunk into very specific niches.
liam_on_linux: (Default)
Long time, no post. This is because in April I started a new job where I actually get paid to write technical stuff for a living.

(Hint - I'm going to have to change that usericon...)

Anyway, this subject came up in conversation with my colleague Pavel recently. In my department, there are some Vi[m] advocates, at least one Emacs user in the wild (approach with caution), and when I said I used Gedit from choice, I got pitying looks. :¬)

Which gave me a chance to have my usual rant about the deep and abiding nastiness of both Vi and Emacs, which did at least provide some amusement. It also led Pavel to ask, quite reasonably, what I did want from a console/shell text editor that wasn't provided by, say, Joe, Nano or Pico.

I said CUA -- IBM's old Common User Access standard, the one behind the familiar shared menus and editing keystrokes of DOS and Windows apps -- and then had to explain what that was, and pointed at SETedit, which I've linked to before. Sadly, it hasn't been updated in a while. Packages are only for old versions of popular distros.
http://setedit.sourceforge.net/

This led him to look thoughtful and go off and do some digging. He came back with some gems.

Firstly, there's the rather fun Text Editors Wiki, which is not as comprehensive as it might be but has a lot of interesting reading.
http://texteditors.org/cgi-bin/wiki.pl

Next, he pointed me at XWPE. It certainly looks the part, but sadly the project seems to have died. I did get it running on Fedora 20 by installing some extra libraries and symlinking them to the names XWPE wanted, but it crashes very readily.
http://www.identicalsoftware.com/xwpe/

After some more hunting, he also found eFTE, enhanced FTE. I rather like this. Not all the shortcuts do what I expect, but it works well nonetheless.
http://sourceforge.net/projects/efte/

Incidentally, eFTE seems to be a fork of a no-longer-maintained older editor, FTE:
http://fte.sourceforge.net/

More recently, I've also discovered Tilde. It is currently maintained and has recent packages available. It looks a bit richer than eFTE, but sadly, the Alt key doesn't work in a window. Clearly this is a known issue as there's a workaround using Esc instead, but it makes it 2 keystrokes rather than one with a modifier.
http://os.ghalkes.nl/tilde/

I remain surprised that these things are obscure & little-known. I'd have thought that given how many people are moving from other OSes to Linux, a lot more MICROS~1 émigrés would have wanted such tools.
liam_on_linux: (Default)
I am already sick and tired of listening to clueless noobs who think they're techies saying "XP is fine, stop worrying" or "I don't do Linux, it's too different" or "I tried it in 2002 and it was rubbish".

Well, it's been longer since Ubuntu came out (2004) than the gap from Windows for Workgroups 3.11 to Windows XP. Remember the extent of those changes? Well, Linux changes a lot faster.

And so what if it's not the same? It's not like changing from a car to a motorbike. It's not that different any more.

To a techie who looks under the hood, who does their own maintenance, sure, it's going from petrol car to electric bike or something. Very little in common.

To someone who uses a desktop, browses the web, plays some simple Flash games or Solitaire etc., occasionally opens a PDF and prints it, or opens an MS Office doc, completes a form and sends it back, stuff like that, then an appropriately-chosen Linux is more like WinXP than Win8 is by far.

But there is galloping fear of the alien in IT. Probably, I suspect, because there are millions of people working in IT who know nothing at all except MS Windows. Everything else is foreign to them, alien and terrifying, and they instantly react like a pod-person in the 1970s Invasion of the Body Snatchers.



And you know what? It's appropriate, because you're all fucking pod people. Stamped out, copies, clones, with no originality and no imagination. You're neophobic.

Anyone who actually knows about computers - real tech people - can handle any OS, any machine. I've troubleshot and fixed problems with machines I have never seen or heard of before; I've sorted out stuff on AS/400 and IBM System/360 mainframes, I've got a DEC PDP-11 talking and doing file exchange with a classic Mac running System 7, and before I walked in, I had never even seen a PDP-11 in my life before.

Software is an office supply, like paperclips. Today it all does much the same, in much the same way.

Imagine the contempt you'd feel for someone who bleated and whinged and complained that they were given a different brand of stapler, or they had to change filing cabinets to one where the keyhole's on the other side. You'd sneer at someone who demanded a training course to help them adjust.

Anyone who only knows one platform, one OS, is not a techie at all, not of any type. If you can't drive half a dozen kinds of computers then you can't drive.

Any competent biker could switch from a 125 trailie to a Gold Wing to a superbike and not kill themselves when they twisted the throttle. They'd go with respect and care and caution, not bleat like an infant because one of the buttons on the handlebars had moved and was a different colour.

So bloody well grow up.

Linux is an answer to a problem. If you have an old XP computer, no it is not safe to use it any more, and yes, you should replace it. But if you get a new one with Win8, it will be very very different indeed. They don't even have a BIOS any more, let alone a bloody Start menu.

And if you don't have the money for a new one, well, an old XP computer won't run Windows 7 properly, no.

But there's a perfectly good alternative that is faster, simpler, safer, more secure, more reliable and it doesn't even cost anything. It'll run on anything XP runs on, works great and all you have to do is get your hands oily. Use Google. Don't think "I know this." You don't. No, not all computers install software by downloading a binary and running it - in fact, that's a fucking stupid design, which spreads malware. Not all computers use a website for updates - that's a fucking stupid design, too.

So grow a pair. Stop whinging. Google "how to install skype ubuntu" rather than downloading and fucking about and breaking it. Google "how to enable GeForce 240 ubuntu 12.04" before you go wasting time. Google "transfer IE bookmarks Firefox" or "libreoffice excel compatibility" or whatever.

Don't assume you know. Assume you don't. You have the entire world's information resource at your fingertips. Use it. Ask the Internet. Ask bloody Ixion.

But stop fucking whinging that "it doesn't run my copy of Anus Invaders 6" or "my crappy plastic £30 printer from PC World doesn't work" and buy a better one. It's cheaper than a new ink cartridge anyway.

Learn. Life is learning. Life is growth. Stop acting like a corpse and live.
liam_on_linux: (Default)
Frankly, coming from a background in 1980s and 1990s OSes, I think modern ones are appalling shite. They're huge, baggy, flabby sacks of crap that drag themselves around leaving a trail of slime and viscera - but like some blasphemous shoggoth, they have organs to spare, and the computers they run on are so powerful and have so much storage that these disgusting shambling zombie Frankenstein's-monster things, stitched together from bits of the dead, dropping eyeballs and fingers, actually work for weeks on end.

On the server, no problem, run hundreds of instances of them, so when they implode, spawn another.

It's crap. It's all terrible, blatantly obvious utter crap, but there's almost nobody left who remembers any other way. I barely do, from old accounts, & I'm near 50.

We have layers of sticking-plaster and bandages over kernels that are hugely-polished turds, moulded into elegant shapes. These are braindead but have modules for every conceivable function and so can run on almost anything and do almost anything, so long as you don't mind throwing gigabytes and gigahertz at the problem.

And those shiny turds are written in braindead crap languages, designed for semi-competent poseurs to show off their manliness by juggling chainsaws: pointless Byzantine wank like pointer arithmetic, while missing basic types for strings, array bounds-checking, and operator overloading. Any language that even allows the possibility of a buffer or stack overflow is hopelessly broken and should be instantly discarded. The mere idea of a portable assembly language is a vestige of days when RAM was rationed and programmers needed to twiddle bits directly; it should have been history before the first machine with more than a megabyte of RAM per user was sold.
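For contrast, this is what the sane behaviour looks like in a bounds-checked language with a real string type -- a trivial toy of my own, nothing profound:

    # Toy contrast: in a bounds-checked language, the buffer-overflow class of
    # bug simply cannot happen - an out-of-range index raises an error instead
    # of quietly scribbling over whatever lives next to the buffer.
    buffer = [0] * 16

    try:
        buffer[20] = 0xFF              # way past the end
    except IndexError as err:
        print("caught:", err)          # caught: list assignment index out of range

    name = "liam"                      # and strings are a real type,
    print(name + "_on_linux")          # not a bare char pointer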

Computers should be bicycles for the mind. They let us take our existing mental tools and provide leverage, mechanical advantage, to let us do more.

We work in patterns, in sets, in rich symbols; it is how we think and how we communicate. That, then, should be the native language to which our computers aim: the logic of entities and sets of entities, that is, atoms and lists, not allocated blocks of machine storage - that is an implementation detail, it should be out of sight, and if it's visible, then your design is faulty. If you routinely need to get at those blocks directly, then your design is not even wrong.

By the late '50s we had a low-level programming language that could handle this. It's unreadable, but it was only meant to be the low-level; we just never got the higher level wrapper to make it readable to mortals. The gods themselves can work in it; to lesser beings, it's all parens.

Now, we have a rich choice of higher-level wrappers to make it all nice and easy and pretty. Really very pretty.

And later, people built machines specifically to run that language, whose processors understood its primitives.

But they lost out. CPUs were expensive, memory was expensive, so instead, OSes grew simpler; Unix replaced Multics, and CPUs grew simpler too, to just do what these simple OSes written in simple languages did. Result, these simple, stripped-down machines and OSes were way more cost-effective, and they won. The complex machines died out.

Then the simpler machines - which were still quite big and expensive - were stripped down even more, to make really cheap, rudimentary 4-bit CPUs for calculators, ones that fitted on one chip.

They sold like hotcakes, and were developed and refined, from 4-bit to 8-bit, from primitive 8-bit to better 8-bit, with its own de-facto standard OS which was a dramatically simpler version of a simple, obsolete OS for 16-bit minicomputers.

And that chip begat a clunky segmented 8/16-bit one, and that a clunky segmented 16-bit one, and that a bizarre half-crippled 32-bit one that could emulate lots of the 8/16-bit one in hardware FFS. And that redefined the computer industry and it was nearly two decades until we got something slightly better, a somewhat-improved version of the same old same old.

And that's where we are now. The world runs on huge, vastly complex scaled-up go-faster versions of a simplified-to-the-maximum-extent-possible calculator chip. These chips grew out of a project to scale-down simple, dumb, brain-dead chips built to be cheap-but-quick because the proper ones, that people actually liked, were too expensive 40 years ago. Of course, now, the descendants of those simplified chips are vastly more complex than the big expensive ones their ancestors killed off.

And what do we run on them? Two OSes. One a descendant of a quick-n-dirty lab skunkworks project to make an old machine useful for games, still today written in portable assembler, with richer layers -- themselves written in fancier dialects of the same portable assembler -- running on top of it. And a descendant of a copy of a copy of a primitive '60s mini OS which has been extensively rewritten in order to imitate the skunkworks thing.

But these turds have been polished so brightly, moulded into such pretty shapes, that they've utterly dominated the world since my childhood. It's still all made from shit but it's been refined so much that it looks, smells and tastes quite nice now.

We still are covered in shit and flies - "binaries", "compilers", "linkers", "IDEs", "interpreters", "disk" versus "RAM", "partitions" and "filesystems", all this technical cruft that better systems banished before the first Mac was made, before the 80286 hit the market.

But as the preface to the Unix-Haters Handbook says:

"I liken starting one's computing career with UNIX, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. BUT, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act."

- Patrick Sobalvarro

Nobody knows any better any more. And when you try to point out that there was once something better, that there are other ways, that it doesn't need to be like this... people just ridicule you.

And no, in case it's not clear, I am not a Lisp zealot. I find it unreadable and cannot write "hello world" in it. I also don't want 1980s Lisp Machines back - they were designed for Lisp programmers, and I'm not one of them.

I want rich modern programming languages, as easy to read as Python, as expressive as Lisp, with deep rich integration into the GUI - not some bolt-on extra like a tool to draw forms and link them to bits of code in 1970s languages. There's no reason why the same language shouldn't be usable by a non-specialist programmer writing simple imperative code, and also by a master wielding complex class frameworks like a knight with a lightsabre. It's all code to the computer: you should be able to choose your preferred viewing level, low-level homoiconicity or familiar Algol-like structures. There shouldn't be a difference between interpreted languages and compiled ones - it's all the same to the machine. JIT and so on solved this years ago. There's no need for binaries at all - look at Java, look at Taos and Intent Elate, look at Inferno's Limbo and Dis. Hell, look at Forth over 30 years ago: try out a block of code in the interpreter; once it works, name it and bosh, it's compiled and cached.
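Python itself makes a handy little demonstration of the Forth point -- define a thing at the interpreter and it is already compiled and cached, with no separate binary anywhere. My own example, nothing to do with Taos or Inferno:

    # Toy demonstration: "interpreted" Python compiles every function to
    # bytecode the moment you define it, and caches it on the function object.
    # No compiler invocation, no linker, no binary on disk.
    import dis

    def square(n):
        """Defined interactively; usable immediately."""
        return n * n

    print(square(12))        # 144 - it just runs
    dis.dis(square)          # ...and here is the already-compiled bytecode
    print(len(square.__code__.co_code), "bytes of cached code object")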

Let's assume it's all FOSS. No need for licences mandating source distribution: the end-product is all source. You run the source directly, like a BASIC listing for a ZX Spectrum in 1983, but at modern speeds. If you aren't OK with that - if you don't like distributing your code - fine, go use a proprietary OS and we wish you well. Hope it still works on their next version, eh?

It could be better than we have. It should be better than we have. Think the Semantic Web all the way down: your chip knows what a function is, what a variable is, what a string or array is - there's no level transition where suddenly it's all bytes. There doesn't need to be.

And this stuff isn't just for programmers. I'm not a programmer. Your computer should know that a street address is an address, and with a single command you can look up anyone's address that is in any document on your machine - no need to maintain a separate address-book app. It should understand names and dates and amounts of money; there were apps that could do this in the 1980s. That we still need separate "word processors" and "spreadsheets" and "databases" today is a sick joke.
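As a crude stand-in for what I mean - and it is only a stand-in, because the whole point is that "address" should be a built-in notion, not a bolted-on script - here is a toy of my own that greps your plain-text documents for anything shaped like a UK postcode (the folder and the pattern are just assumptions):

    # Crude toy: scan plain-text documents for lines that look like they
    # contain a UK postcode, as a stand-in for a system that genuinely
    # understands what an address is.
    import re
    from pathlib import Path

    POSTCODE = re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b")

    def find_addresses(root="~/Documents"):
        for path in Path(root).expanduser().rglob("*.txt"):
            for line in path.read_text(errors="ignore").splitlines():
                if POSTCODE.search(line):
                    yield path.name, line.strip()

    for doc, line in find_addresses():
        print(f"{doc}: {line}")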

I have clients who keep all their letters in one huge document, one page or set of pages per correspondent... and there's nothing wrong with that. We shouldn't be forced to use abstractions like files and documents and folders if we don't want to.

I have seen many clients who don't understand what a window is, what a scrollbar does; these abstractions are too complex for them, even for college professors after decades of use of GUIs. That's why iPads are doing so well. You reach out and you pull with a fingertip.

And that's fine, too. The ancestor of the iPad was the Newton, but the Newton that got launched was a crippled little thing; the original plan was a pocket Lisp Machine, with everything in Dylan all the way down to the kernel.

And the ancestor of the Macintosh was Jef Raskin's "information appliance", with a single global view of one big document. Some bits local, some remote; some computed, some entered; some dynamic, some static; with the underlying tools modular and extensible. No files, no programs, just commands to calculate this bit, reformat that bit, print that bit there and send this chunk to Alice and Charlie but not Bob who gets that other chunk.

Sounds weird and silly, but it was, as he said, humane; people worked for millennia on sheets of paper before we got all this nonsense of icons, files, folders, apps, saving, copying and pasting. The ultimate discrete computer is a piece of smart paper that understands what you're trying to do.

And while we might be able to get there building on bytes in portable assembler, it will be an awful lot harder - tens to hundreds of times as much work - and the result won't be very reliable.
liam_on_linux: (Default)
Apparently, it's the ultimate Linux, and with his tweaks to the current development kernel and a custom scheduler, it's insanely responsive, and if you haven't tried it, you're not a Linux god.

So I said...

Um. Good for you. I am pleased you've found a system you find nicely responsive.

Me, I just want something simple, low-maintenance and reliable, with a
good polished rich UI, that does what I need. The less work I have to
do to achieve this, the better the OS is, for me.

Yours sounds very high-maintenance indeed and I'm not remotely
interested in going to all that work.

I don't consider myself a Linux god. I am reasonably clueful. I've
been using Ubuntu since it came out in 2004, SuSE for a couple of
years before that, Caldera for a couple of years before that. That
followed a good few years on NT 3.51 and NT 4, which followed Windows
95. I switched to Windows 95 from OS/2 - I was a keen OS/2 user from
2.0 to 2.1 to 3.0. It really was the best 32-bit OS for PCs back then.

Before that, at work, I built, ran and supported servers running SCO
Unix and before that SCO Xenix. My Unix experience goes back to about
1988, which is when I switched over from the VAX/VMS I used at
University.

I have also used IBM AIX and SUN SunOS and Solaris, but not much.

Plus Novell Netware - I was a bit of a guru on Netware 2 and 3 but
wasn't so impressed with Netware 4 and have barely used 5. I wrote a
masterclass on building a small-business server with Red Hat 6 for PC
Pro magazine in the late 1990s. I've also reviewed about 20 or 30
Linux distros over the years, so I feel I know the Linux landscape
well.

I'm also very interested in alternative (non-Unix) OSes, especially
for the PC. BeOS is my personal all-time favourite.

Off PC hardware, I'm also pretty good on Mac OS X and classic Mac OS,
before that, Acorn RISC OS and Psion EPOC and its successor Symbian,
and have some knowledge of AmigaOS, Atari GEM (I was peripherally
involved in the GPL FOSS FreeGEM project to revive PC GEM; my name's
in the credits of FreeDOS, to my startlement.)

I was definitely an MS-DOS guru back in the late 1980s/early 1990s and
supported all the major networking systems - 3Com 3+Share, 3+Open, DEC
Pathworks, AppleShare, Sage MainLAN, Personal Netware, Netware Lite,
NT Server from the very first version, etc.

So I guess you could say that my knowledge is broad but in places
shallow, rather than very deep in any one area, such as Linux. :-)

But I feel really sorry for you if you think that /any/ Linux system
is genuinely fast and responsive. It's not. It's a huge lumbering
sloth of an OS. You really need to try BeOS, or failing that Haiku, if
you want to experience what a fast responsive OS on PC hardware feels
like.

Sadly, there just weren't the apps for it, and no VMs in those days.

And for something vastly more responsive than Haiku, try Acorn's RISC
OS. It's the original OS for the ARM chip that these days struggles to
run bloated leviathans like Apple iOS and Android. RISC OS is the
single most responsive system I've ever used, because the entire core
OS - kernel, GUI, main accessory apps - fits into about 6MB of Flash
ROM.

No, that's not a typo. Six megabytes. Complete Internet-capable
multitasking GUI OS with network clients etc.

It runs on the Raspberry Pi and RISC OS itself is now shared-source
freeware so you can download it from Risc OS Open Ltd. for nothing and
run it on a £25 computer - on which it performs very very well, many
tens of times faster than a lightweight cut-down Linux such as
Raspbian.

So, no, not a Linux god, but, you know, not a n00b either.

Try some of these OSes. Prepare to be surprised. You might enjoy the experience.

Most of them have nice friendly GUI text editors, too, way friendlier
than Vi /or/ Emacs. ;-D
liam_on_linux: (Default)
From this Reg forum...

No, the DOS limits were /much/ earlier and older.

From old old memory:
MS-DOS 1.x didn't support hard disks.
MS-DOS 2.x did, but just one, of up to 10MB.
MS-DOS 3.0 supported a single hard disk partition (per drive) of up to 32MB.
MS-DOS 3.2 supported two partitions per drive, so 2 x 32MB.
MS-DOS 3.3 supported one primary and an extended partition containing as many 32MB "logical drives" as you wanted. (I built an MS-DOS fileserver with a 330MB hard disk once - it had drive letters C:, D:, E:, F:, G:, H:, I:, J:, K: and a leftover 11MB L: drive. Messy as hell but all you could do without 3rd party "disk extenders" such as Golden Bow's one. The server OS was 3Com 3+Share if anyone remembers that.)

Lots of vendors implemented hacks and extensions to allow bigger disks, but they were all mutually incompatible and many failed to work with some 3rd party software. Of course, anything that directly accessed disk data structures, like a defragger or a disk-repair tool such as Norton Utilities, was 100% guaranteed to catastrophically corrupt any such extended disk setup.

The one that caught on was Compaq DOS 3.31. It used an extension of FAT16 that allowed bigger clusters - still at most 65,536 of them, but multiple 512-byte sectors per cluster, permitting bigger partitions. The max cluster size was 32KiB, so the max partition size was 65,536 x 32KiB = 2GiB.

This is the one that IBM adopted into MS-DOS 4 and it became the standard. However, partitions over 512MB needed inefficient 16KiB clusters - i.e. files were allocated with a granularity of 16KiB, so even a 1-byte file took 16KiB, and a 16.001KiB file took 32KiB.

This became disastrous over 1GiB, where the granularity rose to 32KiB. Roughly 20-30% of disk space would be wasted because of this granularity, as inaccessible "slack space".
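If you want to check the sums, they are simple enough to do in a few lines - my own back-of-envelope sketch, using the standard FAT16 cluster-size ladder:

    # Back-of-envelope FAT16 arithmetic: 16-bit cluster numbers, so at most
    # 65,536 clusters, and every file occupies whole clusters.
    KiB, GiB = 1024, 1024 ** 3
    MAX_CLUSTERS = 65536

    for cluster in (8 * KiB, 16 * KiB, 32 * KiB):
        print(f"{cluster // KiB:2}KiB clusters -> max partition "
              f"{MAX_CLUSTERS * cluster / GiB:.1f}GiB")

    def allocated(size, cluster=32 * KiB):
        """Space a file really occupies: rounded up to whole clusters."""
        return -(-size // cluster) * cluster      # ceiling division

    # The slack: a 1-byte file still costs a whole cluster.
    for size in (1, 16 * KiB + 1, 100 * KiB):
        print(f"{size:7} bytes occupies {allocated(size) // KiB}KiB "
              f"on a 32KiB-cluster partition")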

This was only fixed in Windows 95 OSR2 with FAT32, which permitted huge disks - up to 2TiB - with much finer granularity.

But all of DOS 4, 5 and 6.x permitted disk partitions of up to 2GiB.
