liam_on_linux: (Default)
[personal profile] liam_on_linux
Linux is big business now. Mostly on servers.

The only significant user-facing ones are Android and ChromeOS, both dramatically constrained systems, which is part of why they've been successful. It is taking desktop distro vendors way too long to catch up with what they are doing, but distros like Endless, and to a lesser extent Fedora Silverblue, are showing the way:

  • all apps containerised; no inter-app dependencies at all

  • OS image shipped as a complete, tested image

  • Most of the filesystem is read-only

  • no package manager, no end-user ability to install/remove/update packages. You get a whole new OS image periodically, like on a phone.

  • OS updates are transactional: it deploys the whole thing, and if it doesn't work, it rolls back the entire OS to the last known-good snapshot. Two or more OS snapshots are maintained at any time, so there should always be a good one.
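
The transactional-update scheme above can be sketched in a few lines of Python. This is a toy model, not any distro's actual implementation: real systems such as OSTree operate on filesystem trees and bootloader entries, and the image contents and health check here are invented for illustration.

```python
# Toy model of transactional OS updates with rollback.
# Real systems (e.g. OSTree) work on filesystem trees, not dicts;
# the "boots" flag stands in for a real post-deploy health check.

class ImageOS:
    def __init__(self, initial_image):
        self.snapshots = [initial_image]   # known-good images, newest last
        self.running = initial_image

    def healthy(self, image):
        # Stand-in for a real boot/health check.
        return image.get("boots", True)

    def update(self, new_image):
        """Deploy a whole new image; roll back if it doesn't work."""
        if self.healthy(new_image):
            self.snapshots.append(new_image)
            self.snapshots = self.snapshots[-2:]   # keep 2+ snapshots around
            self.running = new_image
        else:
            self.running = self.snapshots[-1]      # last known good

os_ = ImageOS({"version": 41, "boots": True})
os_.update({"version": 42, "boots": False})   # bad image: rolled back
assert os_.running["version"] == 41
os_.update({"version": 42, "boots": True})    # good image: deployed
assert os_.running["version"] == 42
```

The point of the design is visible even in the toy: there is no per-package state to get half-updated, only whole images that either deploy cleanly or get discarded.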


This is a good thing. Software bloat is vast now. OSes are too complex for most people to understand, maintain, or fix. So you don't: even the ability to do so is removed.

This is in parallel with server deployments:

  • everything is virtualised: OSes only run in VMs with standardised hardware, the network connections are virtualised, the disks are virtualised.

  • VMs are built from templates and deployed automatically as needed, and destroyed again as needed.

  • there is as little local state as possible in any VM. Each VM gets its state automatically from a database over the network. The database is in a VM too, of course.

  • as few local config files as possible; config is kept in a database too and pushed out to local database instances
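
The stateless-VM pattern above can be sketched as follows. Again a toy model: the roles, key names, and in-memory "store" are invented stand-ins for a real networked config store (etcd, a SQL database, etc.).

```python
# Toy model: VMs hold no local config; each one pulls its settings
# from a central store at boot. The store here is a plain dict
# standing in for a real database reached over the network.

CONFIG_STORE = {
    "web": {"port": 8080, "workers": 4},
    "db":  {"port": 5432, "workers": 1},
}

def boot_vm(role):
    """'Boot' a VM from a template: every instance starts identical,
    and everything role-specific comes from the central store."""
    config = CONFIG_STORE[role]   # fetched over the network in reality
    return {"role": role, **config}

web_vm = boot_vm("web")
assert web_vm["port"] == 8080

# Destroy and redeploy: a fresh VM picks up the updated central config,
# so nothing ever needs editing on the VM itself.
CONFIG_STORE["web"]["workers"] = 8
web_vm = boot_vm("web")
assert web_vm["workers"] == 8
```

Because the VM carries no state of its own, "fixing" one is the same operation as deploying one, which is exactly why this model scales to huge fleets.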


I could go on.

Unix is a late-1960s OS designed for late-1960s minicomputers:

  • big standalone non-networked servers with lots of small disks, shared by multiple interactive users on dumb text terminals

  • users built their own software from source

  • everything is a text file. Editors and piping are key tools.


With some 1970s tech on top that the industry spent 25 years getting working stably:

  • framebuffers and hi-res graphic displays are possible but very expensive

  • so, design for graphical terminals, or micros that are dedicated display servers

  • programs run over the network, executing on 1 machine, displaying on another

  • Ethernet networking has been bolted on. TCP/IP is the main protocol.

  • because GUIs and networking are add-ons, they break the "everything is a file" model. This is ignored. Editors etc. do not allow for it, let alone use it.

  • machines treat one another as hostile. There is no federation, no process migration, etc.


Then in the 1980s this moribund minicomputer OS got a 2nd lease of life and started selling well because microcomputers got powerful enough to run it, growing up into expensive high-power workstations:

  • some effort at network integration: tools were bolted on top for distributing text-only config files automatically, machines could query each other to find resources

  • encryption was added for moving stuff over untrusted networks

  • a lot of focus on powerful programming tools and things like maths tools, 3D modelling tools

  • very little focus on user-friendliness or ease of use, as that sector was dominated by Macs, Amigas etc.

  • much of this stuff is proprietary because of the nature of the business model.

  • server support is half-hearted as there are dedicated server OSes for that


In the 1990s things changed again:

  • plain cheap PCs became powerful enough to run Unix usefully

  • the existing vendors flailed around trying to sell it but mostly failed as they kept their very expensive pricing models from the workstation era

  • FOSS re-implementations replaced it, piggybacking on tech developed for Windows

  • After about 1½ decades of work, the leading FOSS *nix becomes a usable desktop OS. Linux wins. FreeBSD trails, but has some good work -- much of this goes into Mac OS X


Early 21st century:

  • high-speed Internet access can be assumed

  • non-technical end-users become a primary "market"

  • now it runs on local 64-bit multi-CPU micros with essentially infinite disk

  • it has a local 3D accelerator for a display


Results...

  • traditional troubleshooting/fault finding is obsolete. No need for keeping admin tools separate from user tools, no need for /bin and /sbin, /usr/bin and /usr/sbin, etc. Boot off a DVD or a USB, recover user data if any, nuke the OS and reload.

  • GUIs favour 3D chrome. When harmony is achieved & everyone standardises on GNOME 2, Microsoft attacks it and destroys it, resulting in vast duplication of desktop functionality and a huge amount of wasted effort.

  • Because of poor app portability between distros, just like in the days of proprietary Unix, only a few big-name apps exist for all distros.

  • Linux is mainly only usable for Web/email/chat/simple office stuff, and traditional coder work. Windows and Mac hoover up all of the rich-local-apps market, including games. Linux vendors do not even notice.

  • Linux on conventional desktops/laptops is weak, but that market is shrinking fast. But...

  • Not-really-Linux-any-more phone/tablet OSes are thriving

  • Consumer Internet use is huge, for content consumption, social networking, and retail


This drives a need for vast server farms, with the lowest possible unit software cost.

  • tools for automation -- for deployment, management, scaling -- are big money

  • because the job market is huge, skill levels are relatively low, so automated distribution of workloads is key:

      - tools for deploying & re-deploying VM images automatically in case of failure of the contained app

      - tools for large teams to interwork on incremental, iterative software development

      - tools for bolting together existing components, with automated building, testing, packaging and deployment

  • as the only significant successful end-user apps are web browsers, all tools move onto the web platform:

      - web mail, web chat, web media, web file storage, web config management

  • Result: tooling written in Web tools -- JavaScript -- displaying over Web UIs (browser rendering engines)

  • On the server end, inefficiency can be solved by deploying more servers. They're cheap, the software is free.

  • On the client end, most focus is on fast browsers and using games acceleration hardware to deliver fast web browsing, media playback, and hardware accelerated UI


So the only possible method of fighting back and trying to deliver improved end-user tooling for power users is to use a mixture of web tools and games hardware.

Result: OSes that need 3D OpenGL compositing, with desktops and apps written in JavaScript, and packaging and deployment methods taken from those designed for huge server farms.

  • GNOME 3 and Cinnamon, and a distant 3rd, KDE. (The only others are principally defined by refusal to conform.)

  • Flatpak, Snappy and a distant 3rd, Appimage

  • systemd, and an increasing move away from text files for config and logging. Server-farm tools use database connections because, in the 1980s & 1990s, nobody in the Unix world saw any reason to try to copy Microsoft's LAN Manager domains, Novell NDS, Banyan VINES' StreetTalk, or any other more sophisticated LAN management tools.


Gosh. That turned into quite a rant.

Anyway. The Linux desktop is going to continue to move away from familiar *nix ways because they are historical now. Because the Linux desktop is only a tiny parasite on the flank of the vast Linux server market, it gets tooling designed for that.

If you want a more traditional Unix experience, try FreeBSD. It's thriving on the back of the move to systemd and so on.

Page generated Jul. 8th, 2025 02:40 pm
Powered by Dreamwidth Studios