liam_on_linux: (Default)
I have just recently discovered that my previous post about Commodore BASIC went modestly viral, not only featuring on Hacker News but also getting its own story on Hackaday.

Gosh.

This in itself has provoked some discussion. It's also resulted in a lot of people telling me that NO COMMODORE WAS TEH AWESOME DONT YOU EVEN and so on, as one might expect. Some hold that the C64's lousy PET BASIC was a good thing because it forced them to learn machine code.

People on every 8-bit home micro who wanted to do games and suchlike learned machine code, and arguably there is good utility in that. The 8-bitters just didn't have the grunt to run any interpreter that fast, and most of the cut-price home machines didn't have the storage to do justice to compilers.

But for those of us who never aspired to do games, who were just interested in playing around with algorithms, graphics, graphs and fractals and primitive 3D and so on, there was an ocean of difference between a good-enough BASIC, like Sinclair BASIC, and the stone-age 1970s ones that Commodore shipped, designed for machines that had no graphics or sound. I learned BASIC on a PET 4032, but I never wanted a PET of my own -- too big, too expensive, and kinda boring. And what use is an all-singing, all-dancing colour computer with the best music chip on the market if it has PET BASIC, with all the sound and pictures and motion of which a PET was capable? (I.e. none.)

I used my Spectrum, got a better Spectrum with more RAM, then got a PCW and learned a bit of CP/M, and then I got an Archimedes and a superb BASIC that was as quick as Z80 assembly on a Spectrum.

But what occurred to me recently was that, as I discovered from ClassicCmp, a lot of Americans barely know that there were computer markets other than the American one. They don't know that there were cheaper machines with capabilities comparable to the C64's, but with better BASICs (or much better, world-class ones). They don't know that other countries' early-1980s 8-bit BASICs were capable of being rich, powerful tools for learning advanced stuff like recursion, and for drawing full-colour high-res fractals using said recursion, entirely in BASIC.
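The kind of exercise described there, recursion driving graphics, really is only a few lines in any decent BASIC. As a purely illustrative sketch of my own (in Python rather than period BASIC, rendering in ASCII rather than Spectrum or BBC graphics), here is a Sierpinski triangle drawn by nothing but recursion:

```python
def sierpinski(order):
    """Return the Sierpinski triangle of the given order as text rows."""
    if order == 0:
        return ["*"]                        # base case: a single point
    prev = sierpinski(order - 1)            # recurse on the smaller triangle
    pad = " " * len(prev)                   # centre the top copy
    top = [pad + row + pad for row in prev]     # one copy on top...
    bottom = [row + " " + row for row in prev]  # ...two copies side by side
    return top + bottom

for row in sierpinski(3):
    print(row)
```

The point is not the output but the shape of the program: a self-similar picture falls straight out of a self-calling function, which is exactly the sort of thing a good home-micro BASIC made accessible.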

For many people, Atari and Apple were mid-price-range and Commodore were cheap, and MS BASIC was basically all there was.

In the last 30 years, America has largely guided the world of software development. The world runs on two software ecosystems: the DOS/Windows line (both American, derived from DEC OSes, which were also American), and the various forms of Unix (also American).

All the other OSes and languages are mostly dead.

• Ada, the *fast* type-safe compiled language (French)? Largely dead in the market.

• The Pascal/Modula-2/Oberon family, a fast garbage-collected compiled family suitable for OS kernels (Swiss), or the pioneering family of TUI/GUI OSes that inspired Plan 9, Acme, & Go? Largely dead.

• Psion/EPOC/Symbian (British), long-battery-life elegant multitasking keyboard-driven PDAs, and later their super-fast realtime-capable C++ smartphone OS that could run the GSM comms stack on the same CPU as the user OS? Totally dead.

• Nokia's elegant, long-life, feature-rich devices, from the company that popularised first the cellphone and then the smartphone? Now it rebadges Chinese/American kit.

• Acorn RISC OS (British), the original ARM OS, limited but tiny and blindingly fast and elegant? Largely dead.

• DR-DOS, GEM, X/GEM, FlexOS -- mostly the work of DR's UK R&D office? Dead, and the American company that inherited the remains never properly open-sourced them.

• LocoScript, possibly the best and richest 8-bit word processor ever; the pioneering GUI language BASIC+; and Turnpike, the first integrated Internet suite for Windows -- all from British Locomotive Software? Dead.

In my early years in this business, in the 1980s and 1990s, there were as many important European hardware and software products as there were American, including European CPUs and European computer makers, and European software on American hardware.

Often, the most elegant products -- the most powerful (e.g. the Archimedes), the most efficient (e.g. Psion), or those with the longest battery life (e.g. Nokia) -- are all dead and gone, and nearly forgotten.

30y ago I had a personal RISC workstation for under $1000 that effortlessly outperformed IBM's fastest desktop computers costing 10x more. British.

25y ago I had excellent multiband mobile phones with predictive text and an IrDA link to my PDA. The phone lasted a week on a charge, and the PDA a month or two on two AA batteries. British and Finnish.

15y ago I had a smartphone that lasted a few days on a charge, made by the company that made the phone above, running software from the PDA company. Finnish.

Now, I have sluggish desktops and sluggish laptops, coupled with phones that barely last a day...

And I think a big reason is that Europe was poorer, so product development was all about efficiency, cost-reduction, high performance and sparing use of resources. The result was very fast, efficient products.

But that's not the American way, which is to generalise. Use the most minimal, close-to-the-metal language that will work. Don't build new OSes -- reuse old ones, and old, tried-and-tested tools and methods. Use the same OS on desktop, laptop, server and phone. Moore's Law will catch up and fix the performance.

It's resulted in amazing products of power and bling... but they need teams of tens of thousands to fix the bugs caused by poor languages and 1970s designs, and a gigabyte of updates a month to keep them functional. It's also spawned an industry worth hundreds of millions in exploiting security holes, both by criminals and by the developing-world call-centre businesses providing the first-line support these overcomplex products need.

And no, I am not blaming all that on Commodore or the C64! 😃 But I think some of the blame can be pointed that way. Millions of easily-led kids were shown proof that BASIC is crap and that you've got to get close to the metal to make it work well -- all because one dumb company cut a $20 corner too many.
[A friend asked why, if Lisp was so great, it never got a look-in when Ada was designed.]

My impression is that it’s above all else cultural.

There have long been multiple warring factions depending on deeply-felt beliefs about how computing should be done: EBCDIC versus ASCII, RISC vs CISC, C vs Pascal, etc. Now it’s mostly sorted inasmuch as we all use Unix-like OSes — the only important exception, Windows, is becoming more Unix-like — and other languages etc. are layered on top.

But it goes deeper than, e.g., C vs Pascal, or BASIC or Fortran or whatever. There is the imperative versus functional camp. Another split is algebraic expressions versus non-algebraic: i.e. infix versus prefix or postfix (stack-oriented RPN), or something other, such as APL/I/J/A+. Then there is manual memory management versus automatic with GC, and strongly versus weakly typed (with sub-battles such as manifest versus inferred/duck typing, static vs dynamic, etc.).
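For anyone who hasn’t met the non-algebraic camps: the same arithmetic reads quite differently in each notation, and postfix needs nothing but a stack to evaluate. A minimal sketch of my own, in Python rather than any of the languages named:

```python
import operator

# The same expression in the three notational camps:
#   infix (algebraic):      (3 + 4) * 2
#   prefix (Lisp-style):    (* (+ 3 4) 2)
#   postfix (RPN, Forth):   3 4 + 2 *

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_rpn(tokens):
    """Evaluate a postfix (RPN) token list using an explicit stack."""
    stack = []
    for tok in tokens:
        if tok in OPS:
            b = stack.pop()             # note the operand order:
            a = stack.pop()             # the second pop is the left operand
            stack.append(OPS[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

print(eval_rpn("3 4 + 2 *".split()))    # → 14.0
```

No parentheses, no precedence rules: operand order on the stack does all the work, which is why calculator firmware and Forth interpreters could be so tiny.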

Mostly, the wars settled on: imperative; algebraic (infix) notation; manual memory management for system-level code and for externally-distributed code (commercial or FOSS), and GC Pascal-style languages for a lot of internal corporate s/w development (Delphi, VB, etc.).

FP, non-algebraic notation and things like them were thus sidelined for decades, but are now coming back, layered on top of complex OSes written in C-like languages. This is an era of proliferation in dynamic, interpreted or JITted languages used for specific niche tasks, running on top of umpteen layers of general-purpose OS. Examples range across JavaScript, Perl 6, Python, Julia, Clojure, Ruby and tons more.

Meanwhile, new safer members of the broader C family of compiled languages, such as Rust and Go, and stretching a point Swift, are getting attention for more performance-critical app programming.

All the camps have strong arguments. There are no single right or wrong answers. However, cultural pressure and uniformity mean that outside of certain niches, we have several large camps or groups. (Of course, individual people can belong to more than one, depending on job, hobby, whatever.)

C and its kin are one, associated with Unix and later Windows.

Pascal and its kin, notably Object Pascal in Delphi/FPC, form another. BASIC now means VB, and that means the .NET family of languages, another camp. Both have historically been mainly part of the MS camp, but are now reaching out, against some resistance, into Unix land.

Java forms a camp of its own, but there are sub-camps of non-Java-like languages running on the JVM — Clojure, Scala, etc.

Apple’s flavour of Unix forms another camp, comprising ObjC and Swift, having abandoned outreach efforts.

People working on the development of Unix itself tend to strongly favour C above all else, and like relatively simple, old-fashioned tools — ancient text editors, standalone compilers. This has influenced the FOSS Unix GUIs and their apps.

The commercial desktop app developers are more into IDEs and automation; these days this covers .NET and JVM camps, and spans all OSes, but the Pascal/VM camp are still somewhat linked to Windows.

The people doing niche stuff, for their own needs or their organisations, which might be distributed as source — which covers sysadmins, devops and so on — are more into scripting languages, where there’s terrific diversity.

Increasingly the in-house app devs are just using Java, be they desktop or server apps. Indeed “desktop” apps of this type might now often mean Java server apps generating a remote UI via web protocols and technologies.

Multiple camps and affiliations. Many of them disdain the others.

A summary of how I’m actually addressing your question:

These are the dominant camps, AFAICS. So when a new “safe”, “secure” language was being built, “weird” niche things like Lisp, Forth, or APL never had a chance of a look-in. So it came out looking a bit Pascal- and BASIC-like, as those are the languages on the safe, heavily-type-checked side of the fence.

A more general summary:

I am coming to think that there are cultural forces stronger than technical forces involved in language choice.

Some examples I suspect that have been powerful:

Lisp (and FP generally) are inherently complex to learn and to use, and require exceptionally high intelligence of certain focussed forms. Some people perfectly able to be serviceable, productive coders in simple imperative languages find themselves unable to fathom these styles of programming. Their response is resentment, and to blame the languages, not themselves. (The Dunning-Kruger effect is not confined to those of low intelligence.)

This has resulted in the marginalisation of these technologies as the computing world became vastly more commoditised and widespread. Some people can’t handle them, and some of them end up in positions of influence, so teaching switched away from them and now students are taught in simpler, imperative languages. Result, there is a general perception that some of these niche tools are exotic, not generally applicable or important, just toys for academics. This isn’t actually true but it’s such a widespread belief that it is self-perpetuating.

This also applies to things like Haskell, ML/OCaml, APL, etc.

On the flip side: programming and IT are male-dominated industries, for no very good reason. This results in masculine patterns of behaviour having profound effects and influences.

So, for instance, languages in the Pascal family have safety as a priority and try to protect programmers from errors, possibly by not allowing them to write unsafe code. A typically masculine response to this is to resent the exertion of oppressive control.

Contrastingly, languages in the BCPL/C/C++ family give the programmer extensive control and require considerable discipline and care to write safe code. They allow programmers to make mistakes which safer languages would catch and prevent.
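As a small illustrative sketch of that contrast (my own example, in Python standing in for the bounds-checked side of the fence): a copy routine that checks its destination before writing, where the equivalent unchecked C `strcpy`-style loop would silently write past the end of the buffer.

```python
def fill(buf, values):
    """Copy values into buf, checking the destination size first."""
    if len(values) > len(buf):
        raise ValueError("destination too small")   # the check C omits
    for i, v in enumerate(values):
        buf[i] = v
    return buf

buf = [0] * 4
print(fill(buf, [1, 2, 3, 4]))      # fits exactly: [1, 2, 3, 4]

try:
    fill([0] * 4, [1, 2, 3, 4, 5])  # one element too many
except ValueError as e:
    print("caught:", e)             # the unchecked C version would
                                    # corrupt adjacent memory instead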

This has a flip side, though: the greater control potentially permits or offers theoretically higher performance.

This aligns with “manly” virtues of using powerful tools — the appeal of chainsaws, fast cars and motorcycles, big powerful engines, even arguably explicitly dangerous things like knives and guns. Cf. Perl, “the Swiss Army chainsaw”.

Thus, the masculine culture around IT has resulted in people favouring these languages. They’re dangerous in unskilled hands. So, get skilled, then you can access the power.

Of course, again, as Dunning-Kruger teaches us, people cannot assess their own skill, and languages which permit bugs that others would trap have been used very widely for three decades or more, often justified by performance but actually because of toxic culture. All OSes are written in them; as a result it is now a truism that only these languages are suitable for writing OSes.

(Ignoring the rich history of OSes in safer languages — Algol, Lisp, Oberon, perhaps even Mesa, or Pascal in the early Macs.)

If you want fast code, you need a fast language! And Real Men use C, and you want to be a Real Man, don’t you?

Cf. the story of Mel The Real Programmer.

Do it in something low-level, manage your own memory. Programming is a game for the smart, and you must be smart because you’re a programmer, so you can handle it and you won’t drop a pointer or overflow an array.

Result, decades of complex apps tackling arbitrary complex data — e.g. Web browsers, modern office suites — written in C, and decades of software patching and updating trying to catch the legions of bugs. This is now simply perceived as how software works, as normal.

Additionally, in many cases, any possible performance benefits have long been lost due to large amounts of protective code, of error-checking, in libraries and tools, made necessary by the problems and inherent fragility of the languages.

The rebellion against it comes only from niche line-of-business app developers doing narrow, specific stuff, who are moving to modern interpreted languages running on top of tens of millions of lines of C written by coders who are only just able to operate at this level of competence and who make lots of mistakes.

For people not facing the pressures of commercial releases, there was an era of using safer, more protective compiled languages for in-company apps — Turbo Pascal, Delphi, VB. But that’s fading away now in favour of Java and .NET, “managed” languages running under a VM, with concomitant loss of performance but slight improvement in safety and reliability.

And because this has been widespread for some 2-3 decades, it’s now just _how things are done_. So if someone presents evidence and accounts of vastly better programmer productivity in other tools, decades ago, in things like Lisp or Smalltalk, then these are discounted as irrelevant. Those are not manly languages for manly programmers and so should not be considered. They’re toys.

People in small enough niches continue to use them but have given up evangelising about them. Like Mac users, their comments are dismissed as fanboyism.

So relatively small cultural effects have created immensely strong cultures, dogmas, about what is or isn’t a good choice for certain categories of problem. People outside those categories continue to use some of these languages and tools, while others languish.

This is immensely sad.

For instance, there have been successful hybrid approaches.

OSes written in Pascal derivatives, or in Lisp, or in Smalltalk, are now lost to history. As a result, processor design itself has shifted: companies make processors that run C and C-like languages efficiently, and processors that understood richer primitives — lists, or objects — are now historical footnotes.

And languages which attempted to straddle different worlds — such as infix-notation Lisp derivatives, readable and easily learnable by programmers who only know infix-based, imperative languages — e.g. Dylan, PLOT, or CGOL — are again forgotten.

Or languages which developed down different avenues, such as the families of languages based on or derived from Oberon, or APL, or ML. All very niche.

And huge amounts of precious programmer time and effort are expended fighting against limited and limiting tools, not well suited to large complex projects, because programmers simply do not know that there are, or were, alternatives. These have been crudely airbrushed out of history, like disappearing Soviet commissars.

“And so successful was this venture that very soon Magrathea itself became the richest planet of all time, and the rest of the galaxy was reduced to abject poverty. And so the system broke down, the empire collapsed, and a long, sullen silence settled over the galaxy, disturbed only by the pen-scratchings of scholars as they laboured into the night over smug little treatises on the value of a planned political economy. In these enlightened days, of course, no one believes a word of it.”

(Douglas Adams)
