liam_on_linux: (Default)
I must be mellowing in my old age (possibly as opposed to bellowing) because I have been getting praise and compliments recently on comments in various places.

Don't worry, there are still people angrily shouting at me as well.

This was the earlier comment, I think. There was a slightly forlorn post in the Reddit Lisp community, talking about this article, which I much enjoyed myself:

Someone was asking why so few people seemed interested in Lisp.

As an outsider – a writer and researcher delving into the history of OSes and programming languages far more than an actual programmer – my suspicion is that part of the problem is that this positively ancient language has accumulated a collection of powerful but equally ancient tooling, and that this is profoundly off-putting to the young people, used to modern tools, who approach it.

Let me describe what happened and why it's relevant.

I am not young. I first encountered UNIX in the late 1980s, and UNIX editors at the same time. But I had already gone through multiple OS transitions by then:

[1] weird tiny BASICs and totally proprietary, very limited editors.

[2] early standardised microcomputer OSes, such as CP/M, with more polished and far more powerful tools.

[3] I personally went from that to an Acorn Archimedes: a powerful 32-bit RISC workstation with a totally proprietary OS (although it's still around and it's FOSS now) descended from a line of microcomputers as old as CP/M, meaning no influence from CP/M or the American mainstream of computers. Very weird command lines, very weird filesystems, very weird editors, but all integrated and very powerful and capable.

[4] Then I moved to the same tools I used at work: DOS and Windows, although I ran them under OS/2. I saw the strange UIs of CP/M tools that had come across to the DOS world run up against the new wave of standardisation imposed by (classic) MacOS and early Windows.

This meant: standard layouts for menus, contents of menus, for dialog boxes, for keystrokes as well as mouse actions. UIs got forcibly standardised and late-1980s/early-1990s DOS apps mostly had to conform, or die.

And they did. Even then-modern apps like WordPerfect gained menu bars and changed their weird keystrokes to conform; where their own UIs conflicted, the standards took over. WordPerfect had a very powerful, efficient UI driven by function keys, but it wasn't compatible with the new standards. It used F3 for help and Escape to repeat a character, command or macro. The new standards said F1 must be help and Esc must be cancel. So WordPerfect complied.

And until the company stumbled, porting to OS/2 and ignoring Windows until it was too late, it worked. WordPerfect remained the dominant industry-standard, even as its UI got modernised. Users adapted.

So why am I talking about this?

Because the world of tools like Emacs never underwent this modernisation.

Like it or not, for 30 years now, there's been a standard language for UIs and so on. Files, windows, the clipboard, cut, copy, paste. Standard menus in standard places and standard commands on them with standard keystrokes.

Vi ignores this. Its fans love its power and efficiency and are willing to learn its weird UI.

Emacs ignores this, for the same reasons. The manual and tutorial talk about "buffers" and "scratchpads" and "Meta keys" and dozens of things that no computer made in the last 40 years has: a whole different language, from before the Mac and DOS and Windows transformed the world of computing.

The result of this is that if you read guides and so on about Lisp environments, they don't tell you how to use it with the tools you already know, in terms you're familiar with.

Instead they recommend really weird editors and weird add-ons and tools and options for those editors, all from long before this era of standardization. They don't discuss using Sublime Text or Atom or VS Code: no, it's "well you can use your own editor but we recommend EMACS and SLIME and just learn the weird UI, it's worth it. Trust us."

It's counter-productive and it turns people off.

I propose that a better approach would be to modernise some of the tooling: forcibly make it conform to modern standards. I'm not talking about trivial stuff like CUA-mode, but bigger changes, such as ErgoEmacs. By all means leave the old UI there and make it possible for those who have existing configs to keep it, but update the tools to use standard terminology, to use the names printed on actual 21st-century keyboards, and to work the same way as every single GUI editor out there.

Then once the barrier to entry is lowered a bit, start modernising it. Appearance counts for a lot. "You never get a second chance to make a first impression."

One FOSS tool that's out there is Interlisp Medley. There are efforts afoot to modernise this for current OSes.

How about just stealing the best bits and moving them to SBCL? Modernising its old monochrome GUI and updating its look and feel so that it blends into a modern FOSS desktop?

Instead of pointing people at '70s tools like Emacs, assemble an all-graphical, multi-window, interactive IDE on top of the existing infrastructure and make it look pretty and inviting.

Keep the essential Lispiness by all means, but bring it into the 2020s and make it pretty and use standard terms and standard keystrokes, menu layouts, etc. So it looks modern and shiny, not some intimidating pre-GUI-era beast that will take months to learn.

Why bother? Who'll do it?

Well, Linux spent a decade or more as a weird, clunky, difficult and very specialist OS, which was just fine for its early user community... until it started catching up with Windows and Mac and growing into a pretty smooth, polished, quite modern desktop, partly fuelled by server advancements. Things like NetBSD still are that kind of specialist OS, and have zero mainstream presence.

Summary: you have to get in there and play the mainstream's game by its rules, if you want to compete.

I'd like to have the option to give Emacs a proper try, but I am not learning an entire new vocabulary and a different UI to do it. I learned dozens of 'em back in the 1980s and it was a breath of fresh air when one standard one swept them all away.

There were very modern Lisp environments around before the rise of the Mac and Windows swept all else away. OpenGenera is still out there, but we can't legally run it any more -- it's IP that belongs to the people who inherited Symbolics when its founders died.

But Interlisp/Medley is still there and it's FOSS now. I think hardcore Lispers see stuff like a Lisp GUI and natively-graphical Lisp editors as pointless bells and whistles – Emacs was good enough for John McCarthy and it still is for me! – but they really are not in 2021.

There were others, too. Apple's Dylan project was built in Lisp, as was the amazing SK8 development environment. They're still out there somewhere.

liam_on_linux: (Default)

[Another repurposed comment from the same Lobsters thread I mentioned in my previous post.]

A serious answer deserved a serious response, so I slept on it, and, well, as you can see, it took some time. I don't even have the excuse that "I have made this one longer only because I have not had the leisure to make it shorter."

If you are curious to do so, you might be amused to look through my older tech-blog posts – for example this or this.

The research project that led to these 3 FOSDEM talks started over a decade ago when I persuaded my editor that retrocomputing articles were popular & I went looking for something obscure that nobody else was writing about.

I looked at various interesting long-gone platforms or technologies – some of the fun ones were Apollo Aegis & DomainOS, SunDew/NeWS, the Three Rivers PERQ etc. – that had or did stuff nothing else did. All were either too obscure, or had little to no lasting impact or influence.

What I found, in time, were Lisp Machines. A little pointy lump in the soil, which as I kept digging turned into the entire Temple of Damanhur. (Anyone who's never heard of that should definitely look it up.) And then, as I kept digging, the entire war for the workstation, between whole-dynamic-environment languages (Lisp & Smalltalk, but there are others) and the reverse, the Unix way: the easy-but-somehow-sad environment of code written in an unsafe, hacky language, compiled to binaries, and run on an OS whose raison d'être is to "keep 'em separated": to turn a computer into a pile of little isolated execution contexts, which can only pass info to one another via plain text files. An ugly, lowest-common-denominator sort of OS, but one which succeeded and thrived because it was small, simple, easy to implement and to port, relatively versatile, and didn't require fancy hardware.

That at one time, there were these two schools – that of the maximally capable, powerful language, running on expensive bespoke hardware but delivering astonishing abilities... versus a cheap, simple, hack of a system that everyone could clone, which ran on cheap old minicomputers, then workstations with COTS 68K chips, then on RISC chips.

(The Unix Haters Handbook was particularly instructive. Also recommended to everyone; it's informative, it's free and it's funny.)

For a while, I was a sort of Lisp zealot or evangelist – without ever having mastered it myself, mind. It breaks my brain. "The Little Lisper" is the most impenetrable computer publication I've ever tried, and failed, to read.

A lot of my friends are jaded old Unix pros, like me having gone through multiple proprietary flavours before coming to Linux. Or possibly a BSD. I won serious kudos from my first editor when I knew how to properly shutdown a Tadpole SPARCbook with:


sync
sync
sync
halt

"What I tell you three times is true!" he crowed.

Very old Unix hands remember LispMs. They've certainly met lots of Lisp evangelists. They got very tired of me banging on about it. Example – a mate of mine said on Twitter:

«
A few years ago it was lisp is the true path. Before that it was touchscreens will kill the keyboard.
»

The thing is, while going on about it, I kept digging, kept researching. There's more to life than Paul Graham essays. Yes, the old LispM fans were onto something; yes, the world lost something important when they were out-competed into extinction by Unix boxes; yes, in the right hands, it achieves undreamed-of levels of productivity and capability; yes, the famous bipolar Lisp programmer essay.

But there are other systems which people say the same sorts of things about. Not many. APL, but even APL fans recognise it has a niche. Forth, mainly for people who disdain OSes as unnecessary bloat and roll their own. Smalltalk. A handful of others. The "Languages of the Gods".

Another thing I found is people who'd bounced off Lisp. Some tried hard but didn't get it. Some learned it, maybe even implemented their own, but were unmoved by it and drifted off. A lot of people deride it – L.I.S.P. = Lotsa Insignificant Stupid Parentheses, etc. – but some of them do so with reason.

I do not know why this is. It may be a cultural thing; it may be a matter of which forms of logic and reasoning feel natural to different people. I had a hard time grasping algebra as a schoolchild. (Your comment about "grade school" stuff is impenetrable to me. I'm not American so I don't know what "grade school" is, I cannot parse your example, and I don't know what level it is aimed at – but I suspect it's above mine. I failed 'O' level maths and had to resit it. The single most depressing moment of my biology degree was when the lecturer for "Intro to Statistics" said he knew we were all scared, but it was fine; for science undergraduates like us, it would just be revision of our maths 'A' level. If I tried, I'd never even have got good enough exam scores to be rejected for a maths 'A' level.)

When I finally understood algebra, I "got" it and it made sense and became a useful tool, but I have only a weak handle on it. I used to know how to solve a quadratic equation but I couldn't do it now.

I never got as far as integration or differentiation. I only grasped them at all when trying to help a member of staff with her comp-studies homework. It's true: the best way to learn something is to teach it.

Edsger Dijkstra was a grumpy git, but when he said:

“It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration”

... and...

“The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence.”

... I kind of know what he meant. I disagree, obviously, and I am not alone, but he did have a core point.

I think possibly that if someone learned Algol-style infix notation when they were young, and it's all they've ever known, then when someone comes along and tells them that it's all wrong, to throw it away and do it like this – or possibly (this(like(do(it)))) – instead, it is perfectly reasonable to reject it.

Recently I used the expression A <> B to someone online and they didn't understand. I was taken aback. This is BASIC syntax and was universal when I was under 35. No longer. I rephrased it as A != B and they understood immediately.

Today, C syntax is just obvious and intuitive. As Stephen Diehl said:

«
C syntax is magical programmer catnip. You sprinkle it on anything and it suddenly becomes "practical" and "readable".
»

I submit that there are some people who cannot intuitively grasp the syntaxless list syntax of Lisp. And others who can handle it fine but dislike it, just as many love Python indentation and others despise it. And others who maybe could but with vast effort and it will forever hinder them.

Comparison: I am 53 years old, I emigrated to the Czech Republic 7 years ago and I now have a family here and will probably stay. I like it here. There are good reasons people still talk about the Bohemian lifestyle.

But the language is terrifying: 4 genders, 7 cases, all nouns have 2 plurals (2-4 & >=5), a special set of future tenses for verbs of motion, & two entire sets of tenses – verb "aspects", very broadly one for things that are happening in the past/present/future but are incomplete, and one for things in the past or present that are complete.

After 6 years of study, I am an advanced beginner. I cannot read a headline.

Now, context: I speak German, poorly; I learned it in 3 days of hard work while travelling there on a bus. I speak passable French after a few years of it at school. I can get by in Spanish, Norwegian and Swedish after a few weeks each.

I am not bad at languages, and I'm definitely not intimidated by them. But learning your first Slavic language in your 40s is like climbing Everest with 2 broken legs.

No matter how hard I try, I will never be fluent. I won't live long enough.

Maybe if I started Russian at 7 instead of French, I'd be fine, but I didn't. But 400 million people speak Slavic languages and have no problems with this stuff.

I am determined. I will get to some useful level if it kills me. But I'll never be any good and I doubt I'll ever read a novel in it.

I put it to you that Lisp is the same thing. That depending on aptitude or personality or mindset or background, for some people it will be easy, for some hard, and for some either impossible or simply not worth the bother. I know many Anglophones (and other first-language speakers) who live in Czechia who just gave up on Czech. For a lot of people, it's just too hard as an adult. My first course started with 15 students and ended with 3. This is on the low side of normal; 60% of students quit in the first 3 months, after paying in full.

And when people say that "look, really, f(a,b) is the same thing as (f a b)", or tell us that we'll just stop seeing the parentheses after a while (see slides 6 & 7), IT DOES NOT HELP. In fact, it's profoundly off-putting.

I am regarded as a Lisp evangelist among some groups of friends. I completely buy and believe, from my research, that it probably is the most powerful programming language there's ever been.

But the barrier to entry is very, very high, and it would better serve the Lisp world to recognise and acknowledge this than to continue 6 decades of denialism.

Before this talk, I conferred with 2 very smart programmer friends of mine about the infix/prefix notation issue. ISTM that it should be possible to have a smart editor that could convert between the two, or even round-trip convert a subset of them.
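For what it's worth, one direction of that conversion is mechanically almost trivial. Here is a toy sketch that leans on Python's own parser for the infix side and emits S-expressions; the function names are my own invention, purely illustrative, and a real editor mode would have to handle calls, assignment, comments and the reverse direction too:

```python
# Toy infix-to-prefix converter: parse ordinary algebraic notation
# with Python's ast module, then render it as a Lisp-style S-expression.
import ast

# Map Python's operator node types to Lisp operator symbols.
OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}

def to_prefix(node):
    """Render a parsed expression tree as an S-expression string."""
    if isinstance(node, ast.Expression):
        return to_prefix(node.body)
    if isinstance(node, ast.BinOp):
        op = OPS[type(node.op)]
        return f"({op} {to_prefix(node.left)} {to_prefix(node.right)})"
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Constant):
        return str(node.value)
    raise ValueError(f"unhandled node: {node!r}")

def infix_to_prefix(src):
    # mode="eval" restricts the parse to a single expression.
    return to_prefix(ast.parse(src, mode="eval"))

print(infix_to_prefix("a + b * c"))    # (+ a (* b c))
print(infix_to_prefix("(a + b) * c"))  # (* (+ a b) c)
```

The hard part isn't arithmetic like this; it's round-tripping the rest of a real program without losing anything, which is presumably why only a subset could ever go both ways.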

This is why I proposed Dylan on top of Lisp, not just Lisp. Because Lisp frightens people and puts them off, and that is not their fault or failing. There was always meant to be an easier, more accessible form for the non-specialists. Some of my favourite attempts were CGOL and Lisp wizard David A. Moon's PLOT. If Moon thinks it's worth doing, we should listen. You might have heard of this editor he wrote? It's called "Emacs". I hear it's quite something.

liam_on_linux: (Default)
My talk should be on in about an hour and a half from when I post this.

«

A possible next evolutionary step for computers is persistent memory: large capacity non-volatile main memory. With a few terabytes of nonvolatile RAM, who needs an SSD any more? I will sketch out a proposal for how to build a versatile, general-purpose OS for a computer that doesn't need or use filesystems or files, and how such a thing could be built from existing FOSS code and techniques, using lessons from systems that existed decades ago and which inspired the computers we use today.

Since the era of the mainframe, all computers have used hard disks and at least two levels of storage: main memory, or RAM, and secondary or auxiliary storage: disk drives, accessed over some form of disk controller using a file system to index the contents of secondary storage for retrieval.

Technology such as Intel's 3D Xpoint -- sold under the brand name Optane -- and HP's future memristor storage will render this separation obsolete. When a computer's permanent storage is all right there in the processors' memory map, there is no need for disk controllers or filesystems. It's all just RAM.

It is very hard to imagine how existing filesystem-centric OSes such as Unix could be adapted to take full advantage of this, so fundamental are files and directories and metadata to how they operate. I will present the outline of an idea how to build an OS that natively uses such a computer architecture, based on existing technology and software, that the FOSS community is ideally situated to build and develop.
»


It talks about Lisp, Smalltalk, Oberon and A2, and touches upon Plan 9, Inferno, Psion EPOC, Newton, Dylan, and more.
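The abstract's central claim, that with persistent main memory "it's all just RAM", can be sketched in miniature with an ordinary memory-mapped file standing in for the non-volatile region. This is purely an illustration under that assumption; the file name is invented, and on real persistent RAM (say, Optane in app-direct mode) even the backing file and the explicit flush would disappear, leaving only loads and stores:

```python
# A minimal sketch of "storage is just memory": program state persists
# by living in a memory-mapped region, with no read()/write() calls in
# the program itself. An ordinary file stands in for persistent RAM.
import mmap
import os
import struct

REGION_SIZE = 4096  # one page of pretend-persistent memory

def bump_counter(path):
    """Treat offset 0 of the region as a persistent 64-bit counter."""
    fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o600)
    os.ftruncate(fd, REGION_SIZE)       # ensure the region exists
    mem = mmap.mmap(fd, REGION_SIZE)    # map it into our address space
    count = struct.unpack_from("<Q", mem, 0)[0]  # a load, not a read()
    struct.pack_into("<Q", mem, 0, count + 1)    # a store, not a write()
    mem.flush()   # the moral equivalent of a cache-line writeback
    mem.close()
    os.close(fd)
    return count + 1

# Each run of the program picks up where the last one left off:
print(bump_counter("counter.pmem"))  # hypothetical region name
```

Run it twice and the counter carries over between runs: the state never passes through a file-I/O API, which is the kernel of the no-filesystem idea.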

You can download the slides (in PDF or LO ODP format) from the FOSDEM programme entry for the talk.
It is free to register and to watch.

I will update this post later, after it is finished, with links to the video, slides, speaker's notes, etc.

UPDATE:

In theory you should be able to watch the video on the FOSDEM site after the event, but it seems their servers are still down. I've put a copy of my recording on Dropbox where you should be able to watch it.

NOTE: apparently Dropbox will only show the first 15 minutes in its preview. Download the video and play it locally to see the whole 49-minute thing. It is an MP4, encoded with H.264.
Unfortunately, in the recording, the short Steve Jobs video is silent. The original clip is below. Here is a transcript:
I had three or four people who kept bugging me that I ought to get my rear over to Xerox PARC and see what they were doing. And so I finally did. I went over there. And they were very kind and they showed me what they were working on.

And they showed me really three things, but I was so blinded by the first one that I didn’t even really see the other two.

One of the things they showed me was object-oriented programming. They showed me that, but I didn’t even see that.

The other one they showed me was really a networked computer system. They had over a hundred Alto computers, all networked using email, et cetera, et cetera. I didn’t even see that.

I was so blinded by the first thing they showed me, which was the graphical user interface. I thought it was the best thing I'd ever seen in my life.

Now, remember, it was very flawed. What we saw was incomplete. They’d done a bunch of things wrong, but we didn’t know that at the time. And still, though, the germ of the idea was there, and they’d done it very well. And within, you know, 10 minutes, it was obvious to me that all computers would work like this someday. It was obvious.


liam_on_linux: (Default)

So, yesterday I presented my first conference talk since the Windows Show 1996 at Olympia, where I talked about choosing a network operating system — that is, a server OS — for PC Pro magazine.

(I probably still have the speaker's notes and presentation for that somewhere too. The intensely curious may ask, and I may be able to share it too.)

It seemed to go OK; I had a whole bunch of people asking questions afterwards, commenting or thanking me.

[Edit] Video! https://youtu.be/jlERSVSDl7Y

I have to check out the video recording and make some editing marks before it will be published and I am not sure that the hotel wifi connection is fast or capacious enough for me to do that. However, I'll post it as soon as I can.

Meantime, here is some further reading.

I put together a slightly jokey deck of slides and was very pleasantly impressed at how easy LibreOffice Impress made it to create and to present them. You can download the 9 MB ODP file here:

https://www.dropbox.com/s/xmmz5r5zfmnqyzm/The%20circuit%20less%20travelled.odp?dl=0

The notes are a 110 kB MS Word 2003 document. They may not always be terribly coherent -- some were extensively scripted, some are just bullet points. For best results, view in MS Word (or the free MS Word Viewer, which runs fine under WINE) in Outline mode. Other programs will not show the structure of the document, just the text.

https://www.dropbox.com/s/7b2e1xny53ckiei/The%20Circuit%20less%20travelled.doc?dl=0

I had to cut the talk fairly brutally to fit the time and did not get to discuss some of the operating systems I planned to. You can see some additional slides at the end of the presentation for stuff I had to skip.

Here's a particular chunk of the talk that I had to cut. It's called "Digging deeper", and you can see what I was going to say about Taos, Plan 9, Inferno, QNX and Minix 3. This is what the slides at the end of the presentation refer to.

https://www.dropbox.com/s/hstqmjy3wu5h28n/Part%202%20%E2%80%94%20Digging%20deeper.doc?dl=0

Links I mentioned in the talk or slides

The Unix Haters' Handbook [PDF]: https://simson.net/ref/ugh.pdf

Stanislav Datskovskiy's Loper-OS:  http://www.loper-os.org/

Paul Graham's essays: http://www.paulgraham.com/

Notably his Lisp Quotes: http://www.paulgraham.com/quotes.html

Steve Jobs on the two big things he missed when he visited Xerox PARC:
http://www.mac-history.net/computer-history/2012-03-22/apple-and-xerox-parc/2

Alan Kay interview where he calls Lisp "the Maxwell's Equations of software": https://queue.acm.org/detail.cfm?id=1039523

And what that means: http://www.michaelnielsen.org/ddi/lisp-as-the-maxwells-equations-of-software/

"In the Beginning was the Command Line" by Neal Stephenson: http://cristal.inria.fr/~weis/info/commandline.html

Author's page: http://www.cryptonomicon.com/beginning.html


Symbolics OpenGenera: https://en.wikipedia.org/wiki/Genera_(operating_system)

How to run it on Linux (some of several such pages):
http://www.jachemich.de/vlm/genera.html
https://loomcom.com/genera/genera-install.html

A brief (13 min) intro to OpenGenera by Kalman Reti: https://www.youtube.com/watch?v=o4-YnLpLgtk&t=5s
A longer (1h9m) talk about it, also by him: https://www.youtube.com/watch?v=OBfB2MJw3qg

liam_on_linux: (Default)
This is a repurposed CIX comment. It goes on a bit. Sorry for the length. I hope it amuses.

So, today, a friend of mine accused me of getting carried away after reading a third-generation Lisp enthusiast's blog. I had to laugh.

The actual history is a bit bigger, a bit deeper.

The germ was this:

https://www.theinquirer.net/inquirer/news/1025786/the-amiga-dead-long-live-amiga

That story did very well, amazing my editor, and he asked for more retro stuff. I went digging. I'm always looking for niches which I can find out about and then write about -- most recently, it has been containers and container tech. But once something goes mainstream and everyone's writing about it, then the chance is gone.

I went looking for other retro tech news stories. I wrote about RISC OS, about FPGA emulation, about OSes such as Oberon and Taos/Elate.

The more I learned, the more I discovered how much the whole spectrum of commercial general-purpose computing is just a tiny and very narrow slice of what's been tried in OS design. There is some amazingly weird and outré stuff out there.

Many of them still have fierce admirers. That's the nature of people. But it also means that there's interesting in-depth analysis of some of this tech.

It's led to pieces like this which were fun to research:

http://www.theregister.co.uk/Print/2013/11/01/25_alternative_pc_operating_systems/

I found 2 things.

One: most of the retro-computers that people rave about -- mainstream stuff like Amigas or Sinclair Spectrums -- are actually relatively homogeneous compared to the really weird stuff. And most of them died without issue. People are still making clone Spectrums of various forms, but they're not advancing the design and it didn't go anywhere.

The BBC Micro begat the Archimedes and the ARM. Its descendants are everywhere. But the software is all but dead, and perhaps justifiably. It was clever but of no great technical merit. Ditto the Amiga, although AROS on low-cost ARM kit has some potential. Haiku, too.

So I went looking for obscure old computers. Ones that people would _not_ have read much about, but could still relate to -- so I focussed on my own biases: I find machines that can run a GUI, or at least do something with graphics, more interesting than earlier ones.

There are, of course, tons of the things. So I needed to narrow it down a bit.

Like the "Beckypedia" feature on Guy Garvey's radio show, I went looking for stuff of which I could say...

"And why am I telling you this? Because you need to know."

So, I went looking for stuff that was genuinely, deeply, seriously different -- and ideally, stuff that had some pervasive influence.

And who knows, maybe I’ll spark an idea and someone will go off and build something that will render the whole current industry irrelevant. Why not? It’s happened plenty of times before.

And every single time, all of the most knowledgeable experts said it was a pointless, silly, impractical flash-in-the-pan. Only a few nutcases saw any merit to it. And they never got rich.
liam_on_linux: (Default)
[A friend asked why, if Lisp was so great, it never got a look-in when Ada was designed.]

My impression is that it’s above all else cultural.

There have long been multiple warring factions depending on deeply-felt beliefs about how computing should be done. EBCDIC versus ASCII, RISC vs CISC, C vs Pascal, etc. Now it’s mostly sorted inasmuch as we all use Unix-like OSes — the only important exception, Windows, is becoming more Unix-like — and other languages etc. are layered on top.

But it goes deeper than, e.g., C vs Pascal, or BASIC or Fortran or whatever. There is the imperative vs functional camp. Another is algebraic expressions versus non-algebraic: i.e. prefix or postfix (stack-oriented RPN), or something Other such as APL/I/J/A+; manual memory management versus automatic with GC; strongly versus weakly typed (and arguably sub-battles such as manifest versus inferred/duck typing, static vs dynamic, etc.)

Mostly, the wars settled on: imperative; algebraic (infix) notation; manual memory management for system-level code and for externally-distributed code (commercial or FOSS), and GC Pascal-style languages for a lot of internal corporate s/w development (Delphi, VB, etc.).

FP, non-algebraic notation and things like them were thus sidelined for decades, but are now coming back, layered on top of complex OSes written in C-like languages. This is an era of proliferation in dynamic, interpreted or JITted languages used for specific niche tasks, running on top of umpteen layers of GP OS. Examples range across Javascript, Perl 6, Python, Julia, Clojure, Ruby and tons more.

Meanwhile, new safer members of the broader C family of compiled languages, such as Rust and Go, and stretching a point Swift, are getting attention for more performance-critical app programming.

All the camps have strong arguments. There are no single right or wrong answers. However, cultural pressure and uniformity mean that outside of certain niches, we have several large camps or groups. (Of course, individual people can belong to more than one, depending on job, hobby, whatever.)

C and its kin are one, associated with Unix and later Windows.

Pascal and its kin, notably Object Pascal and Delphi/FPC, are another. Basic now means VB, and that means the .NET family of languages: yet another. Both have historically been part of the MS camp but are now reaching out, against some resistance, into Unix land.

Java forms a camp of its own, but there are sub-camps of non-Java-like languages running on the JVM — Clojure, Scala, etc.

Apple’s flavour of Unix forms another camp, comprising ObjC and Swift, having abandoned outreach efforts.

People working on the development of Unix itself tend to strongly favour C above all else, and like relatively simple, old-fashioned tools — ancient text editors, standalone compilers. This has influenced the FOSS Unix GUIs and their apps.

The commercial desktop app developers are more into IDEs and automation; these days this covers the .NET and JVM camps, and spans all OSes, but the Pascal/VB camp is still somewhat linked to Windows.

The people doing niche stuff, for their own needs or their organisations, which might be distributed as source — which covers sysadmins, devops and so on — are more into scripting languages, where there’s terrific diversity.

Increasingly the in-house app devs are just using Java, be they desktop or server apps. Indeed “desktop” apps of this type might now often mean Java server apps generating a remote UI via web protocols and technologies.

Multiple camps and affiliations. Many of them disdain the others.

A summary of how I’m actually addressing your question:

But these ones are the dominant ones, AFAICS. So when a new “safe” “secure” language was being built, “weird” niche things like Lisp, Forth, or APL never had a chance of a look-in. So it came out looking a bit Pascal- and BASIC-like, as those are the ones on the safe, heavily-type-checked side of the fence.

A more general summary:

I am coming to think that there are cultural forces stronger than technical forces involved in language choice.

Some examples I suspect that have been powerful:

Lisp (and FP) are inherently complex to learn and to use and require exceptionally high intelligence in certain focussed forms. Some people perfectly able to be serviceable, productive coders in simple imperative languages find themselves unable to fathom these styles or methods of programming. Their response is resentment, and to blame the languages, not themselves. (Dunning Kruger is not a problem confined to those of low intelligence.)

This has resulted in the marginalisation of these technologies as the computing world became vastly more commoditised and widespread. Some people can’t handle them, and some of them end up in positions of influence, so teaching switched away from them and now students are taught in simpler, imperative languages. Result, there is a general perception that some of these niche tools are exotic, not generally applicable or important, just toys for academics. This isn’t actually true but it’s such a widespread belief that it is self-perpetuating.

This also applies to things like Haskell, ML/OCaml, APL, etc.

On the flip side: programming and IT are male-dominated industries, for no very good reason. This results in masculine patterns of behaviour having profound effects and influences.

So, for instance, languages in the Pascal family have safety as a priority and try to protect programmers from errors, sometimes by simply not allowing them to write unsafe code at all. A typically masculine response to this is to resent the exertion of oppressive control.

Contrastingly, languages in the BCPL/C/C++ family give the programmer extensive control and require considerable discipline and care to write safe code. They allow programmers to make mistakes which safer languages would catch and prevent.

This has a flip side, though: the greater control potentially permits or offers theoretically higher performance.

This aligns with “manly” virtues of using powerful tools — the appeal of chainsaws, fast cars and motorcycles, big powerful engines, even arguably explicitly dangerous things like knives and guns. Cf. Perl, “the Swiss Army chainsaw”.

Thus, the masculine culture around IT has resulted in people favouring these languages. They’re dangerous in unskilled hands. So, get skilled, then you can access the power.

Of course, again, as Dunning and Kruger taught us, people cannot assess their own skill, and languages which permit bugs that others would trap have been used very widely for three decades or more, often on the argument of performance but actually because of toxic culture. All OSes are written in them; now, as a result, it is a truism that only these languages are suitable for writing OSes.

(Ignoring the rich history of OSes in safer languages — Algol, Lisp, Oberon, perhaps even Mesa, or Pascal in the early Macs.)

If you want fast code, you need a fast language! And Real Men use C, and you want to be a Real Man, don’t you?

Cf. the story of Mel The Real Programmer.

Do it in something low-level, manage your own memory. Programming is a game for the smart, and you must be smart because you’re a programmer, so you can handle it and you won’t drop a pointer or overflow an array.

Result, decades of complex apps tackling arbitrary complex data — e.g. Web browsers, modern office suites — written in C, and decades of software patching and updating trying to catch the legions of bugs. This is now simply perceived as how software works, as normal.

Additionally, in many cases, any possible performance benefits have long been lost due to large amounts of protective code, of error-checking, in libraries and tools, made necessary by the problems and inherent fragility of the languages.

The rebellion against it comes only in the form of niche line-of-business app developers doing narrow, specific stuff, who are moving to modern interpreted languages running on top of tens of millions of lines of C, written by coders who are only just able to operate at this level of competence and who make lots of mistakes.

For people not facing the pressures of commercial releases, there was an era of using safer, more protective compiled languages for in-company apps — Turbo Pascal, Delphi, VB. But that’s fading away now in favour of Java and .NET, “managed” languages running under a VM, with concomitant loss of performance but slight improvement in safety and reliability.

And because this has been widespread for some 2-3 decades, it’s now just _how things are done_. So if someone presents evidence and accounts of vastly better programmer productivity in other tools, decades ago, in things like Lisp or Smalltalk, then these are discounted as irrelevant. Those are not manly languages for manly programmers and so should not be considered. They’re toys.

People in small enough niches continue to use them but have given up evangelising about them. Like Mac users, their comments are dismissed as fanboyism.

So relatively small cultural effects have created immensely strong cultures, dogmas, about what is or isn’t a good choice for certain categories of problem. People outside those categories continue to use some of these languages and tools, while others languish.

This is immensely sad.

For instance, there have been successful hybrid approaches.

OSes written in Pascal derivatives, or in Lisp, or in Smalltalk, are now lost to history. As a result, processor design itself has shifted: companies make processors that run C and C-like languages efficiently, while processors that understood richer primitives — lists, or objects — are now historical footnotes.

And languages which attempted to straddle different worlds — such as infix-notation Lisp derivatives, readable and easily learnable by programmers who only know infix-based, imperative languages — e.g. Dylan, PLOT, or CGOL — are again forgotten.

Or languages which developed down different avenues, such as the families of languages based on or derived from Oberon, or APL, or ML. All very niche.

And huge amounts of precious programmer time and effort expended fighting against limited and limiting tools, not well suited to large complex projects, because they simply do not know that there are or were alternatives. These have been crudely airbrushed out, like disappearing Soviet commissars.

“And so successful was this venture that very soon Magrathea itself became the richest planet of all time, and the rest of the galaxy was reduced to abject poverty. And so the system broke down, the empire collapsed, and a long, sullen silence settled over the galaxy, disturbed only by the pen-scratchings of scholars as they laboured into the night over smug little treatises on the value of a planned political economy. In these enlightened days, of course, no one believes a word of it.”

(Douglas Adams)
A chap on CIX responded to my last piece on Lisp, and it led to a long answer, which my CIX client then crashed and threw away. So if I have to rewrite it, I'll do it here and it will maybe be read by a few more people. Perhaps, ooh, a dozen.

> Trouble is, it comes across a bit as a "lost wisdom of the ancients"
> story.

Yes, it does. But I am OK with that, if I can turn it into a coherent article that tells a comprehensible story.

> Lisp was a niche language in 1960, it's a niche language today. It
> has been a niche language for all the intervening period and I expect it
> to be a niche language for all time to come.

It's a fair point, but there are ways around that. One of the problems, though, is that the Lisp community are very resistant to them.

The thing that my research and my various discussions online are leading me to believe is this:

There are many things about Lisp that used to be distinctive, powerful features decades ago – not merely the functional programming model, but lambda calculus, closures, higher-order functions, tail recursion, lazy evaluation and so on. However, today, other languages can do these things. Perhaps some can do all of them, others only a subset, but that doesn't matter if these are the tools you need to crack your particular problematic nut. And the other languages that include these features do not have the feature that is the biggest problem with Lisp: its obfuscatory syntax, or as the Lisp advocates would have it, its *lack* of syntax. (Of course, as in the case of Perl, for example, they may have their own obfuscatory issues.)
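To make that concrete, here is a minimal sketch in Python — my own illustration, not something from the original discussion — showing two of those once-distinctive Lisp features, closures and higher-order functions, now being entirely routine in a mainstream language:

```python
# Closures: a function that captures a variable from its enclosing
# scope and keeps private, mutable state across calls.
def make_counter(start=0):
    count = start
    def step(increment=1):
        nonlocal count  # rebind the captured variable, not a local
        count += increment
        return count
    return step

counter = make_counter(10)
counter()   # 11
counter(5)  # 16

# Higher-order functions: functions that take and return functions.
def compose(f, g):
    return lambda x: f(g(x))

inc = lambda n: n + 1
double = lambda n: n * 2
print(compose(inc, double)(3))  # prints 7: double(3) is 6, then inc
```

Tail-call optimisation, notably, is one item on that list which Python deliberately does *not* provide — which is rather the point: each language picks its own subset.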

But the problem is that that syntax is both the biggest obstacle to learning and using it, and yet at one and the same time, also absolutely integral to the one feature that sets Lisp apart from pretty much all other languages: its syntactic macros.

So here's the thought. From things like reading the UNIX-HATERS Handbook [PDF] and so on, I get the impression that there was a time when Lisp Machines were widely considered, by some very smart people, to be the ultimate programmer's tool: the best lever for the intellect, as it were.

But they're all dead and gone now.

What I'm wondering is if the Lisp Machine idea could be resurrected on x86 using only Free Software.

There are several components. ISTM that if they could be brought together, they could form the core of a Free LispM OS for COTS x86 boxes.

