@psf@vidak why coerce a LISP machine onto a microprocessor implementing an Abstract C Machine?
the hardware should closely match the primitives of the high-level LISP language - I'm thinking of something like the abstract machine defined in the VLISP literature, based on Scheme48.
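To make the idea concrete, here's a toy sketch of what "primitives matching the language" means: the machine's basic operations are tagged cells and cons/car/cdr, with type checks trapped at the "hardware" level rather than layered on top of raw byte addressing. This is purely illustrative - the names and structure are my own, not taken from the VLISP papers or Scheme48.

```python
# Toy sketch: a machine whose primitive operations are Lisp's,
# not C's. Every cell carries a type tag that the "hardware"
# checks on access, as on the classic tagged-architecture
# Lisp machines. (Illustrative only; not from the VLISP papers.)
from dataclasses import dataclass
from typing import Any


@dataclass
class Cell:
    tag: str   # hardware-level type tag
    car: Any
    cdr: Any


def cons(a, d):
    return Cell("pair", a, d)


def car(x):
    # a type error is a machine-level trap, not undefined behavior
    assert isinstance(x, Cell) and x.tag == "pair", "type trap"
    return x.car


def cdr(x):
    assert isinstance(x, Cell) and x.tag == "pair", "type trap"
    return x.cdr


# build the list (1 2 3) and walk it
lst = cons(1, cons(2, cons(3, None)))
print(car(cdr(lst)))  # → 2
```

The contrast with an Abstract C Machine is that here there is no way to express "read this cell as a raw word" - the tag check is part of the primitive itself.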
@pixouls crazy how I'm seeing job postings that require 6-8 years of work experience for what is essentially a bottom-of-the-ladder position. do they think these people actually exist? :thaenkin:
Think about it. Besides over 200 distros, there are 21 different desktop interfaces and over half-a-dozen different major ways to install software such as the Debian Package Management System (DPKG), Red Hat Package Manager (RPM), Pacman, Zypper, and all too many others. Then there are all the newer containerized ways to install programs including Flatpak, Snap, and AppImage.
I can barely keep them all straight and that's part of my job! How can you expect ordinary users to make sense of it all? You can't.
I find it important that he says, both here and in his book The Invisible Computer (1999), that we must start from scratch:
#Unix was designed for the computing environment of then, not the machines of today. Unix survives only because everyone else has done so badly. There were many valuable things to be learned from Unix: how come nobody learned them and then did better? Started from scratch and produced a really superior, modern, graphical operating system?
I love that Don Norman wrote the foreword to the UNIX-Haters Handbook (1994), and that he cites his own paper from 1981:
Norman, D. A. The Trouble with Unix: The User Interface is Horrid. Datamation, 27(12), November 1981, pp. 139-150. Reprinted in Pylyshyn, Z. W., & Bannon, L. J., eds., Perspectives on the Computer Revolution, 2nd revised edition. Hillsdale, NJ: Ablex, 1989.
@drwho I'm not fully versed in the history, but one of the responses in this thread mentions that the floor mats were a separate, earlier, physical problem.
This article mentions and links to the NASA/NHTSA investigation. This related article says more:
Neither NHTSA, with its absence of software expertise, nor the NASA Engineering and Safety Center, to which NHTSA turned to study the Toyota problem, were able to pinpoint a software cause for unintended acceleration. Nor were they able to rule out the possibility.
The NASA researchers, who were both on a deadline and not allowed to study Toyota's source code, simply ran out of time, noted Barr.
@mupan I read somewhere that modern automobiles have over 300 computers in them! what a complexity nightmare. but not to worry, because you get to have Alexa in your infotainment system-
@mupan this litigation happened in 2013. and I still don't see regulators requiring software audits before automobile makers push Over-The-Air (!!!) updates to customer vehicles.
just recently there was a report of a Tesla vehicle failing to respond at 83 mph on the freeway. that guy got lucky.
@mupan and the same applies to medical devices. but corporate lawyers and lobbyists are still able to convince legislators and judges that releasing their source code hurts their business.