For sure, it’s a lot easier to do a lot of stuff today than before, but the way we build software has become incredibly wasteful as well. Also worth noting that some of the workflows that were available in languages like CL or Smalltalk back in the 80s are superior to what most languages offer today. It hasn’t been strictly progress in every regard.
I’d say the issue isn’t that programmers are worse today, but that the trends in the industry select for things that work just well enough, and that’s how we end up with stuff like Electron.
In what ways? I don’t have any experience with those so I’m curious.
Common Lisp and Smalltalk provided live development environments where you could run any code as you wrote it, in the context of your application. Even the whole Lisp OS was modifiable at runtime: you could open the code for any running application, or for the OS itself, make changes on the fly, and see them reflected immediately. A fun run-through of a Symbolics Lisp Machine here: https://www.youtube.com/watch?v=o4-YnLpLgtk
Here are some highlights.
The system was fully introspective and self-documenting. The entire OS and development environment was written in Lisp, allowing deep runtime inspection and modification. Every function, variable, or object could be inspected, traced, or redefined at runtime without restarting. Modern IDEs provide some introspection (e.g., via debuggers or REPLs), but not at the same pervasive level.
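For a pale imitation of that pervasive introspection, here is a minimal Python sketch (the function `area` is just an illustrative example) showing the kind of live-object queries the Lisp Machine offered for everything in the system:

```python
import inspect

def area(r):
    """Area of a circle of radius r."""
    return 3.14159 * r * r

# Query the live function object at runtime: its signature and its docs.
signature = str(inspect.signature(area))  # "(r)"
docstring = inspect.getdoc(area)          # the docstring, cleaned up
```

On a Lisp Machine this worked uniformly for every function, variable, and object in the running OS, not just for objects your own program defined.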
You had dynamic code editing & debugging. Functions could be redefined while running, even in the middle of execution (e.g., fixing a bug in a running server). You had the ability to attach “before,” “after,” or “around” hooks to any function dynamically.
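A rough sketch of those "before"/"after" hooks in Python, wrapping a function object at runtime without touching its source (all names here are illustrative, not any real advice API):

```python
import functools

def add_hooks(fn, before=None, after=None):
    """Wrap fn so the hooks run around every call."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        if before:
            before(args, kwargs)   # runs before the original body
        result = fn(*args, **kwargs)
        if after:
            after(result)          # runs after, sees the return value
        return result
    return wrapper

log = []

def handle(x):
    return x * 2

# Attach hooks dynamically, without editing or restarting handle().
handle = add_hooks(handle,
                   before=lambda a, k: log.append(("before", a)),
                   after=lambda r: log.append(("after", r)))
```

The Lisp Machine version went further: advice could be attached to any function in the running image, including system functions, and removed just as easily.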
The condition system in CL provided advanced error handling with restarts, allowing interactive recovery from errors (far beyond modern exception handling).
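A rough Python approximation of the restart idea (restart names like "use-value" are borrowed from CL for illustration): the code that hits the error offers named recovery strategies, and a policy chosen by the caller is applied at the point of failure instead of unwinding the stack:

```python
def parse_entry(text, handler):
    """Parse one entry; on failure, ask the handler to pick a restart."""
    try:
        return int(text)
    except ValueError:
        restart, value = handler(text)  # e.g. ("use-value", 0) or ("skip", None)
        if restart == "use-value":
            return value                # continue with a substitute value
        if restart == "skip":
            return None                 # continue without this entry
        raise                           # no applicable restart: propagate

def parse_all(entries):
    # Recovery policy decided far from the error site, applied without
    # losing the computation in progress.
    results = [parse_entry(e, lambda bad: ("use-value", 0)) for e in entries]
    return [r for r in results if r is not None]
```

In real CL the debugger would additionally let a human pick a restart interactively at the moment of the error, then resume execution.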
In the Dynamic Window System, UI elements were live Lisp objects that could be inspected and modified interactively. Objects could be inspected and edited in structured ways (e.g., modifying a list or hash table directly in the inspector). Modern IDEs lack this level of direct interactivity with live objects.
You had persistent image-based development, where the entire system state (including running programs, open files, and debug sessions) could be saved to an image and resumed later. This is similar to Smalltalk images, and unlike modern IDEs, where state is usually lost on restart.
You had knowledge-level documentation with the Document Examiner (DOCX), a hypertext-like documentation system where every function, variable, and concept was richly cross-linked. The system could also generate documentation from source code and comments dynamically. Modern tools such as Doxygen are less integrated and interactive.
CL had an ephemeral GC that provided real-time garbage collection with minimal pauses. Weak references and finalizers were more sophisticated than in most modern GC implementations. Modern languages (e.g., Java, Go, C#) have good GC but lack the fine-grained control of the Lisp Machines.
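Weak references and finalizers do survive in modern runtimes, just with less control. A minimal Python sketch of both (class `Node` is illustrative):

```python
import gc
import weakref

class Node:
    pass

obj = Node()
ref = weakref.ref(obj)          # a reference that does not keep obj alive
events = []
weakref.finalize(obj, lambda: events.append("collected"))  # GC callback

del obj                         # drop the only strong reference
gc.collect()                    # force a pass on runtimes without refcounting
```

After this, `ref()` returns `None` and the finalizer has run. The Lisp Machine exposed considerably more of the collector's machinery than this to application code.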
Transparent remote procedure calls (RPC) allowed objects to interact seamlessly across machines as if they were local. Meanwhile, an NFS-like but Lisp-native file system allowed files to be accessed and edited remotely, with versioning.
Finally, compilers like Zeta-C could compile Lisp to efficient machine code with deep optimizations.
I had access to a Symbolics machine back in the day, but I was too young & dumb to understand or appreciate what I had my hands on. Wasted opportunity 😔
It’s like an artifact from an ancient and more advanced civilization. :)
No wonder there are some older developers who defend Lisp so passionately. Sounds like a dream to work with once you got the hang of it.
It’s really impressive to think what was achieved with hardware so limited by today’s standards. While languages like Clojure are rediscovering these concepts, it feels like we took a significant detour along the way.
I suspect this has historical roots. In the 1980s, Lisp was primarily used in universities and a small number of companies due to the then-high hardware demands of features like garbage collection, which we now consider commonplace. Meanwhile, people who could afford personal computers were constrained by very basic hardware, making languages such as C or Fortran the practical choice. Consequently, the vast majority of developers lacked exposure to alternative paradigms. As these devs entered industry and academia, they naturally taught programming based on their own experiences. That’s why the syntax and semantics of most mainstream languages can be traced back to C.
Interesting! Thank you!
Unfortunately, the Lisp machine didn’t gain traction because its start-up times were so long. I believe this was due to it doing lots of internal checks, which was awesome, but machines like the Sun SPARCstation won out because they were crazy fast, albeit buggy.
It was basically way ahead of its time.
It really was. I forgot to mention in my comment that the sun machines were also really cheap so, you know, capitalism.