September 19, 2022 – Josh

Photo by my uncle 🙂
Although I was never around to experience it, I know there was a time when computing could exist just for you: computing for fun, for entertainment. In a sense we are all "personally computing" constantly, but that's not what I mean.
There is magic in computers, and with the help of programs we can hope to channel that magic for a moment. Using only the graphical interface doesn't go deep enough; it's like casting other people's spells instead of writing your own. So now we're in this weird place where computer use is either (1) a half-baked consumerist practice, or (2) engineering. There is no room for software like soap bubbles; it is a non-existent category.
Basically what I'm trying to say is that there is no longer a line between the professional and the amateur programmer. We force ourselves to write only the most robust and scalable programs, even when it's just for us. Best practices! We use industrial equipment at home: the same infrastructure we use at work to power the world.
I don’t remember where I read it, but someone pointed out the huge disparity between professional and amateur in arts like film, starting with the equipment available. Professional cameras are huge machines; no amateur uses them at home to make videos (not to mention other gear like high-end microphones and lighting). In a way it’s similar to a music studio (all the mixing gear and synthesizers they have…), though that one has been disrupted by software (DAWs and VSTs). Despite this, there is a tangible difference in both the experience of creating these media (e.g. a band of guys in a garage versus a graybeard audio engineer in a studio) and the feeling of the outcome (e.g. a funny cat video versus a Wong Kar Wai film). Individuals (amateurs) and guilds (professionals) have traditionally been almost completely separate things; only recently has computing begun to disrupt this arrangement (is this what our post-scarcity world will look like?). But back to computing specifically…
Is this a good or a bad thing? In a way we could portray it as a triumph of computing: the little guy has the same means of production as the big guy, we are all on an equal playing field, and so on. But on the other hand, the little guy is now corporate. We have the same constraints as the industry. This is no longer for entertainment; we are constantly being monitored, and God forbid we don't follow best practices.
I'm not saying we should write spaghetti code at home as if it doesn't matter; not letting yourself slip into writing BadCode™ is probably part of a healthy breakfast. Rather, I'm saying that the priorities at home (for entertainment, for magic, for personal computing) are different. You don't want strong, stable silos, cathedrals, pyramids. You want dynamic silk: a spider builds its beautiful web at night and packs it up in the morning, only to do it again the next day. You see, it can't be too difficult to build. (Forgive my sad attempt at describing it poetically.)
I always see people saying that, as an industry, we have collectively agreed that metaprogramming is a bad idea. Well, after looking at some older codebases in detail, I can understand why you would want to keep the code as plain as possible: it's optimization for readability, reliability, and so on. But why should the technology I use to make my computer magical to me, when I'm programming for myself at home, be the same as what I use on a team full of people who have to maintain the thing for years to come? Metaprogramming makes perfect sense for personal computing. This type of programming should be fun; it can be funny; it can be whatever we want. It should be free. The capitalist ethos of "whatever you do must be productive in some way" is ingrained.
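To make the contrast concrete, here's a toy sketch of what I mean (my own illustration in Python, with invented names): at home, metaprogramming lets you conjure a whole family of little helpers at runtime instead of dutifully typing each one out like you would at work.

```python
# Toy personal-computing metaprogramming: generate unit-conversion
# helpers at runtime instead of writing each one by hand.
UNITS = {"km": 1000, "cm": 0.01, "mm": 0.001}  # meters per unit

def make_converter(factor):
    # Closure over `factor`; each generated function remembers its own.
    def convert(value):
        return value * factor
    return convert

# Conjure meters_from_km, meters_from_cm, meters_from_mm into existence.
for unit, factor in UNITS.items():
    globals()[f"meters_from_{unit}"] = make_converter(factor)

print(meters_from_km(3))    # 3000
print(meters_from_cm(250))  # 2.5
```

A code reviewer at work would (rightly) complain that `grep` can't find these functions. At home, nobody's reviewing.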
Software can be a soap bubble. It can exist just for us. It can be just for fun. Write one to throw away. No one ever has to see it!! Oh, the sigh of relief I'll breathe the first time I make something for no reason. Freedom!!!!
I am on the groff mailing list. I liked G. Branden Robinson's attitude:
> > Do you have a git repo in a public remote that I can look at?
>
> Sorry, no – it just lives in my home directory without any replication on the Internet.
>
> I know some people put their private dotfile repos on the public web, but I’m neither that showy, nor proud of some of the bugs I’ve encountered, nor willing to provide support services for such things.
I think the rise of open source has, in a way, contributed to the death of personal computing: as we converge on a shared infrastructure for our society (the same base open source programs everyone uses, across industry and individually), the distinction between professional and hobbyist blurs. We're probably timesharing the same machines, for Pete's sake; the cloud doesn't discriminate.
I'm not sure what conclusion to draw from these thoughts. On one hand, affordable computing is helping us make progress in class warfare, putting power at the fingertips of every person; every year more means of production become available on our phones or online, for free. But as a wise stock proverb once said, with great power comes great responsibility, and it is our responsibility not to forget how to survive. Meaning: we should use this technology as a medium for our soul, not as a replacement for it. The hard work of making something beautiful in a limited environment cannot be completely replaced by sufficiently advanced technology.
I'm actually somewhat worried that the human race will become like the people in WALL-E. We already get food almost on demand, and even entertainment on demand for consumption. For now, a living person has to produce that entertainment, but once we figure out how to automate it (ComingSoonToATheatreNearYou) we'll be entertaining ourselves to death.
I want to read Computer Lib by Ted Nelson, I think it will answer some of my questions and give me new questions.
Computing-related footnote
I couldn't help turning this into another existential tangent, but the basic seed of this post was that I was really looking for a fun way to use my computer. The ongoing conversation about static versus dynamic languages is relevant here:
As Paul Graham points out, the difference between Lisp and Java is that Lisp is for working with computational ideas and expressions, while Java is for expressing complete programs. As James says, Java requires you to make decisions from the beginning, and once you are pinned down, the system (the set of type declarations, the compiler, and the runtime) makes it that much harder for you to change those assumptions, on the assumption that any such change is a mistake you're making unknowingly.
[…] Our poor approach to software development is due to what we have done to programming languages. With few exceptions, we have chosen to optimize the details of programs for compilers and computers. Interestingly, the consequences of this optimization are well described by "premature optimization is the root of all evil in programming." [Knuth] was referring to the practice of worrying about the performance of an algorithm before worrying about its correctness, but the saying can be taken to refer to any design problem where optimization is made the ultimate concern. In this case, the design problem was to design a usable programming medium: one that would be excellent at enabling developers and designers to search and explore, and still serviceable once the exploration was complete. Rather than wait until we understood the implications of designing and implementing large systems in such a medium, we decided to optimize ahead of time for performance. And instead we got program description (or programming) languages – the root of all evil.
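The Lisp-versus-Java point above is easiest to feel in any dynamic language. A minimal sketch (mine, in Python rather than Lisp): a "decision" is just a binding, and you can rebind it mid-exploration without recompiling anything or convincing a type checker that you meant it.

```python
# In a dynamic language, an early decision is just a name binding.
def greet(name):
    return f"Hello, {name}"

print(greet("world"))  # Hello, world

# Later, while exploring, simply change the assumption in place.
# No declarations to update, no build step insisting this is a mistake.
def greet(name):
    return f"Hi there, {name}!"

print(greet("world"))  # Hi there, world!
```

That fluidity is exactly what makes such languages a medium for search and exploration, and exactly what industrial best practices are designed to constrain.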