Fits on a Floppy
I stumbled across Matt's work via a post on Tildes. His "manifesto for small software" describes his goal of building applications that fit on a 1.44 MB floppy disk. Most of his apps are for Mac OS or iOS, and it honestly shocked me that you could bundle apps for those platforms at such a small size.
All this has got me thinking, and when I start thinking I typically end up changing my opinion on things. You see, I agree with Matt: "software has lost its way". My recent post on using Palm OS for weight tracking proves that. The extremely powerful database software I talk about in that post is 758KB; heck, you could almost fit two copies on a floppy. For comparison, Numbers (Apple's spreadsheet software on iOS) is 617.2MB. You could fit 833 copies of the Palm OS app in that amount of space!
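For the curious, the comparison checks out. A quick back-of-the-envelope sketch, using the sizes quoted above and assuming binary units (1 MB = 1024 KB):

```python
# Sizes quoted above, converted to KB (assuming 1 MB = 1024 KB)
palm_app_kb = 758
numbers_kb = 617.2 * 1024
floppy_kb = 1.44 * 1024

# How many copies of the Palm OS app fit in Numbers' footprint?
print(int(numbers_kb // palm_app_kb))  # 833

# And how many copies fit on a 1.44 MB floppy?
print(floppy_kb / palm_app_kb)  # ~1.95, i.e. "almost two"
```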
Here's the thing: it's going to get worse. Much worse. When everything is vibe coded and built on the backs of bloated frameworks, application sizes will only keep growing. Optimization is an art of the past, and LLM-driven development will further solidify it in carbonite. Instead of optimizing software to better utilize our hardware, we've turned to constantly scaling hardware to fit the software. Buy, buy, buy! At the same time, the price of hardware is skyrocketing, which means it will become increasingly difficult for most people to run increasingly bloated software.
I'm sure Microsoft will be happy to rent a cloud server running Copilot OS to you though...for a monthly fee of course.
All that to say, I've changed my mind (again) on using AI. Admittedly, I had started to give in due to it being used heavily at work. What I've come to realize is that I don't want to make software that way; it's not meaningful to me. As @eniko said on Mastodon, it's taking the artistry out of coding. The artistry of a well-optimized system, of meaningful decisions, of reusability and composition.
I've been reading "Microinteractions: Designing with Details" by Dan Saffer, and it's had me thinking a lot about the details that are getting missed in modern "software development". When you stop optimizing and internalizing every piece of an application, how could you possibly focus on the microinteractions that compose it? The only thing that matters at that point is the list of features used in a sales pitch. The actual experience of using the app is left to Claude to figure out. Heck, the industry is rushing headfirst into letting AI take over everything human in the UX of applications.
Teams use AI to write the requirements documents. Then use AI to create work tickets. AI is brought in to build the design and user experience. AI writes the code and submits the PR. AI reviews the PR and tests the functionality.
What's the point? You end up using software that had near zero human involvement. Sure, some engineers were needed to drive the AI and keep it on track, and they probably did a cursory glance at the PRs and some level of QA. Maybe. But when so many of the decisions are automated by the machine, what you've created is not something built for users.
So yeah, I'm done letting Claude create anything for me personally. I'll still occasionally use these tools to solve issues; after all, they are pattern-matching engines, which have advantages over simple web searches. But for coding, my opinion is now the same as what I stated in my post on using AI for writing: when you take the human out of the process, you're not producing art. And code is art.