I found this tweet very interesting. Let me quote the main part:
Halfway through this book, I’m constantly amazed by how fast the early id Software guys got things done and games released.
Can’t help but think that modern day SW dev has become bloated, overengineered, and slow, compared to earlier days.
“This book” is John Romero’s memoir Doom Guy and id Software is the company that produced Doom, Quake, Wolfenstein, and other games. I haven’t read the book yet but I have read (and highly recommend) Masters of Doom and, yes, the speed with which id Software shipped games in their prime is remarkable. As the author of the tweet writes in a reply:
Coming up with new tech and shipping that in 4 months. Creating a SNES port in 3 weeks. No off-the-shelf engines available back then.
A SNES port in 3 weeks? No matter what is being ported, 3 weeks for a platform port is fast.
What about the second line, though? Did software development become “bloated, overengineered, and slow”?
Let’s start with that first adjective – bloated. It comes up again and again in comments all over the internet: software has become bloated, which is taken to mean that it uses too much memory, that it’s slow and inefficient, and, actually, why do we need all that crap.
I’m not sure I buy it. Sure, if we compare today’s CPU speeds with those from 1999, the conclusion that things should be a lot faster isn’t even a jump away. I don’t think it’s quite that easy, though. Hardware has gotten better & faster, but the workloads we throw at computers have also grown: our screen resolution isn’t 640x480 anymore; we have 120Hz displays and watch 4K movies; instead of text, we routinely send around screenshots of text that take up more disk space than all of the images in a Windows 2000 installation combined (don’t fact-check me). Windows 95 had a 256-color palette, but when I open Slack today there’s a chance I see three high-resolution, 200MB GIFs playing at the same time – of course that uses more memory than paint.exe. That’s only one dimension – asset sizes – but you get the idea.
This type of comparison between old and new technology isn’t software-specific. I saw it happen when we built our house three years ago. You’d see how much a house costs and wonder: why are houses so much more expensive than the house my parents built in the 80s? And yet they’re still smaller? Turns out expectations and standards have slowly risen over the past 40 years: houses in the 80s didn’t have proper insulation, floor heating, that many windows, that many power sockets, triple-pane windows, Ethernet cables in every room, etc. Quality & prices went up, but it’s hard to notice because most of it is hidden behind the walls or under the floorboards, and you end up with a comparison that doesn’t make a lot of sense.
Or take phones: a few years ago, non-tech friends of mine would say “a thousand bucks for a phone?! I remember when I could get a phone for a hundred and its battery lasted for five days!” Sure, dude, but did that phone have a megapixel camera with post-processing software that made your DSLR of that time look bad? Was that phone also constantly connected to the internet? Did it have a high-resolution, 120Hz display? Did you also use it for 3 hours every day and did you also use it as a navigation system in your car, your entertainment console at home, your book library, your video library, and as your online-shopping device? That phone from a few years ago is not in the same league as the “phone” you use today. They don’t even play in the same stadium.
But I’m starting to digress and I need to wrap this up. Look, I don’t want to brush off the claim that software has become bloated. I think it has, but it’s more complicated than comparing memory usage and required disk space. (Let’s talk about latency instead!)
There are two other adjectives in the tweet – “overengineered” and “slow”. These two got me thinking, because yes, the time-to-ship numbers that id Software put up do seem hard to achieve today.
I have two thoughts here. The first one is similar to the one about bloat: software has become more complex, which makes shipping take longer, but you don’t necessarily see the complexity, which is why the speed (or lack thereof) seems hard to explain. This is a hunch, a gut feeling, so instead of pointing to clear evidence, let me just throw some ideas into the room and wave my arms wildly.
Have you ever tried to implement an OAuth authentication flow? Ever compared the sweat that came out of you with the amount of time a user spends thinking about what goes on in the background when they click “Login with …”?
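To make that a bit more concrete, here’s roughly what hides behind a single “Login with …” button. This is only a minimal sketch of the authorization-code flow, using Go’s golang.org/x/oauth2 package; the client ID, secret, and provider URLs are made-up placeholders, and it skips token refresh, session handling, and all the provider-specific quirks that produce the actual sweat.

```go
// A minimal sketch of the server side of an OAuth 2.0 authorization-code flow.
// Client ID, secret, and provider endpoints are placeholders, not a real provider.
package main

import (
	"context"
	"fmt"
	"net/http"

	"golang.org/x/oauth2"
)

var conf = &oauth2.Config{
	ClientID:     "YOUR_CLIENT_ID",     // placeholder
	ClientSecret: "YOUR_CLIENT_SECRET", // placeholder
	RedirectURL:  "http://localhost:8080/callback",
	Scopes:       []string{"profile", "email"},
	Endpoint: oauth2.Endpoint{ // provider-specific URLs, placeholders here
		AuthURL:  "https://provider.example.com/oauth/authorize",
		TokenURL: "https://provider.example.com/oauth/token",
	},
}

func main() {
	// Step 1: send the user off to the provider's consent screen.
	http.HandleFunc("/login", func(w http.ResponseWriter, r *http.Request) {
		// In a real flow the state parameter must be random and verified on
		// the way back to prevent CSRF -- one of many details the user never sees.
		http.Redirect(w, r, conf.AuthCodeURL("random-state"), http.StatusFound)
	})

	// Step 2: the provider redirects back with a one-time code, which we
	// exchange for an access token via a back-channel request.
	http.HandleFunc("/callback", func(w http.ResponseWriter, r *http.Request) {
		if r.URL.Query().Get("state") != "random-state" {
			http.Error(w, "state mismatch", http.StatusBadRequest)
			return
		}
		tok, err := conf.Exchange(context.Background(), r.URL.Query().Get("code"))
		if err != nil {
			http.Error(w, "token exchange failed", http.StatusInternalServerError)
			return
		}
		// Step 3: use the token to call the provider's API on the user's behalf.
		fmt.Fprintf(w, "logged in, token expires at %s", tok.Expiry)
	})

	http.ListenAndServe(":8080", nil)
}
```

And all the user ever sees is a button.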
Ever implemented something that works for hundreds of thousands of users at the same time? If so, did you ever hear a user say “wow, I can’t believe that a hundred thousand other people use this at the same time as me, without any problems, that’s magical”? Okay, didn’t think so.
Ever made a website look nearly exactly the same in four different browsers on desktop, on phones, and even on gaming consoles, only for one user to ask “why don’t you use this native element here?”
My point is: a lot of complexity is required just to meet today’s baseline expectations, and when you meet them, a user doesn’t scream out in joy; they just use it. Maybe that’s one reason why software development has become slower?
The other thought I have is even less defined and the one I’m going to leave you with in classic give-’em-something-to-chew-on-and-walk-out fashion.
Here it is: maybe, MAYBE, software development has become slower, because that’s just what happens when you add more people to software projects? And maybe that’s also what happened to the software ecosystem as a whole?
You start out knowing the whole stack of your project. That allows you to move fast because you know where everything is, and if you bump into a problem and something over here feels hard and cumbersome, you know that you can change something all the way over there, which will turn the problem you’re currently facing from I-need-to-duct-tape-this into Oh-now-it’s-just-a-matter-of-configuration.
But then you realise that won’t scale once you add more people, and you do want to add more people, because you want to ship more and ship faster. So you add another person to your project and say “you worry about this bit, I worry about that bit”. Then, later, when you bump into a problem similar to the one from before, you know your bit, but you can’t make the change all the way over there, because you have no clue how that system works anymore. So you need to wait for your buddy and ask them to change it for you.
You end up not knowing the full stack anymore because adding more people grows the stack in a way that makes it unknowable.
That happens on small projects with abstractions between programmers, it happens on large projects with abstractions between teams, and, I think, it happens in the larger software ecosystem. We build and share libraries that other people and companies and teams can use. We build our software on top of other people’s code that we haven’t looked at once. When you sit on the shoulders of giants – giants that are made up of millions and millions of lines of code – nearly everything becomes an I-need-to-duct-tape-this problem, because you can’t wait for a hundred people to make whatever change you want to make easy.
Does that mean I think the speed of software development is a function of programmers knowing the full stack? Well, I guess so. But there are caveats: what even is the “full stack”? Does it include the OS? The file system? The network? You have to draw the line somewhere, no? Even Carmack didn’t write his own OS to ship a game.
I think the most important bit is to know that adding more people and abstractions changes the speed with which you can make changes. We all know by now that adding more people to a project doesn’t make it ship faster. But do we all know that adding more people also influences (lowers) the top speed at which software can be developed?