Allergic to Waiting
Is one second fast or slow?
Many times in your day-to-day programming life you have to wait. Wait for your development environment to boot up, wait for the formatting-on-save command to finish, wait for the website you just opened to load, wait for tests to run, wait for the CI build to finish.
The waiting doesn’t really cause me physical pain, but it does evoke a physical reaction alright. I just can’t stand it. Maybe it’s because I’m impatient by nature. Maybe it’s knowing that things could be faster that causes it. When I have to wait ten seconds for a test to finish that I plan to run many times over the next hour, I tell you, it feels as if I’m about to lose my mind.
Then, when I’m down on my knees, screaming at the sky, asking the universe why this is taking so long, asking for salvation, and I look to my left and my right – no one’s there. Everybody’s sitting at their desk, shrugging.
This isn’t meant literally. I work remotely. If I truly expected someone to be screaming by my side while I wait for a website to load then, well, I guess I really did lose my mind.
The point is: it sometimes feels as if others don’t care as much. Or maybe they do care, but their pain threshold is higher, their judgement of what’s fast and what’s slow is different than mine. What I perceive as unacceptable isn’t perceived by them at all.
A few months ago I was on a call with a colleague, pairing on something. He had to run a command to re-generate some files. “Do you really want to do this now? This is going to take like ten minutes” he said. “What the fuck” went through my mind and “Wait, what?” is what I said. I knew that command and I knew that it takes 15 seconds to run on my machine, 30 tops. Ten minutes? That’s too slow. Something must be wrong, I thought, and something was wrong.
I asked him to run the command, because, yes, I need to see this, there’s no way it should take this long. So he ran the command and we looked into what was going on. I’ll keep it short: the command spawned a shell, but my colleague’s shell setup was borked and caused another shell to spawn, which caused another shell to spawn, which… The command that ran in 15 seconds on my machine was a fork bomb on his. It’s still not clear to me how it would ever finish for him.
The even bigger mystery: how he would’ve just shrugged it off if I hadn’t been there.
One theory I have: it might be due to a lack of a mental model. If you don’t know what that command is doing under the hood – finding the tens of files out of a hundred thousand that need to be recreated and running a generation command for each one – then maybe you think ten minutes is a reasonable time to wait for it to finish. There are things that can take a long time, ten seconds or ten minutes, without anything being wrong: booting up a computer, downloading large files, re-encoding video files, extracting big archive files, … If you don’t know whether what you’re currently waiting for falls into that category or not, then you just wait, I guess.
But if the lack of a mental model is the reason, then I should see more people wince when a website takes longer than two seconds to load. There are very few reasons most websites should take that long. Yet many times I’ve sat with colleagues, watched a website we built load for longer than two seconds, said “something’s off, I bet there’s an N+1 query here” – and turned out to be right. Nobody else had noticed anything.
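For anyone who hasn’t had the displeasure: N+1 means one query to fetch a list, then one more query per item in that list. A toy illustration (not the actual site from the story) with sqlite3, counting the queries each approach issues:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'ada'), (2, 'grace');
    INSERT INTO posts VALUES (1, 1, 'one'), (2, 2, 'two'), (3, 1, 'three');
""")

def n_plus_one(conn):
    # 1 query for the posts, then 1 query per post for its author.
    queries = 1
    posts = conn.execute("SELECT author_id, title FROM posts").fetchall()
    result = []
    for author_id, title in posts:
        row = conn.execute(
            "SELECT name FROM authors WHERE id = ?", (author_id,)
        ).fetchone()
        queries += 1
        result.append((title, row[0]))
    return result, queries

def single_join(conn):
    # The same data in one query.
    rows = conn.execute("""
        SELECT posts.title, authors.name
        FROM posts JOIN authors ON authors.id = posts.author_id
    """).fetchall()
    return rows, 1
```

With three posts that’s 4 queries versus 1; with a thousand posts, each paying a network round trip to the database, you get your multi-second page load.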
I’m not a computer performance savant. I can’t put my ear to a machine and guess the clock rate. I just get allergic reactions when I have to wait for a computer for an unreasonable amount of time. So is it maybe that others don’t think that one second can be an unreasonable amount of time for a computer to do something? Is one second considered fast?
This week I deleted a test in which someone had put a 1-second sleep statement. I ripped out the test for another reason, but seeing that sleep statement… Again: not physical pain per se, but I also didn’t feel comfortable.
Computers can create and destroy entire worlds in one second. One second is multiple billions – billions! – of executed instructions. One second is an eternity for a computer.
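You can get a feel for this yourself. A minimal, machine-dependent sketch: count how many trivial additions even a slow, interpreted Python loop manages in a tenth of a second. It lands in the hundreds of thousands to millions; compiled code on a modern CPU executes billions of instructions in a full second.

```python
import time

def ops_in(seconds):
    """Count loop iterations (an addition plus a clock check each) that
    fit into the given time window. Deliberately naive: the point is the
    order of magnitude, not a benchmark."""
    deadline = time.perf_counter() + seconds
    count = 0
    while time.perf_counter() < deadline:
        count += 1
    return count
```

Running `ops_in(0.1)` on any recent machine returns a number with six or seven digits – and that’s Python, paying interpreter overhead on every iteration.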
Yet I sometimes wonder whether one second is the smallest unit of time most programmers think in. Do they know that you can run entire test suites in 1s and not just a single test? Do they know that one second is slow?
Or do they think one second is acceptable, because my generation of programmers grew up on the Internet? On the Internet Jeff Dean’s Numbers Everyone Should Know seem archaic if not irrelevant: why worry about nanoseconds when pinging the nearest Google server takes 13 milliseconds? On the Internet, one second doesn’t seem too bad, maybe.
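Except those numbers are exactly what puts the 13 milliseconds in perspective. Using the widely circulated approximate figures (the point is the ratios, not the exact values):

```python
# Approximate latencies in nanoseconds, from the well-known
# "numbers everyone should know" list.
MAIN_MEMORY_ACCESS_NS = 100
DATACENTER_ROUND_TRIP_NS = 500_000
PING_NEAREST_GOOGLE_NS = 13_000_000  # the 13 ms from above

# One ping buys you roughly 130,000 main-memory accesses:
memory_accesses_per_ping = PING_NEAREST_GOOGLE_NS // MAIN_MEMORY_ACCESS_NS
```

And a full second is still ~77 of those pings laid end to end. The network being slow doesn’t make one second fast.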
So is it the Internet, with its undersea cables and packets that are sent around the world, that moved the bar for what’s an acceptable speed and what isn’t? Maybe.
It’s probably a combination of it all: not knowing what computers are capable of, being unsure what a reasonable time for a given task is, and everything going over a network, which makes us think a single millisecond isn’t relevant to the work we do.
Thanks for reading.