Over the last few weeks, I’ve been listening to the Software Engineering at Google book. I’ve read one or two chapters over the past few years, but never the whole thing.
While listening to the chapters on testing, I’ve had something of a personal lightbulb moment. Or: smile-to-myself moment.
Here, let me set the scene by giving you some excerpts. This is from the first chapter on testing:
Small tests are the most constrained of the three test sizes. The primary constraint is that small tests must run in a single process. […] The other important constraints on small tests are that they aren’t allowed to sleep, perform I/O operations, or make any other blocking calls. This means that small tests aren’t allowed to access the network or disk.
Later, in the same chapter:
All tests should strive to be hermetic: a test should contain all of the information necessary to set up, execute, and tear down its environment. Tests should assume as little as possible about the outside environment, such as the order in which the tests are run. For example, they should not rely on a shared database.
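To make that concrete, here is a minimal sketch of what a hermetic test might look like in Python. The `UserStore` class and all names are my own invention for illustration; the point is that each test builds and tears down its own private environment instead of relying on a shared database:

```python
import sqlite3
import unittest


class UserStore:
    """Tiny store used only to illustrate hermetic testing."""

    def __init__(self, conn):
        self.conn = conn
        self.conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT)")

    def add(self, name):
        self.conn.execute("INSERT INTO users VALUES (?)", (name,))

    def count(self):
        return self.conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]


class HermeticUserStoreTest(unittest.TestCase):
    def setUp(self):
        # Each test gets its own private, in-memory database:
        # no shared state, no network, no disk, no ordering assumptions.
        self.store = UserStore(sqlite3.connect(":memory:"))

    def tearDown(self):
        self.store.conn.close()

    def test_add_increments_count(self):
        self.store.add("ada")
        self.assertEqual(self.store.count(), 1)


if __name__ == "__main__":
    unittest.main()
```

Because the database lives and dies with the test, running the suite in any order (or in parallel) gives the same result.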
And this one:
We like to say that “a test should be obvious upon inspection.”
The next chapter is on Unit Testing and contains these paragraphs:
When an engineer refactors the internals of a system without modifying its interface, whether for performance, clarity, or any other reason, the system’s tests shouldn’t need to change.
The takeaway is that after you write a test, you shouldn’t need to touch that test again as you refactor the system, fix bugs, or add new features.
A clear test is one whose purpose for existing and reason for failing is immediately clear to the engineer diagnosing a failure. Tests fail to achieve clarity when their reasons for failure aren’t obvious or when it’s difficult to figure out why they were originally written.
Two high-level properties that help tests achieve clarity are completeness and conciseness. A test is complete when its body contains all of the information a reader needs in order to understand how it arrives at its result. A test is concise when it contains no other distracting or irrelevant information.
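A small sketch of what "complete and concise" can look like in practice. The `calculate_total` function and its parameters are hypothetical, not from the book; what matters is that every value the assertion depends on is visible in the test body, and nothing else is:

```python
def calculate_total(prices, discount_percent=0):
    """Hypothetical function under test: sums prices, applies a percentage discount."""
    subtotal = sum(prices)
    return subtotal * (100 - discount_percent) / 100


def test_discount_is_applied_to_subtotal():
    # Complete: the inputs (40 + 60 = 100, 10% off) and the expected
    # result (90) are all right here -- no hidden fixtures or magic helpers.
    # Concise: nothing unrelated to the discount behavior appears.
    total = calculate_total(prices=[40, 60], discount_percent=10)
    assert total == 90


test_discount_is_applied_to_subtotal()
```

A reader diagnosing a failure here can see how the test arrives at its result without leaving the function.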
Alright, the scene is set.
Now, here’s what I was thinking while listening to these paragraphs: this would be a really nice codebase to program in.
No direct mention of non-test, production code and yet after hearing about how it’s tested I was convinced it must be nice to work with.
Needless to say, I’m not a member of the “ugh, you’re changing the code just for the tests?” club. In fact, I’ll never understand why that club exists.
If a system is easy to test, it’s usually easy to work with — easy to extend, easy to debug, easy to refactor. Maybe because adding tests is a form of extending, debugging, and refactoring a system.
The converse – a system that’s easy to work with is easy to test – is, at least in my experience, not necessarily true.
Google's codebase is probably among the cleanest I've ever worked in. Sure, it depends a lot on the team and how "legacy" the code is, but for the most part that held true over the years.
The main reason is the readability check when submitting code, but testing for sure contributed to it. When you follow good testing practices, it kinda forces you to think about your code in a cleaner way too.
One note about "All tests should strive to be hermetic: a test should contain all of the information necessary to set up, execute, and tear down its environment" - I think this should be extended to allow test suites, because that's what they were designed for. And I can attest from experience that Google did allow test suites.
Definitely sounds like a nice environment to code in!
I feel like the difficult part is actually defining public interfaces *within* your code base, so that you can write tests against those interfaces and the implementation is free to change.
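One way to picture this: write the test against a small contract that any implementation must satisfy, so the internals can be swapped without touching the test. Everything below (`DictCounter`, `ListCounter`, the contract check) is a hypothetical sketch of mine, not something from the book:

```python
class DictCounter:
    """One implementation: counts stored in a dict."""

    def __init__(self):
        self._counts = {}

    def increment(self, key):
        self._counts[key] = self._counts.get(key, 0) + 1

    def value(self, key):
        return self._counts.get(key, 0)


class ListCounter:
    """A different internal representation behind the same interface."""

    def __init__(self):
        self._events = []

    def increment(self, key):
        self._events.append(key)

    def value(self, key):
        return self._events.count(key)


def check_counter_contract(counter):
    # The test touches only the public interface (increment/value),
    # never the private fields -- so refactoring the internals
    # doesn't require changing this test.
    counter.increment("a")
    counter.increment("a")
    assert counter.value("a") == 2
    assert counter.value("b") == 0


# The same test runs unchanged against both implementations.
check_counter_contract(DictCounter())
check_counter_contract(ListCounter())
```

The hard part, as the comment says, is deciding where those internal interface boundaries should be; once drawn, the tests pin down behavior rather than implementation.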