The story is even better - Federighi mentioned at The Talk Show Live (https://youtu.be/IcyaadNy9Jk?si=rgADQYfXLda-LAbC, 27:50) that they had a process for 10.1/10.2, where they trialed migrating the file system - they simply ran a consistency check and then rolled it back again to get data on how it worked in the field 😱
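For anyone curious what that "trial migrate, check, roll back" trick could look like in the abstract, here's a minimal sketch. All the names are made up for illustration; the real migration worked on APFS metadata, not Python tuples:

```python
# Hypothetical sketch of the "dry run" idea: build the new-format
# metadata off to the side, run a consistency check against the old
# metadata, then throw the result away and report how it went.
import hashlib

def checksum(entries):
    """Stable fingerprint over a set of (path, size) metadata entries."""
    h = hashlib.sha256()
    for path, size in sorted(entries):
        h.update(f"{path}:{size}".encode())
    return h.hexdigest()

def dry_run_migration(old_metadata):
    """Convert metadata to the 'new' layout in memory, verify, discard."""
    new_metadata = [(path, size) for path, size in old_metadata]  # trial conversion
    ok = checksum(new_metadata) == checksum(old_metadata)         # consistency check
    # new_metadata is dropped here -- the device keeps running on the old FS
    return ok

old = [("/docs/a.txt", 120), ("/photos/b.jpg", 4096)]
print(dry_run_migration(old))  # -> True: report the result, then roll back
```

The clever part is that the rollback is free: nothing on disk was ever touched, so the worst case is a failed check and a telemetry report.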
Yes! Thank you! Someone else also linked me to this: https://x.com/thorstenball/status/1779562353924755595
I recall the update at the time, but I don't recall whether iOS 10 had the Files app yet. If the Files app wasn't a thing, that should reduce some of the randomness, because all the files are managed by the apps or the OS.
Did they push it to Macs first, before the iPhones and iPads?
I actually looked it up: they shipped the update to iOS first and then to macOS.
Nonetheless, a great deal of engineering.
I like the last line. Totally didn't see it coming.
It's not as bad as it looks. You already have fsck for both filesystems, so integrity checking is solved before you start. It sounds like the migration didn't actually move or touch file data, just metadata. And these systems almost certainly have some mechanism like Android where you have A and B bootloaders, so no matter how badly you screw up you can always boot the last known good config and at least have a recovery environment. And finally, it's Apple so you control HW and only have to support a fixed set of known good configurations. I can honestly see this being a one-man job.
90% of it is error checking.
I think I would want more than testing. For something like this, I would want something to *model* the stages of the process at high fidelity, so that each part of the implied state machine could be analyzed. I have not had a great deal of success employing this technique in my professional life, in part because I lack the mathematical foundations that I think are needed to do strong modelling, but nonetheless I feel that it's an underused approach. CPU designers use simulations extensively, but it seems to be quite uncommon outside of that.
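To make the "model the state machine" idea concrete, here's a tiny hand-rolled version: enumerate every reachable state of a simplified migration and inspect the set. The state names are invented for illustration; real modelling tools like TLA+ or Alloy do this far more rigorously, with invariants checked at every state:

```python
# A miniature state-space exploration: BFS over all states reachable
# from the starting state of a simplified migration state machine.
from collections import deque

TRANSITIONS = {
    "old_fs":        ["copy_metadata"],
    "copy_metadata": ["verify", "rollback"],
    "verify":        ["commit", "rollback"],
    "rollback":      ["old_fs"],
    "commit":        [],  # terminal: new filesystem is live
}

def explore(start="old_fs"):
    """Return the set of all states reachable from `start`."""
    seen, queue = set(), deque([start])
    while queue:
        state = queue.popleft()
        if state in seen:
            continue
        seen.add(state)
        queue.extend(TRANSITIONS[state])
    return seen

print(sorted(explore()))
```

Once you have the reachable set, you can assert properties over it, e.g. that "rollback" always leads back to "old_fs", or that no state is a dead end other than "commit". That's the analyzable-state-machine payoff in miniature.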