I did this work as an earlier part of my EngD and it’s written up in the Elsevier Computers & Graphics journal (free preprint version). The idea came from earlier work
by Klein & Drummond, who used motion blur to produce a single-frame visual gyroscope. We had a go at extending this to see whether you could accurately
determine changes in focal length from motion blur alone.
In reality, we found that images had to be blurred in such a way as the above picture for accurate results, and in practical VFX work we rarely see
frames with this kind of blur. However, focal distance does change during a shot, often slowly and, on some lenses, subtly - and that can make matchmoving harder, especially if the lens introduces a phenomenon known as anamorphic breathing. Here’s a great video of that effect. To this end, VFX companies and lens manufacturers themselves often invest in on-set hardware that tracks the position of the lens barrels, synchronised to the video timecode. These encoders can accurately report the focal settings at any point in time.
Often though, because of the nature of working on set, timecodes fall out of sync, data storage cards get mislabelled, or last-minute changes mean it’s impossible to calibrate equipment. Edits before shot turnover can also wipe metadata. When that happens, we wanted a way to get, very quickly, a rough registration between the stream of data from an encoder and the frames themselves, and we found this method to be a good ‘first resort’ for doing so.
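As a rough illustration of what that registration step looks like, one simple approach is to treat both the encoder readings and a per-frame signal estimated from the footage as 1D time series, and find the offset that best aligns them via cross-correlation. This is a minimal sketch under assumptions of my own - the function name and the synthetic "focus pull" signal are illustrative, not the paper's exact method:

```python
import numpy as np

def estimate_offset(frame_signal, encoder_signal):
    """Return the lag (in samples) that best aligns encoder_signal with
    frame_signal, using normalised cross-correlation. A positive lag means
    frame_signal is delayed relative to encoder_signal."""
    a = (frame_signal - frame_signal.mean()) / (frame_signal.std() + 1e-12)
    b = (encoder_signal - encoder_signal.mean()) / (encoder_signal.std() + 1e-12)
    corr = np.correlate(a, b, mode="full")
    # In 'full' mode, index k corresponds to a shift of k - (len(b) - 1).
    return int(np.argmax(corr)) - (len(b) - 1)

# Synthetic check: a slow focus pull with a little noise, where the
# frame-derived signal lags the encoder stream by 7 samples.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 210)
full = np.sin(2 * np.pi * 1.5 * t) + 0.05 * rng.normal(size=t.size)
frames = full[:200]        # per-frame estimates
encoder = full[7:207]      # encoder stream, leading the frames by 7 samples
print(estimate_offset(frames, encoder))  # → 7
```

In practice the per-frame signal would come from the footage itself (e.g. an estimate of focal change per frame), and a coarse alignment like this would then be refined against timecode.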