The following is an informational look at the most recent firmware update for the Sony a7R III and a7 III, created by PixelShift to help explain the new features and the technology behind it. This document is not an official Sony news item.
For the official Sony information, please see this article.
Sony’s Firmware update version 3.0 for the a7 III and a7R III brings a more accurate, immediate and versatile eye-detection AF system to Sony shooters. It includes both Real-Time Eye AF and Real-Time Eye AF for animals, which harness the power of AI and machine learning to usher in a new era of autofocus.
Sony’s cameras are already the industry leaders in eye autofocus. Firmware version 3.0 brings a new level of power and flexibility to photography through improved eye autofocus for both humans and animals.
To better understand the new Real-Time AF, let’s take a look at how eye detection traditionally works and what AI brings to the system.
Tracking an eyeball with a camera is a difficult task. For a camera to find and follow an eye, it must first figure out what in the scene is or isn’t an eye.
Unlike DSLRs, mirrorless cameras use their sensors as both the focusing system and the imaging system, which allows the camera’s processors to use real-time sensor data to help “look” for objects in the scene without the interruption of a physical mirror getting in the way.
Human brains are hard-wired by evolution to instantly recognize human and animal faces, thanks to some pretty complex pattern recognition and our binocular vision.
Digital cameras have taken a more rudimentary approach to finding a subject’s eyes, using the common geometry of faces to help with the task. Find these shapes, and you find a face.
While this works well (and Sony’s eye AF is known to work particularly well), there are still limitations to the performance of eye detect in a standard autofocus system.
These systems aren’t as good at finding eyes if the face is partially obscured, or if the face is pointing in such a way that the camera can’t see the other geometric features. When a traditional eye detect system loses sight (pardon the pun) of the eyes, it has to start over and find a face all over again before it can find the subject’s eyes.
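To make that limitation concrete, here is a deliberately toy sketch of the restart-on-loss behavior described above. This is not Sony’s algorithm or anything like a real detector; the landmark names and frame structure are invented purely for illustration.

```python
# Toy model of a "traditional" eye-detect loop: the face lock is dropped
# the moment the eyes disappear, and the face search must start over.
# All names and data structures here are hypothetical, for illustration only.

def detect_face(frame):
    # Hypothetical geometric test: a "face" is any frame reporting the
    # expected set of landmarks (two eyes, nose, mouth).
    landmarks = frame.get("landmarks", set())
    if {"left_eye", "right_eye", "nose", "mouth"} <= landmarks:
        return frame["face_region"]
    return None

def traditional_eye_af(frames):
    """Yield the focus state for each frame; restart the face search on loss."""
    face = None
    for frame in frames:
        if face is None:                  # no lock: must find a face first
            face = detect_face(frame)
        if face is not None and "left_eye" in frame.get("landmarks", set()):
            yield "eye"                   # eyes visible: focus on the eye
        else:
            face = None                   # eyes lost: drop the lock entirely
            yield "searching"
```

Note that a single occluded frame throws the loop all the way back to “searching,” which is the workflow cost the firmware update addresses.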
Eye Detection and Artificial Intelligence
The new firmware update for the a7R III and a7 III ushers in a new system for detecting faces and the eyes of a subject, one based on the power of machine learning. With machine learning, a computer AI system is fed a vast amount of data, and the AI figures out how to recognize that data.
Sony has used machine learning for Real-time Eye AF, and the resulting database of information about faces is used in-camera to more quickly find a subject’s face and eyes.
Real-time AF is based on several interrelated factors—real-time processing speed, the subject data collected through the image sensor, the machine-learning face database, and the camera’s recognition algorithm—which relates the information from the scene to the faces in the database.
The roles of the real-time processing speed and the recognition algorithm are at least as important as the machine learning itself. Without the speed and the algorithm, the data wouldn’t be of much use.
The library of information itself isn’t enough to do the job; the processor in the camera must be fast enough to be able to handle that data as well.
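As a rough mental model of how those pieces relate, consider this toy sketch: a “learned database” is just stored feature data, and the recognition algorithm is whatever compares incoming sensor data against it, frame after frame. The feature vectors and the nearest-match test below are invented for illustration and bear no relation to Sony’s actual implementation.

```python
# Toy illustration only: a "face database" of learned feature vectors,
# plus a simple recognition algorithm (nearest-neighbour distance test)
# that must run fast enough to be queried on every frame.

import math

FACE_DATABASE = [          # hypothetical learned feature vectors
    (0.9, 0.1, 0.4),
    (0.2, 0.8, 0.5),
]

def match_face(scene_features, threshold=0.3):
    """Return True if the scene features closely match any learned face."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return any(dist(scene_features, f) < threshold for f in FACE_DATABASE)
```

The point of the sketch is the division of labor: the database alone decides nothing; the per-frame matching (and the processor running it) is what turns stored knowledge into autofocus.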
Since Sony designs the imaging sensors and the processors in the Alpha cameras, those two key components work together to make for super-fast Eye AF.
Real-Time Eye AF
One of the most powerful AF advances that comes with Real-time Eye AF in Firmware update version 3.0 is that the camera can re-acquire the eyes if they are obscured or if the subject turns away, without first having to lock onto a face. It doesn’t lose focus when the eyes are obscured from view, so it can pick right back up focusing when the eyes are visible again. The result is an eye detection system that’s much faster than any previous system.
The speed and accuracy of the eye detection in this update now makes it possible to have eye detect AF active any time the shutter is half-pressed, instead of having to use a button to trigger eye detection. Full-time eye detection is a huge workflow advantage since it eliminates the need to let the camera find a face, and then wait for the photographer to press a button to engage eye detection.
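The re-acquisition behavior described above can be contrasted with the old restart-on-loss loop in a toy sketch: here, eye detection runs directly on every frame with no separate face-lock step, so the lock survives a brief occlusion. As before, the frame structure and names are hypothetical, not Sony’s implementation.

```python
# Toy sketch: per-frame eye detection with a persistent lock.
# When the eyes are hidden, the subject state is held rather than
# discarded, so focus resumes the instant the eyes reappear.

def realtime_eye_af(frames):
    """Yield the focus state for each frame; re-acquire eyes directly."""
    last_eye = None
    for frame in frames:
        eye = frame.get("eye")            # direct, per-frame eye detection
        if eye is not None:
            last_eye = eye
            yield "eye"
        elif last_eye is not None:
            yield "holding"               # keep the lock while eyes hidden
        else:
            yield "searching"
```

Compared with the traditional loop, the occluded middle frame yields “holding” instead of “searching,” which is the behavioral difference the firmware update delivers.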
Real-Time Eye AF for Animals
If you think it’s hard for a basic focusing system to find a human face and eyes, imagine how hard it is to find the eyeballs on an animal. Between faces covered in fur and features like elongated muzzles (on dogs) or short button noses (on cats), standard geometric processing just doesn’t cut it. Only Sony’s mirrorless camera system is capable of useful results under these conditions.
Firmware Update version 3.0 also brings with it the first real-time tracking of an animal’s eyes. In this firmware update, the AF works reliably with dogs and cats, and future updates will bring recognition and tracking for other wildlife, including birds in flight.
The Future of AF
There is no doubt that artificial intelligence is the future of autofocus, and with the new Firmware Update version 3.0 for the Sony a7 III and Sony a7R III, that future has arrived.