To bring Cinematic Mode to the iPhone, we took a careful look at how master filmmakers use rack focus to add drama and emotion to the story.
On Hollywood shoots, focusing requires a team of talented experts. Like a cinematographer, who makes the creative call on what’s in focus and when that changes. And a focus puller, who makes sure the transition is smooth, the timing is perfect, and subjects are crisp.
Getting all of this to happen automatically on your iPhone was no easy feat.
We first had to generate high-quality depth data so that Cinematic mode knew the precise distance of people, places, and pets in a scene. And since this is video, we needed that depth data continuously, at 30 frames per second.
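To make the idea of continuous depth concrete, here is a minimal sketch, not Apple’s implementation: if every frame comes with a per-pixel depth map, the distance to a tracked subject can simply be read out frame by frame. The `subject_distance` helper and the toy 4×4 depth maps are illustrative assumptions.

```python
FPS = 30  # Cinematic mode records depth at 30 frames per second

def subject_distance(depth_map, subject_xy):
    """Look up the depth (in meters) at the subject's pixel location."""
    x, y = subject_xy
    return depth_map[y][x]

# Toy 4x4 depth maps for three consecutive frames: the tracked subject
# walks toward the camera, so its distance shrinks frame to frame.
frames = [
    [[5.0] * 4 for _ in range(4)],
    [[4.8] * 4 for _ in range(4)],
    [[4.6] * 4 for _ in range(4)],
]
distances = [subject_distance(d, (1, 2)) for d in frames]
print(distances)  # [5.0, 4.8, 4.6]
```

With a distance reading available on every frame, the system always knows how far away each candidate subject is when it decides where to place focus.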
We also trained the Neural Engine to work like these experts. It makes on-the-fly decisions about what needs to be in focus and applies smooth focus transitions when that changes. If you want creative control, you can always jump into the director’s chair and set focus manually, whether you’re shooting or editing.
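What makes a rack focus feel smooth is the shape of the transition, not just its endpoints. As a hedged sketch of the general technique (not Apple’s actual algorithm), an easing curve such as smoothstep ramps the focus distance gently, with zero velocity at both ends; the `rack_focus` function below is a hypothetical helper for illustration.

```python
def smoothstep(t):
    # Classic smoothstep easing: starts and ends with zero velocity,
    # so the focus change accelerates and decelerates gradually.
    return t * t * (3.0 - 2.0 * t)

def rack_focus(start_m, end_m, frames):
    """Plan a focus-distance ramp from start_m to end_m over `frames` frames."""
    return [
        start_m + (end_m - start_m) * smoothstep(i / (frames - 1))
        for i in range(frames)
    ]

# Rack focus from a subject 4 m away to one 1.5 m away over 5 frames.
plan = rack_focus(4.0, 1.5, 5)
print(plan)  # starts at 4.0, eases through 2.75 at the midpoint, ends at 1.5
```

A linear ramp would hit both endpoints too, but the abrupt start and stop would read as mechanical on screen; the ease-in/ease-out shape is what mimics a practiced focus puller’s hand.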
All of this is so computationally intensive that we needed a chip that could handle the workload. Enter A15 Bionic.
The computational power required to run machine learning algorithms, render autofocus changes, support manual focus changes, and evaluate every frame in Dolby Vision – all in real time – is astounding.
It’s like having Hollywood in your pocket.