Apple’s launch events have a signature style, and the iPhone 13 event was no different. The company spent much of it talking up Cinematic Mode on the iPhone 13. The feature has received mixed reactions, with some questioning its utility. Now an Apple executive has explained how Cinematic Mode works and the thought process behind it.
Kaiann Drance, Apple’s VP of iPhone Product Marketing, spoke with TechCrunch at length about the thought process behind Cinematic Mode.
“We knew that bringing a high quality depth of field to video would be magnitudes more challenging [than Portrait Mode],” says Drance. “Unlike photos, video is designed to move as the person filming, including hand shake. And that meant we would need even higher quality depth data so Cinematic Mode could work across subjects, people, pets, and objects, and we needed that depth data continuously to keep up with every frame. Rendering these autofocus changes in real time is a heavy computational workload.”
According to Drance, Cinematic Mode relies heavily on the A15 Bionic and its Neural Engine. Cinematic Mode also records in Dolby Vision HDR, and even the encoding requires the Neural Engine. Offering a live preview of the depth effect is usually tricky, but Apple has been doing it for Portrait mode all along.
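Apple has not published Cinematic Mode’s internal pipeline, but the kind of continuous, per-frame depth stream Drance describes is visible in AVFoundation’s public AVCaptureDepthDataOutput API. The sketch below only illustrates that building block; the class name and configuration choices are illustrative, not Apple’s implementation.

```swift
import AVFoundation

// Minimal sketch: stream a depth map for every frame from the back camera.
// A real app must also pick a device.activeFormat whose
// supportedDepthDataFormats is non-empty, or no depth will be delivered.
final class DepthStreamer: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        session.beginConfiguration()
        // The dual wide camera provides disparity-based depth on recent iPhones.
        guard let camera = AVCaptureDevice.default(.builtInDualWideCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
        // Temporally smooth the depth data, as a video effect needs.
        depthOutput.isFilteringEnabled = true
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.commitConfiguration()
        session.startRunning()
    }

    // Called once per depth frame; a bokeh renderer would consume this map.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let map = depthData.depthDataMap  // CVPixelBuffer of depth/disparity
        _ = map
    }
}
```

Consuming and rendering that depth map against 4K Dolby Vision video at 30 frames per second is exactly the “heavy computational workload” Drance points to, which is why Apple leans on the A15’s Neural Engine.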
Birth of Cinematic Mode
Apple says it didn’t start out with the idea of creating Cinematic Mode. The team brainstormed about filmmaking and began exploring related avenues, then researched cinematography techniques and ways to implement realistic focus transitions. Apple took inspiration from classic techniques and added a modern touch.
The design process involved studying the work of portrait artists like Avedon and Warhol. In some cases the team went to examine the original pieces in person, and it followed a similar process to develop Cinematic Mode.
Apple worked closely with directors of photography, camera operators, and others involved in focus pulling, which helped the team navigate the intricacies of designing a portrait mode for video.
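Cinematic Mode automates the rack focus those operators perform by hand, but AVFoundation does expose manual lens control. This hypothetical helper shows what a scripted focus pull looks like in code; the easing curve and step count are illustrative assumptions.

```swift
import AVFoundation

// Hypothetical rack-focus helper: ramp the lens from one position to another
// over a set duration. Call off the main thread; it blocks between steps.
// lensPosition runs from 0.0 (nearest) to 1.0 (farthest).
func rackFocus(on device: AVCaptureDevice,
               from start: Float, to end: Float,
               over duration: TimeInterval, steps: Int = 30) throws {
    guard device.isFocusModeSupported(.locked) else { return }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    for i in 0...steps {
        let t = Float(i) / Float(steps)
        let eased = t * t * (3 - 2 * t)  // smoothstep: ease in and out
        device.setFocusModeLocked(lensPosition: start + (end - start) * eased,
                                  completionHandler: nil)
        Thread.sleep(forTimeInterval: duration / Double(steps))
    }
}
```

The easing matters: a camera operator’s pull accelerates and settles rather than moving linearly, which is part of what makes Cinematic Mode’s transitions read as “cinematic.”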
What Exactly is Cinematic Mode?
Cinematic Mode on the iPhone 13 Pro uses signals from the accelerometer to anticipate whether you are about to move toward or away from the subject. You can judge how well Cinematic Mode performs by checking these music videos that were shot entirely on iPhone 13.
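Apple hasn’t detailed that accelerometer heuristic, but the raw signal is available through CoreMotion. This sketch shows the general idea under loudly labeled assumptions: the axis choice and thresholds are guesses for illustration, not Apple’s actual logic.

```swift
import CoreMotion

// Illustrative only: watch acceleration along the device's z axis (roughly
// the camera axis) to guess whether the phone is moving toward or away from
// the subject. Thresholds and axis handling are assumptions, not Apple's.
let motion = CMMotionManager()

func watchApproach() {
    guard motion.isAccelerometerAvailable else { return }
    motion.accelerometerUpdateInterval = 1.0 / 60.0  // roughly once per frame
    motion.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        if a.z < -0.05 {
            print("moving toward subject: bias focus nearer")
        } else if a.z > 0.05 {
            print("moving away from subject: bias focus farther")
        }
    }
}
```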
[via TechCrunch]