Jonathan Morrison, a videographer, YouTuber, and digital creator, collaborated with singer Julia Wolf to film two music videos on the iPhone 13 using Cinematic mode with no additional equipment. We believe the results speak for themselves!
Both videos were shot at golden hour using just an iPhone 13 Pro with Cinematic mode enabled, without any additional equipment such as artificial lighting, gimbals, camera mounts, or external microphones.
In the video description, Morrison wrote, “Went hands-on with iPhone 13 Pro and immediately wanted to test out the camera and cinematic mode. It’s limited to 1080p 30fps but I was surprised to see how sharp it was and that it retained Dolby Vision.”
The video below is a multi-camera shoot for the music video of Julia’s cover of Ed Sheeran’s “Shivers” with a twist, featuring Aerial View. It, too, was filmed with an iPhone 13 using Apple’s Cinematic mode and Dolby Vision color grading, with no additional equipment.
Given how they were filmed, these videos are proof that with decent editing skills and some spectacular content to shoot, an iPhone is sufficient to deliver impressive results in the camera department.
However, not everything is perfect with this version of Cinematic mode. Not all scenes in the videos use the mode, and in those that do, we can see distracting blur around Wolf’s head, hair, and even her armpits. In some shots, the singer’s silhouette also lacks a sharp edge. Apple clearly has some work to do on edge detection and contextual awareness in the iPhone 13’s scene detection algorithms.
A similar sentiment was voiced by Joanna Stern of the Wall Street Journal, who uploaded a video review dedicated to the new Cinematic mode. Cutting through the hype, she made it clear that she was disappointed by the poor edge detection. She said,
“With videos, gosh, I was really excited about the new Cinematic mode. Aaaand gosh, was it a letdown. The feature — which you could call “Portrait mode for video” — adds artistic blur around the object in focus. The coolest thing is that you can tap to refocus while you shoot (and even do it afterward in the Photos app).
Except, as you can see in my video, the software struggles to know where objects begin and end. It’s a lot like the early days of Portrait Mode, but it’s worse because now the blur moves and warps. I shot footage where the software lost parts of noses and fingers and struggled with items such as a phone or camera.”
This is indeed reminiscent of the early days of Portrait Mode as a camera feature where edge detection was mostly hit-or-miss. Stern mentioned that Apple executives said Cinematic mode was a “breakthrough innovation that will keep getting better over time.” Well, that better be the case!