In particular, the ability to adjust the focus points in editing makes it an extremely powerful tool. Honestly, it’s manual mode that has me most excited…
Some of my tests were of Steph during tango lessons with the amazing Diego Bado. Getting authorization to share dancing videos can be challenging, but she did agree to one clip of her receiving feedback.
Cinematic Video test – automatic
With this first clip, I decided to see what the AI system would do if left entirely to its own devices. I literally just put the camera into Cinematic Video mode, started recording, and let the app make all the decisions.
You can see this if you look at the video in edit mode. White dots indicate automatic focus changes chosen by the AI, while yellow circles indicate manual selections. (The yellow manual one is just after this clip.)
The AI’s decision-making was impressive. Diego was mostly facing the camera, so for most of the time it chose him as the subject. The only time this changed was when I walked behind Steph, blocking Diego from view.
It did stick with her a bit longer than I would have expected, given she was facing away from the camera, but that would be an artistic judgement if a human were making the decision, so it’s interesting rather than wrong. Perhaps it tries to avoid rapid switches.
Let’s take a look:
I’d have to say this is very rough and ready. There’s the artificial equivalent of focus-hunting, and there are some glitches and artifacts.
As we’ve seen with other samples, the focus system struggles with small gaps between arms and bodies, and has some of the background in focus. Compare these two stills, looking at the gap inside Steph’s arm. All is good at first:
Then as the gap decreases in size, suddenly we have an element of the background in focus:
This was, however, a fraction of a second and I honestly don’t think I’d have noticed it had I not gone specifically looking for it frame by frame.
Overall, I’d have to give this a 5/10. It’s nothing like good enough for real work, but I do think many consumers are already going to find this impressive.
Cinematic Video test – manual
This second test was with me manually choosing the focus points. I haven’t yet figured out how to completely prevent it from doing automatic ones, so that’s something I need to play with. You can see that most changes are yellow, meaning I made them, while there are still white (automatic) ones too.
Let’s take a look:
This, I think, is significantly better.
Not perfect, of course. There are those unwanted automatic focus changes, indicated by the white dots, and sometimes the focus isn’t quite right, but I’d have to say the overall result is really impressive – especially in such low light, where a conventional autofocus system would really struggle.
I’m sure there’s a simple way to avoid/remove the automatic changes (the only way I’ve found so far is to effectively overwrite them with manual ones, but that gets fiddly on an iPhone-sized screen – even the Pro Max). Editing will of course be much easier when we can do it on the Mac.
There was also a brief period of a few seconds where it refused to do face-tracking, and instead could only offer me a wide-area focus, despite the indicator showing that it could detect the face:
So the focus here was more on a section of the guitar than Martin’s face.
But let’s put these quibbles into context. First, this is a first-generation version of the technology – really, a public beta. It’s clear that Apple knows there are glitches, and I’m equally sure the results will quickly improve, exactly as we’ve already seen with Portrait mode stills.
Second, this is a smartphone! Not only that, but in the first example I showed, the results were completely automatic! Anyone can pick up the iPhone 13 Pro and start getting what are already good enough results for consumers by doing nothing more than selecting the mode and pressing the record button. Honestly, that blows me away.
Could it be used by amateur filmmakers? I’m planning to put that to the test. I’m going to film the first few minutes of a screenplay I’m writing (memo to self: insert lots of “first draft” disclaimers in that piece!). I’ll set up and light the scene exactly as I would with proper filmmaking kit, and then see how the camera does in both automatic and manual modes.
My guess is that it won’t be good enough. I’m expecting some distracting glitches, especially in automatic mode. But I do think Apple will get there within a year or two, maybe sooner. So, watch this space for a more controlled test in a week or two.
Have you been doing your own Cinematic Video tests? Please share your thoughts and experiences in the comments, and link to any examples (put a space in the link to avoid Disqus spam trapping).