
iPhone 13 Diary: Cinematic Video for amateur filmmaking put to the test

I said last time that I wanted to do a proper test of the potential of the iPhone 13’s Cinematic Video for amateur filmmaking. My initial experiments had suggested that it wasn’t yet ready for prime time, but those had been in mixed conditions.

I wanted to conduct a fair test, with the scene and lighting set up exactly as they would be by a typical low-budget hobbyist filmmaker. I did this by borrowing a couple of actors and filming the first scene of a (very much first draft!) pilot episode of a sitcom …

Setup

I used just one dedicated video light, a Godox SL-60W, which is a budget video light popular with amateur filmmakers. In true frugal filmmaker style, I added a low-cost honeycomb grid to control the spread of the light and a folded piece of white net material to soften it. A piece of wood was used as a ‘flag’ – to further block spill from the light.

The second light was a ‘practical’ light – that is, a household floor lamp which turned out to be perfect for lighting the second actor. This had a Philips Hue bulb to allow me to control the brightness.

Finally, I used a low-cost tripod adapter to mount the iPhone. The total cost of the setup was under $200, so a realistic one for a young person wanting to take their first steps into filmmaking – though we will see one exception to this, in the bad news section.

The sample footage

Let’s start by taking a look at the result before we talk about it. It’s best watched in full-screen to see all the glitches, and make sure the quality is set to 1080p.

I wanted this to give a realistic look at what raw footage will look like, so all the focus changes are exactly as shot (though note the problematic timing issue below). There is no color correction or grading – I just converted from HDR to SDR (REC 709) – so this is essentially straight from camera.

The bad news

There are a number of downsides to Cinematic Video on the iPhone 13. Spoiler alert: The biggest of these is that it’s simply not yet good enough – but we’ll get to that in the conclusion. In the meantime, let’s address some of the specific weaknesses.

It’s very common in filmmaking to want some shots to have rather low lighting. This can create drama, and – like selective focus – provide another means of removing distracting elements from the frame, ensuring that all the attention is on the actors.

With cinema cameras, this is not usually an issue – you can set the exposure accordingly. However, the iPhone 13 initially complained that the light was too low for Cinematic Video mode, and I had to boost it a little higher than I would like.

I mentioned last time that the AI-powered focus-selection feature in Cinematic Video was impressive in a technical sense, but didn’t really appear ready for use.

I’d have to say this is very rough and ready. There’s something like an artificial equivalent of focus-hunting, and there are some glitches and artifacts.

This test in controlled conditions absolutely confirmed that amateur filmmakers will not want to use this feature in its current form. It made some very random focus switches, and after an initial test I switched entirely to manual focus control.

As with my informal tests, I couldn’t find any way to completely disable AI-powered focus shifting, which means that single-tapping for a focus change isn’t reliable: you need to double-tap to enable focus-tracking mode.

This creates a big problem …

One huge limitation I found is that you cannot realistically use the built-in iPhone microphones.

This is not because they aren’t effective – indeed, they proved remarkably good in these controlled conditions – but rather that the double-tapping necessary for manual focus changes is loud enough to be picked up by the internal mics.

I guess that’s inevitable: a reliable tap has to be quite firm, and the microphones are right next to the screen, so if you want the sound levels to be high enough to record someone 10-12 feet away from the camera, then screen taps are going to sound loud.

So in reality, while the video only needed less than $200 worth of additional kit, I had to use a separate sound recorder costing about the same again. I then synced the separate audio and video tracks in Final Cut Pro (there’s an automated feature which makes this completely painless).
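Final Cut Pro handles the sync automatically, but the underlying idea – lining up two recordings by their waveforms – is simple enough to sketch. Here’s a minimal, hypothetical illustration using NumPy’s cross-correlation to estimate the offset between a camera’s scratch audio and an external recording (the `sync_offset` helper and the toy signals are my own for illustration, not any shipping tool’s API):

```python
import numpy as np

def sync_offset(reference, external, sample_rate=48_000):
    """Estimate the lag (in seconds) of `external` relative to
    `reference` by finding the peak of their cross-correlation."""
    corr = np.correlate(external, reference, mode="full")
    lag_samples = np.argmax(corr) - (len(reference) - 1)
    return lag_samples / sample_rate

# Toy demo: a noise burst, and the same burst delayed by 0.5 s.
rng = np.random.default_rng(0)
burst = rng.standard_normal(4_800)
reference = np.concatenate([burst, np.zeros(48_000)])
external = np.concatenate([np.zeros(24_000), burst, np.zeros(24_000)])

offset = sync_offset(reference, external)  # ≈ 0.5 s
```

In practice you would then trim or pad the external track by the estimated offset before laying it under the video – which is essentially what the automated sync feature does for you.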

That said, spending money on audio kits should be expected by any filmmaker at any budget level. When I first started researching filmmaking, all the advice was that audio is at least as important as video – and some might say more important. People will watch low-quality video if they can clearly hear the dialogue, but not the reverse. So it’s not unreasonable to expect even iPhone filmmakers to spend some cash on at least a sound recorder. If I were doing this properly, I’d also want to use lapel mics.

With a normal camera, you would typically hold the focus on the character speaking, then move the focus to the second character. (Of course, there are plenty of exceptions, where the reaction is what we want to see, but this would be the default.) I timed my focus changes in this way. However, rather than beginning the focus rack at the tap, the iPhone completes it at that point.

This means that the focus begins to shift before you tap to change it – which can be seen in the sample footage.

You also don’t get to control the speed of the focus change. You might want to do a very quick change sometimes, and a more relaxed one at another time to suit the mood of the scene or action, but you don’t have that degree of control.

The good news

There are a few pieces of good news.

What I generally like to do when filming is use breaks to transfer footage to my Mac. That achieves two things. First, it means I have an immediate backup of the video. Second, the actors and I can then review the footage on a large screen.

I attempted this using Image Capture, thinking that a cable transfer might be better, but that gave me a ‘flat’ video file, plus two sidecar files that were ignored when I imported into the Photos app on the Mac.

For this reason, we ended up reviewing the footage on the iPhone itself, which was less than ideal.

However, I subsequently discovered that if you instead AirDrop the footage, then you get the version with the selective focus.

The corollary of this is that it’s easy to get both versions of the same clip: AirDrop for the selective-focus version, transfer with Image Capture for the ‘flat’ one.

This was great in these particular circumstances, as the deal with the actors was that they would get material for their showreels – and frankly, the Cinematic Video versions aren’t good enough for this. But even when the feature improves, some actors might want to be in focus the whole time, so that we see their reactions as well as their delivery of dialogue – which makes this really handy.

There is still some depth of field in the standard version of the video, and the focus is controlled in the same way, so you effectively get both an artificial and natural version of the footage. However, the small sensor of a smartphone compared to a dedicated camera means that this effect is rather subtle.

Real depth of field control from a behind-the-scenes shot
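That subtlety is straightforward optics: for the same framing and f-stop, depth of field grows as the sensor shrinks. A rough sketch using the standard thin-lens depth-of-field formulas makes the gap concrete – the focal lengths and circle-of-confusion values below are illustrative assumptions, not Apple’s specifications:

```python
def depth_of_field(focal_mm, f_number, distance_m, coc_mm):
    """Total depth of field (m) from the thin-lens approximation."""
    f = focal_mm / 1000.0           # focal length in metres
    c = coc_mm / 1000.0             # circle of confusion in metres
    H = f * f / (f_number * c) + f  # hyperfocal distance
    s = distance_m
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return far - near

# Assumed numbers: a phone-sized sensor (short lens, tight CoC)
# vs. a full-frame camera framing the same subject at 3 m, both f/1.6.
phone = depth_of_field(focal_mm=5.7, f_number=1.6, distance_m=3, coc_mm=0.006)
full_frame = depth_of_field(focal_mm=26, f_number=1.6, distance_m=3, coc_mm=0.029)
# phone: tens of metres in focus; full_frame: just over a metre
```

At the same aperture and framing, the phone keeps more or less the whole room acceptably sharp, while the full-frame camera isolates the subject – which is exactly why Apple simulates the effect in software rather than relying on the lens.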

Cinematic Video for amateur filmmaking – Conclusions

The TL;DR is that it’s simply not yet good enough.

From a technical/gadget perspective, I remain blown away by what this feature can do. It’s incredible that it can do this kind of image processing in real time at 30fps. The fact that all the focus changes are editable also means you can correct errors and make artistic changes later, which reduces the pressure when filming.

However, the hard truth is that amateur filmmakers are not going to be using this just yet. It’s clear that Apple’s incredibly impressive-looking footage was shot in uber-controlled conditions, and that the company selected only clips in which the feature performed flawlessly. I suspect there was a huge amount of trial-and-error involved, and that there was a ton of flawed footage left on the virtual editing room floor.

Even though I gave the phone a simple background with good contrast, it still got confused. Watch Anton’s nose in particular in the video! You can also see a number of other areas in which it struggled to work out what should be in and out of focus, like Laetitia’s feet and Anton’s phone.

Apple really should have called this a beta. I think it didn’t because it wanted a headline feature for the iPhone 13, and this is it. But it is, in truth, an early beta.

It tries very hard. It’s technically impressive. But it’s just nowhere near good enough for real-life use.

However… the fact that it’s a beta version by any other name is also good news. I can still remember how bad the first version of Portrait Mode was, and how comparatively good it is now. We’ll of course see similar progress with Cinematic Video.

We will, one day, see amateur filmmakers shooting video with selective focus on their iPhone <Guess the number here>. Perhaps professional ones too, using it as a B or C camera. Just… not for a while yet.

Actors: Anton Tweedale and Laetitia Coulson. Screenplay: Ben Lovejoy.
Stills: CK Golding. With thanks to Jeff Benjamin for FCP assistance.




Author: Ben Lovejoy
Source: 9TO5Google

