
Why Raw video might not be the game-changer you expect

External recorders such as the Atomos Ninja V shown here can encode a Raw video stream into formats such as Apple’s ProRes RAW.

Raw video has become something of a talking point as more and more cameras have provided a way of capturing or outputting Raw video data streams. But, for a couple of reasons, Raw video doesn’t offer the same benefits that photographers are used to enjoying when they shoot stills.

Raw video formats

There are multiple types of Raw available and the experience of using each will differ somewhat, but the underlying benefits are broadly similar. RED and ARRI use their own Raw formats, currently only offered on their own cameras. But for non-pro work, it’s much more likely that you’ll encounter CinemaDNG, ProRes RAW or Blackmagic RAW.

Adobe’s CinemaDNG is an open system based on its stills DNG format, offered natively on the Sigma fp and some external recorders. Atomos has licensed Apple’s ProRes RAW format and Blackmagic Design has created its own Blackmagic RAW format, both of which allow their external recorders to encode the Raw output streams from a variety of popular cameras.

So, before you add it to your must-have list of features for your next camera (or your list of camera features for brand-loyal point-scoring arguments in the comments), let’s look at why you might not want to dive into Raw shooting just yet.

We’re going to focus on ProRes RAW in this article, since its adoption by Atomos means it’s the Raw video format available on the largest number of camera models at present. But many of the points are applicable to other formats, to a greater or lesser degree.

1) There’s a lot of extra work to be done

The first time you look at a Raw video file, you’re likely to notice just how much processing your camera usually does when creating video. For a start, the video is likely to need some re-sizing: both the Panasonic S-series cameras and Sony’s a7S III output roughly 16:9 regions taken from the full width of their sensors, meaning 5.9K or 4.2K, respectively. The Nikon Z6 instead sub-samples its sensor in order to output an unprocessed UHD signal. Unfortunately, this pixel skipping reduces resolution and introduces moiré, compared with the in-camera 5.9K-derived footage. It also results in noisier footage, because it’s not using all the available pixels.
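To get a feel for why skipping pixels costs you noise as well as resolution, compare decimation with averaging on the same data. This is a rough sketch in Python, with made-up numbers standing in for sensor output, not a model of any particular camera’s readout:

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in 'sensor': a flat grey scene (0.18) plus random noise.
    sensor = 0.18 + rng.normal(0.0, 0.02, size=(4000, 6000))

    # Pixel-skipped readout: keep one sample in four; the noise is unchanged.
    skipped = sensor[::2, ::2]

    # Downsampling by averaging 2x2 blocks: four samples contribute to each
    # output pixel, so the noise roughly halves.
    binned = sensor.reshape(2000, 2, 3000, 2).mean(axis=(1, 3))

    print(f"noise after skipping: {skipped.std():.4f}")   # ~0.020
    print(f"noise after binning:  {binned.std():.4f}")    # ~0.010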

Once the footage is downsized, there’s still a lot of work to be done. You’ll need to find a sharpening and noise reduction strategy that suits your camera’s output, which is likely to require a fair degree of experimentation with different settings and plugins if you want results that look as clean or as sharp as the camera’s own compressed footage. Then there’s the need to reduce and mask any moiré in the footage: something your camera is likely to be doing pretty well without you realizing it.
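If you want a sense of what that experimentation involves, the sketch below shows a bare-bones starting point on a single exported frame, using OpenCV’s non-local-means denoiser followed by an unsharp mask. The file name and the strength values are placeholders you would have to tune to your camera’s output, and a real workflow would do this inside your NLE or grading tool rather than frame by frame:

    import cv2

    frame = cv2.imread("frame_0001.png")   # one exported frame (8-bit BGR); placeholder name

    # Noise reduction: non-local means, with separate luma/chroma strengths.
    denoised = cv2.fastNlMeansDenoisingColored(frame, None, 5, 5, 7, 21)

    # Sharpening: a simple unsharp mask built from a Gaussian-blurred copy.
    blurred = cv2.GaussianBlur(denoised, (0, 0), 1.5)
    sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)

    cv2.imwrite("frame_0001_processed.png", sharpened)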

2) The software support isn’t especially polished

Beyond gaining finer control over the processing applied to your footage, perhaps the two strongest reasons for wanting to shoot Raw are that it should give you greater flexibility over tonal adjustments (ability to brighten footage, adjust curves and exploit dynamic range) and the ability to make big white balance adjustments without detrimental impact on the footage.

Even when they’re available, the Raw adjustment tools in Final Cut Pro are limited and duplicate the function (though not the effect) of existing tools elsewhere in the interface.

In the case of ProRes RAW, the tools available to make these adjustments aren’t always particularly polished, despite Apple controlling both the format and Final Cut Pro. Not all cameras appear to communicate enough information in their data streams, meaning the Raw adjustment controls for white balance and brightness (‘ISO’) are not available for some cameras.

For instance, in Final Cut Pro these options, when available, are simple pull-downs in the ‘Info’ tab, rather than full tools in the video or color inspector tabs. And, because they’ve been added later, when they are available their functions overlap with the existing brightness and white balance tools designed for use on gamma-encoded footage. The result is a rather inelegant duplication of options, which have different ways of working and differing effectiveness.

Raw output options are becoming increasingly common. This image shows the Sony a7S III, which can output 4.2K Raw at up to 60p.

Our best experiences so far have been with the likes of RED’s Raw format, which can be adjusted using tools designed to match the camera.

3) There’s less potential for improvement

The other important thing to recognize is that Raw video won’t give as much of an improvement over gamma-encoded, compressed footage as you’d experience when comparing JPEG files to Raw stills.

As well as photography having more powerful adjustment software available, the difference between the two file types used for stills is greater. For photos, the choice is typically between 8-bit JPEGs and 12 or 14-bit Raw files, and those JPEGs are designed as final images. As such, they’ve typically had very aggressive S-shaped tone curves applied to them, to give a punchy final result. Looking at a typical JPEG tone curve, we found that around 174 (of 256) data values were committed to the four-stop range centered on middle grey. With just 32% of the data values left for the other five stops of shadows and highlights, it should come as no surprise that there’s not much scope for adjusting the highlights or exploiting additional dynamic range from the shadows.

This JPEG tone curve, taken from a camera’s standard color mode, devotes around 68% of its values to the two stops on either side of middle grey. As you can imagine, trying to make significant adjustments to the highlights or shadows, squeezed into the remaining values, doesn’t work very well.
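To put a rough number on that effect without relying on any camera’s unpublished curve, the sketch below uses the standard sRGB transfer function as a stand-in. Even that comparatively gentle curve commits around 160 of the 256 codes to the four stops around middle grey; the punchier S-curve we measured concentrates them further still:

    def srgb_encode(x: float) -> float:
        """Linear scene value (0-1) -> sRGB-encoded value (0-1)."""
        if x <= 0.0031308:
            return 12.92 * x
        return 1.055 * x ** (1 / 2.4) - 0.055

    mid_grey = 0.18
    lo = srgb_encode(mid_grey / 4)   # two stops below middle grey
    hi = srgb_encode(mid_grey * 4)   # two stops above middle grey

    codes_in_range = round(hi * 255) - round(lo * 255)
    print(f"{codes_in_range} of 256 codes ({codes_in_range / 256:.0%}) cover "
          "the four stops around middle grey")   # roughly 161 of 256, ~63%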

But in video the next-best thing to Raw isn’t an 8-bit file with priority given to mid-tones; it’s more likely to be a 10-bit file in which roughly equal importance is given to every tone in the footage, specifically with the intention of preserving the ability to make adjustments. Apple doesn’t publish enough information about the ProRes RAW format for us to be sure what happens when the 16, 14 or 12-bit linear Raw readout from the camera is encoded into the 12 bits that ProRes RAW uses.

A Log curve (Sony’s S-Log3 in this instance) distributes the available values much more equally. This arrangement and the move from 8 to 10-bit precision maintains a great deal more editing flexibility than the JPEG represented above. Consequently, there’s less of a gain to be had by moving to Raw.

This lack of disclosure makes it difficult to make exact comparisons, but the difference, in terms of how much of the original information is retained between a Log-encoded 10-bit signal and a (presumably) quasi-linear 12-bit capture, is going to be much, much smaller than the difference between JPEGs and Raw stills. This is also why Raw video, with a single 12-bit value for every pixel, isn’t unbearably large by comparison with gamma-encoded footage (which has to retain up to three 10-bit values per pixel, depending on the chroma sub-sampling and group-of-pictures compression scheme being used).
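As a rough illustration of why that gap is small, consider how many distinct code values each scheme can spend on each stop of scene brightness. The comparison below is idealised (real encodings, ProRes RAW included, are more complicated), but it shows the shape of the problem: a 12-bit linear file halves its budget with each stop you descend, while a 10-bit log file spends its codes evenly, so the two end up surprisingly close where it matters:

    STOPS = 12   # assumed scene range, purely for illustration

    # 12-bit linear: each stop down from clipping gets half as many codes.
    linear_codes = [4096 // 2 ** (s + 1) for s in range(STOPS)]

    # Idealised 10-bit log: codes spread evenly across the stops.
    log_codes = [1024 // STOPS] * STOPS

    print("stops below clip | 12-bit linear | 10-bit log")
    for s in range(STOPS):
        print(f"{s + 1:>16} | {linear_codes[s]:>13} | {log_codes[s]:>10}")
    # The linear file lavishes thousands of codes on the brightest stops,
    # but from about six stops down the log file holds more distinct levels.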

Changing ‘ISO’

RED cameras don’t change in-camera amplification when you change the ISO setting; they just brighten the footage to match your choice of rendering in post. Similarly, some Sony pro video cameras, such as the FX6 and FX9, include a ‘Cine EI’ (EI = Exposure Index) mode, which locks the cameras to one of the two gain steps of their dual-gain sensors, again with the expectation that the output can be brightened later, as needed. In principle this should be possible with any ProRes RAW footage, but the processing software would need to know how much gain had already been applied to the footage.
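In practice that later brightening is just a multiplication of the linear values before any tone curve is applied, which is why the software has to know how much gain is already baked in. A minimal sketch of the idea, assuming the footage really is linear and that the shot ISO is known:

    import numpy as np

    def apply_exposure_index(linear_frame: np.ndarray, shot_iso: int, target_iso: int) -> np.ndarray:
        """Brighten linear Raw data in post by the ratio of target to shot ISO.

        Assumes 'shot_iso' reflects the only amplification applied in camera --
        exactly the information that isn't always communicated in the Raw stream.
        """
        gain = target_iso / shot_iso              # e.g. 3200 / 800 = 4x, i.e. +2 stops
        return np.clip(linear_frame * gain, 0.0, 1.0)

    # Example: footage shot at a base ISO of 800, rendered as if at ISO 3200.
    frame = np.random.rand(2160, 3840, 3).astype(np.float32)   # stand-in linear frame
    brightened = apply_exposure_index(frame, shot_iso=800, target_iso=3200)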

We suspect the lack of communication about which ISO mode has been used (especially in Auto ISO mode, where in-camera amplification could change within a single clip) is part of the reason the ‘ISO’ adjustment option isn’t available for all cameras. In principle, once implemented, it would be possible to use the likes of the Z6 II, S1H or a7S III as an ‘EI’ camera, always shooting at a base ISO setting to allow the greatest possible adjustment later.

Importantly, this means ProRes RAW should still give you lots of flexibility in terms of white balance (so long as the color channels haven’t been rendered and ‘baked’ relative to one another), but you won’t gain the leaps in dynamic range and tonal adjustment that you get when processing Raw stills. In fact, your dynamic range may fall if the Raw comes from noisier, sub-sampled output.

Other ‘Raw’ video formats appear to demosaic (resolve the color for each pixel) and chroma sub-sample before recording the file, which would reduce some of this white balance flexibility, too.
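The reason that matters: on genuinely Raw data, a white balance change is just a per-channel gain applied while the red, green and blue samples are still separate. A minimal sketch of that idea, assuming an RGGB Bayer layout and using made-up gain values:

    import numpy as np

    def white_balance_bayer(raw: np.ndarray, r_gain: float, b_gain: float) -> np.ndarray:
        """Apply white balance gains to an RGGB Bayer mosaic before demosaicing.

        Each photosite still holds a single, pure colour sample, so the gains can
        be changed freely in post. Once footage has been demosaiced and chroma
        sub-sampled, the channels are blended and big corrections degrade it.
        """
        balanced = raw.astype(np.float32).copy()
        balanced[0::2, 0::2] *= r_gain   # red photosites
        balanced[1::2, 1::2] *= b_gain   # blue photosites
        return balanced                  # green photosites left at unity gain

    mosaic = np.random.rand(2160, 3840).astype(np.float32)   # stand-in Raw frame
    warmer = white_balance_bayer(mosaic, r_gain=1.8, b_gain=1.4)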

In summary

Raw video is clearly going to become more common. With cameras starting to include internal capture, and manufacturers recognizing the benefits of working with companies such as Atomos and Blackmagic Design to offer external capture, it’s only likely to become more accessible.

In turn, the usefulness of Raw video is likely to increase as adoption increases and software makers get better at providing and integrating the tools needed to make the most of it.

And working from Raw still gives you some degree of additional control over:

  • Image processing and noise reduction
  • Brightness, tonal distribution and dynamic range
  • White balance adjustments

However, it’s not going to be as dramatic a difference as you see with stills. And, given that there’s already a well-established approach for capturing high-DR, flexible footage and most editing software already provides the tools to make use of it, for now you may find a camera with well-processed 10-bit Log capture is a more practical way to shoot gradable footage.


Author:
Richard Butler
Source: Dpreview
