
Apple explains how it created the iPhone 13’s Cinematic Mode

One of the biggest additions to the iPhone 13 lineup is the new Cinematic Mode. Reviewers have mixed feelings about whether it’s a gimmick or a revolution in mobile cameras, but all of them have been impressed by this first-generation feature.

Now, Apple VP Kaiann Drance and Human Interface Team designer Johnnie Manzari explain how they built Cinematic Mode for the iPhone 13, where the idea came from, and why the A15 Bionic is a fundamental part of the process.

In an interview with TechCrunch‘s Matthew Panzarino, Drance and Manzari explain that Cinematic Mode leans heavily on the A15 Bionic and its Neural Engine, and that the team wanted the footage encoded in Dolby Vision HDR without sacrificing the live preview.

“We knew that bringing a high-quality depth of field to video would be magnitudes more challenging [than Portrait Mode],” says Drance. “Unlike photos, video is designed to move as the person filming, including hand shake. And that meant we would need even higher quality depth data so Cinematic Mode could work across subjects, people, pets, and objects, and we needed that depth data continuously to keep up with every frame. Rendering these autofocus changes in real time is a heavy computational workload.”
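To make that quote concrete, here is a purely illustrative sketch of the two ideas in it: a rack focus is a smooth move of the focus plane from one subject’s depth to another’s, and per-frame depth data is what decides how much synthetic blur each pixel should get relative to that plane. None of this is Apple’s code; the function names and numbers are hypothetical.

```swift
import Foundation

// Ease the focus distance from one subject's depth to another over `duration`
// seconds, so the focus pull looks organic instead of snapping instantly.
// Hypothetical helper, not Apple's implementation.
func focusDistance(at time: Double,
                   from startDepth: Float,
                   to endDepth: Float,
                   duration: Double) -> Float {
    let t = Float(min(max(time / duration, 0), 1))
    let eased = t * t * (3 - 2 * t)               // smoothstep easing
    return startDepth + (endDepth - startDepth) * eased
}

// For a single pixel in a single frame: the further its depth is from the
// current focus plane, the stronger the synthetic blur it receives.
func blurAmount(pixelDepth: Float, focusDepth: Float, maxBlur: Float = 1.0) -> Float {
    min(abs(pixelDepth - focusDepth) / max(focusDepth, 0.001), 1) * maxBlur
}

// Example: half a second into a one-second focus pull from a subject 1.5 m away
// to one 4 m away, with a background pixel at 8 m.
let focus = focusDistance(at: 0.5, from: 1.5, to: 4.0, duration: 1.0)
let backgroundBlur = blurAmount(pixelDepth: 8.0, focusDepth: focus)
```

Doing this convincingly for every pixel of every frame, while the phone and the subjects are moving, is what makes the workload Drance describes so heavy.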

Manzari explains that Cinematic Mode didn’t start as a feature idea; at Apple, his design team typically works the other way around:

“We didn’t have an idea [for Cinematic Mode]. We were just curious — what is it about filmmaking that’s been timeless? And that kind of leads down this interesting road and then we started to learn more and talk more… with people across the company that can help us solve these problems.”

“When you look at the design process,” says Manzari, “we begin with a deep reverence and respect for image and filmmaking through history. We’re fascinated with questions like what principles of image and filmmaking are timeless? What craft has endured culturally and why?”

For example, when Apple started developing the Portrait Lighting feature, which became available starting with the iPhone X, its design team studied classic portrait artists, in many cases visiting the original works and breaking down their characteristics in the lab. The team took a similar approach while developing Cinematic Mode, talking at length with some of the best cinematographers and camera operators in the world.

“In doing this, certain trends emerge,” says Manzari. “It was obvious that focus and focus changes were fundamental storytelling tools, and that we as a cross-functional team needed to understand precisely how and when they were used.”

One of the main goals of Cinematic Mode is to take something only skilled professionals could do and make it simple for everyone. Manzari explains:

“We feel like this is the kind of thing that Apple tackles the best. To take something difficult and conventionally hard to learn, and then turn it into something automatic and simple.”

All iPhone 13 models

In the story, TechCrunch also explains what Cinematic Mode is and how it works. This technology is “a bundle of functions that exist in a new section of the camera app,” utilizing the CPU, GPU, and Apple’s Neural Engine for machine learning work, accelerometers for tracking and motion, and the upgraded wide-angle lens and stabilized sensor. It uses:

  • Subject recognition and tracking
  • Focus locking
  • Rack focusing (moving focus from one subject to another in an organic-looking way)
  • Image overscan and in-camera stabilization
  • Synthetic bokeh (lens blur); a simplified sketch of this step follows the list
  • A post-shot editing mode that lets you alter your focus points even after shooting
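As a rough illustration of the synthetic bokeh step, the sketch below uses Core Image’s CIMaskedVariableBlur filter to blur a frame everywhere except the in-focus subject. It is not Apple’s Cinematic Mode pipeline, which runs per frame on the Neural Engine; the `focusMask` is assumed to come from depth data produced elsewhere (white where the frame should blur, black where it should stay sharp).

```swift
import CoreImage

// Minimal sketch of a "synthetic bokeh" pass: blur the frame everywhere the
// mask is bright, leave it sharp where the mask is dark. The mask is assumed
// to be derived from per-frame depth data; this is not Apple's implementation.
func applySyntheticBokeh(to frame: CIImage,
                         focusMask: CIImage,
                         maxBlurRadius: Double = 12) -> CIImage? {
    guard let filter = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
    filter.setValue(frame, forKey: kCIInputImageKey)
    filter.setValue(focusMask, forKey: "inputMask")
    filter.setValue(maxBlurRadius, forKey: kCIInputRadiusKey)
    return filter.outputImage
}
```

In a rack focus, that mask would change from frame to frame as the focus plane moves toward the new subject, which is why the continuous depth data Drance describes matters so much.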

To learn more about Cinematic Mode and see the tests Matthew Panzarino conducted at Disneyland, click here.



Author: José Adorno
Source: 9to5Mac
