
Deep Fusion is the iPhone’s take on AI photography

In announcing the iPhone 11 Pro, Phil Schiller tipped us off to a new feature coming to the flagship smartphones later this year. Deep Fusion is a system Schiller describes as "computational photography mad science," and it's likely to be the company's answer, more or less, to Google's Night Sight.

As Schiller explained, when you're about to take a photo with the new iPhone 11 Pro, the camera snaps eight images before you even press the shutter. When you do press it, the phone captures one long exposure, then stitches everything together "pixel-by-pixel" into a single image with lots of detail and very little noise.
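Apple hasn't detailed how Deep Fusion's per-pixel merge actually works, but the general idea behind multi-frame fusion is straightforward: average several noisy short exposures to suppress sensor noise, then blend the result with a cleaner long exposure, favoring the sharp frames where there's fine detail. The sketch below is a hypothetical illustration in Python/NumPy; the frame counts, the box-blur detail estimate, and the `fuse_frames` helper are my own simplifications, not Apple's pipeline.

```python
import numpy as np

def fuse_frames(short_frames, long_frame, detail_weight=0.5):
    """Toy multi-frame fusion: blend short exposures with one long exposure.

    short_frames: list of HxW arrays captured before the shutter press (noisy, sharp)
    long_frame:   HxW array from the long exposure (cleaner, but smoother)
    Illustrative only -- not Apple's Deep Fusion algorithm.
    """
    stack = np.stack(short_frames).astype(np.float32)   # (N, H, W)
    long_frame = long_frame.astype(np.float32)

    # Averaging the short exposures suppresses random sensor noise
    # (noise drops roughly with the square root of the frame count).
    short_avg = stack.mean(axis=0)

    # Estimate local detail as deviation from a 3x3 box blur; weight the
    # sharp-but-noisy average more where there is fine texture, and the
    # cleaner long exposure more in smooth regions.
    padded = np.pad(short_avg, 1, mode="edge")
    local_mean = sum(
        padded[i:i + short_avg.shape[0], j:j + short_avg.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    detail = np.abs(short_avg - local_mean)
    weight = np.clip(detail / (detail.max() + 1e-6), 0.0, 1.0) * detail_weight

    fused = weight * short_avg + (1.0 - weight) * long_frame
    return np.clip(fused, 0.0, 255.0).astype(np.uint8)

# Synthetic example: eight noisy "pre-shutter" frames plus one long exposure.
rng = np.random.default_rng(0)
scene = np.tile(np.linspace(0, 255, 640, dtype=np.float32), (480, 1))
shorts = [scene + rng.normal(0, 25, scene.shape) for _ in range(8)]
long_exp = scene + rng.normal(0, 5, scene.shape)
result = fuse_frames(shorts, long_exp)
```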

It’s not specifically designed for shooting in the dark, but it’s clear that Apple is parking its tanks on Google’s lawn. Night Sight has been one of the strengths of the last few Pixel phones, using machine learning to create well-lit images in dark environments.

Schiller didn't say exactly when we can expect Deep Fusion, only that it'll arrive on devices in the coming months. Given the looming announcement of the Pixel 4, it'll be very interesting to see if Google already has the next generation of its own technology ready to return fire.



Author: Daniel Cooper
Source: Engadget
Tags: Apple, cameras, Deep Fusion, gear, iPhone 11, iPhone 11 Pro, iphone2019, Pro


