AI & Robotics News

Deep Fusion is the iPhone’s take on AI photography

In announcing the iPhone 11 Pro, Phil Schiller tipped us off to a new feature that'll come to the flagship smartphones later this year. Deep Fusion is a system that Schiller describes as "computational photography mad science," and it's likely to be the company's answer, more or less, to Google's Night Sight.

As Schiller explained, when you're about to take a photo with the new iPhone 11 Pro, the camera snaps eight images before you even press the shutter. When you do, it captures one long exposure, then stitches a new image together "pixel-by-pixel" to produce a shot with lots of detail and very little noise.
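Apple hasn't published how Deep Fusion actually works, but the general idea it describes, blending a noise-reduced burst with a long exposure on a per-pixel basis, can be sketched in a few lines. The snippet below is a hypothetical, heavily simplified illustration: the fuse_burst function, the box-blur detail heuristic, and the grayscale frames are all assumptions made for clarity, not Apple's pipeline, which runs on the Neural Engine with machine-learned weighting.

```python
import numpy as np

def fuse_burst(short_frames, long_exposure, detail_kernel=3):
    """Toy multi-frame fusion: blend a noise-reduced burst average with a
    long exposure, favoring whichever source shows more local detail.
    Frames are float grayscale arrays in [0, 1], assumed already aligned."""
    # Averaging the short frames suppresses random sensor noise.
    burst_avg = np.mean(short_frames, axis=0)

    def local_detail(img):
        # Estimate local detail as the absolute deviation from a box blur.
        pad = detail_kernel // 2
        padded = np.pad(img, pad, mode="edge")
        blurred = np.zeros_like(img)
        for dy in range(detail_kernel):
            for dx in range(detail_kernel):
                blurred += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        blurred /= detail_kernel ** 2
        return np.abs(img - blurred)

    d_burst = local_detail(burst_avg)
    d_long = local_detail(long_exposure)

    # Per-pixel weight: lean on the burst where it is sharper, and on the
    # long exposure where it captured more structure (e.g., in shadows).
    weight = d_burst / (d_burst + d_long + 1e-8)
    return weight * burst_avg + (1.0 - weight) * long_exposure


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.random((64, 64))
    shorts = np.stack([np.clip(scene + rng.normal(0, 0.05, scene.shape), 0, 1)
                       for _ in range(8)])
    long_exp = np.clip(scene + rng.normal(0, 0.01, scene.shape), 0, 1)
    fused = fuse_burst(shorts, long_exp)
    print(fused.shape, float(fused.min()), float(fused.max()))
```

A real pipeline would also have to register the frames, handle subject motion, and work on full-color sensor data; this sketch skips all of that to show only the per-pixel blending idea.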

It’s not specifically designed for shooting in the dark, but it’s clear that Apple is parking its tanks on Google’s lawn. Night Sight has been one of the strengths of the last few Pixel phones, using machine learning to create well-lit images in dark environments.

Schiller didn’t say exactly when we can expect Deep Fusion, only that it’ll arrive on devices in the coming months. Given the looming announcement of the Pixel 4, it’ll be very interesting to see if Google already has the next generation of its own technology ready to return fire.

Follow all the latest news from Apple’s 2019 iPhone event here!


Author: Daniel Cooper
Source: Engadget
Tags: Apple, cameras, Deep Fusion, gear, iPhone 11, iPhone 11 Pro, iphone2019, Pro


