Deep Fusion is the iPhone’s take on AI photography

In announcing the iPhone 11 Pro, Phil Schiller tipped us off to a new feature that'll come to the flagship smartphones later this year. Deep Fusion is a system that Schiller describes as "computational photography mad science," and it looks like the company's answer, more or less, to Google's Night Sight.

As Schiller explained, when you're about to take an image with the new iPhone 11 Pro, the camera will snap eight images before you press the shutter. When you do, it takes one long exposure and then stitches a new image together, "pixel-by-pixel," to create one with lots of detail and very little noise.
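
Apple hasn't published how Deep Fusion actually works under the hood, but the description above is recognizably multi-frame fusion: several short, noisy exposures plus one long, cleaner exposure combined with per-pixel weights. The sketch below is purely illustrative and entirely hypothetical (the function names, the gradient-based "detail" weighting, and the parameters are mine, not Apple's), and it omits frame alignment and the machine-learning parts of the real pipeline.

```python
# Hypothetical sketch of generic multi-frame fusion -- NOT Apple's Deep Fusion.
# Idea: short exposures preserve detail but are noisy; the long exposure is
# cleaner but can blur motion. Blend them pixel-by-pixel, favoring the short
# frames where there is fine detail and the long exposure in smooth regions.
import numpy as np

def fuse_frames(short_frames, long_frame, detail_sigma=1.0):
    """Blend pre-aligned grayscale frames pixel-by-pixel (illustrative only).

    short_frames: list of HxW float arrays captured before the shutter press
    long_frame:   HxW float array, the single long exposure
    """
    # Average the short exposures to suppress random noise.
    short_avg = np.mean(np.stack(short_frames), axis=0)

    # Crude local-detail estimate: gradient magnitude of the averaged short frames.
    gy, gx = np.gradient(short_avg)
    detail = np.sqrt(gx ** 2 + gy ** 2)

    # Weight toward the short frames at edges/texture, toward the long exposure
    # in flat regions where its lower noise helps most.
    w = detail / (detail + detail_sigma)
    return w * short_avg + (1.0 - w) * long_frame

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))            # synthetic test scene
    shorts = [scene + rng.normal(0, 0.05, scene.shape) for _ in range(8)]
    long_exp = scene + rng.normal(0, 0.01, scene.shape)            # cleaner stand-in frame
    fused = fuse_frames(shorts, long_exp)
    print(fused.shape)
```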

It’s not specifically designed for shooting in the dark, but it’s clear that Apple is parking its tanks on Google’s lawn. Night Sight has been one of the strengths of the last few Pixel phones, using machine learning to create well-lit images in dark environments.

Schiller didn't say exactly when we could expect Deep Fusion, only that it'll arrive on devices in the coming months. Given the looming announcement of the Pixel 4, it'll be very interesting to see if Google already has the next generation of its own technology ready to return fire.



Author: Daniel Cooper
Source: Engadget
Tags: Apple, cameras, Deep Fusion, gear, iPhone 11, iPhone 11 Pro, iphone2019, Pro

