During Apple's keynote last month, where it announced the iPhone 11 lineup, the company also unveiled new camera technology called Deep Fusion, which captures four frames before you hit the shutter, four more once you do, and one long-exposure shot. The 8-core Neural Engine then selects the best frames and merges them into a high-quality HDR photo.
The resulting images are highly detailed, sharper, and more natural-looking. The machine-learning side of the Neural Engine analyzes the scene as the photo is taken and processes it differently depending on whether it detects sky, foliage, or skin tones, while structure and color tones are based on ratios computed by the Neural Engine on the A13 chip.
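Apple has not exposed Deep Fusion through any public API, so the snippet below is only a rough conceptual sketch of multi-frame fusion, not Apple's actual algorithm. The frame counts, the sharpness-based weighting, and the fuseFrames helper are all illustrative assumptions meant to show the general idea of combining several exposures into one sharper image.

```swift
// Conceptual sketch only: a naive multi-frame fusion over grayscale buffers.
// Deep Fusion itself runs on the Neural Engine and is not publicly documented;
// the frame counts, weights, and helper names here are illustrative assumptions.

// One captured frame: a 2D luminance buffer plus a sharpness score
// (a stand-in for whatever quality metric the real pipeline might use).
struct Frame {
    let pixels: [[Double]]   // luminance values in 0...1
    let sharpness: Double    // higher = more detail, used as a fusion weight
}

// Fuse frames with a per-pixel weighted average, favoring sharper frames.
func fuseFrames(_ frames: [Frame]) -> [[Double]] {
    guard let first = frames.first else { return [] }
    let height = first.pixels.count
    let width = first.pixels[0].count
    let totalWeight = frames.reduce(0.0) { $0 + $1.sharpness }

    var fused = Array(repeating: Array(repeating: 0.0, count: width), count: height)
    for frame in frames {
        let weight = frame.sharpness / totalWeight
        for y in 0..<height {
            for x in 0..<width {
                fused[y][x] += frame.pixels[y][x] * weight
            }
        }
    }
    return fused
}

// Toy usage: eight short frames plus one "long exposure" (here just a brighter buffer).
let shortFrames = (0..<8).map { i in
    Frame(pixels: [[0.4, 0.5], [0.6, 0.7]], sharpness: Double(i % 4) + 1.0)
}
let longExposure = Frame(pixels: [[0.5, 0.6], [0.7, 0.8]], sharpness: 2.0)
let result = fuseFrames(shortFrames + [longExposure])
print(result)  // fused 2x2 luminance buffer
```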
Deep Fusion is only compatible with the latest iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max.
The feature will arrive with iOS 13.2, and those with access to the iOS developer beta can begin testing Deep Fusion now; those on the public beta will be able to try it soon. We are intrigued to test the feature and see what kind of images the iPhone 11 trio can produce with Deep Fusion.
Author: Ricky
Source: GSMArena