Hands-on: Adobe Photoshop Camera uses AI to redefine mobile photo editing

It’s nine in the morning on a rainy December 23, and my iPhone has just photographed a supermoon from my front door — no easy feat, since the real moon is a distant and barely visible waning crescent today, and I’m shooting in pure daylight. Even by the standards of the latest and greatest smartphone camera technology, my device has captured an impossible image, yet when I look at its screen, I see exactly what you’re seeing above, only moving live.

That’s the promise of Adobe Photoshop Camera, which was announced at Adobe Max in November alongside so many other Adobe apps that you might have missed it. In a nutshell, Photoshop Camera is Adobe’s play for mobile users who shoot and edit photos with Instagram and Snapchat, cutting Adobe’s photo apps out of the loop. Currently in beta, the free app uses AI to identify what the smartphone camera is seeing in real time, then applies Photoshop effects on the fly, without the need to open a separate app or wait for post-processing.

Adobe said at Max that Photoshop Camera uses its Sensei AI platform, though it’s worth noting that the AI recognition features don’t require a live internet connection. Once a “lens” — potentially combining a color filter with animations and specific object-recognition abilities — has been downloaded to your iOS or Android device, its effects can be applied offline. And with over 20 initial lenses to choose from, including filters that would otherwise have required Photoshop or similar tools, users will have plenty to play with when the app becomes widely available.

In beta form, Photoshop Camera includes lenses such as Dreamcatcher, which applies a dreamy animated background and static foreground to whatever the front or rear camera sees, while adjusting the real image’s colors to match the other elements. Multiple comic book filters live inside Pop Art, which instantly applies a cartoony, Take On Me-style filter to faces while replacing the background and color-shifting people. A lens developed with musician Billie Eilish gives people angel or demon wings, complete with heaven or hell backdrops.

Conceptually, all of these effects might sound simple — and like things Photoshop has enabled creatives to churn out for years. The real trick in Photoshop Camera is completely eliminating the need for a human creative to do post-processing, and making the process so fast and automated that it happens multiple times per second. A machine learning-trained AI process is performing live image segmentation, screening one or more people out from their backgrounds and foregrounds, then differentially applying new background art and objects that track with their movements.
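To make the general technique concrete, here is a minimal, illustrative sketch of person segmentation followed by background replacement, built on an off-the-shelf PyTorch model. It is not Adobe’s Sensei pipeline; the model choice, class index, and helper function are assumptions made purely for demonstration.

```python
# Illustrative sketch only: NOT Adobe's Sensei pipeline, just a generic
# person-segmentation + background-compositing example with an open model.
import numpy as np
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

# Pretrained semantic segmentation model; class 15 corresponds to "person".
model = deeplabv3_resnet50(weights="DEFAULT").eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def composite_person(frame_path: str, background_path: str) -> Image.Image:
    """Cut the person out of the frame and paste them over a new background."""
    frame = Image.open(frame_path).convert("RGB")
    background = Image.open(background_path).convert("RGB").resize(frame.size)

    with torch.no_grad():
        out = model(preprocess(frame).unsqueeze(0))["out"][0]
    person_mask = (out.argmax(0) == 15).cpu().numpy()  # boolean (H, W) mask

    frame_np = np.array(frame)
    bg_np = np.array(background)
    result = np.where(person_mask[..., None], frame_np, bg_np)
    return Image.fromarray(result.astype(np.uint8))
```

A production app would run a much lighter model many times per second and refine the mask edges, but the basic pipeline (segment, mask, composite) is the same.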

As you can see in the rightmost image above, Adobe’s AI is correctly screening my daughter’s body out from both the bench she’s sitting on and the table in front of her, applying the blue-green color filter only to the person. Or at least, mostly. As you watch Photoshop Camera working live on a device, you can see it making split-second changes to the perceived edges of hair, clothing, and other objects. Those edges have proved tough for even 3D depth-sensing cameras to manage, so it’s no surprise to see a software-based solution struggle a little at the seams, too.

Some of Photoshop Camera’s other lenses are more subtle. Like a similar software trick pioneered by Google, Portrait (above) uses the same image segmentation technology to artificially soften backgrounds without the need for a second or depth-sensing camera, while Studio Light mimics the Portrait Lighting feature Apple introduced in 2017. Another lens, Food, lets you choose from seven different automatic adjustments designed to improve the immediate results of food photography. In limited testing at a restaurant, I didn’t find Food’s results to be noticeably better than what my iPhone 11 Pro achieves on its own, but your mileage may vary.
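For reference, here is a minimal sketch of the general “portrait mode” idea: once a person mask exists, the background can be blurred and recomposited in a few lines. This is not Adobe’s or Google’s implementation; the function name and parameters are assumptions, and it reuses a boolean person mask like the one from the segmentation sketch above.

```python
# Illustrative sketch only: a generic software "portrait mode" blur, not
# Adobe's implementation. Assumes a boolean person mask as computed above.
import cv2
import numpy as np

def portrait_blur(frame: np.ndarray, person_mask: np.ndarray,
                  blur_strength: int = 31) -> np.ndarray:
    """Blur everything except the person to mimic a shallow depth of field."""
    blurred = cv2.GaussianBlur(frame, (blur_strength, blur_strength), 0)

    # Feather the mask edges so hair and clothing blend smoothly.
    soft_mask = cv2.GaussianBlur(person_mask.astype(np.float32), (15, 15), 0)
    soft_mask = soft_mask[..., None]  # broadcast over color channels

    out = (soft_mask * frame.astype(np.float32)
           + (1 - soft_mask) * blurred.astype(np.float32))
    return out.astype(np.uint8)
```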

The difference between Photoshop Camera’s results and your raw out-of-phone images may seem bigger if you’re using a less capable device than a top-of-the-line iPhone. You’ll also enjoy using it if you really love applying serious special effects to your images and want something different from Instagram or Snapchat AR lenses, with bigger, cleaner photos as outputs. You’ll always have the ability to edit them further manually, either with other mobile-first tools or using a more powerful version of Photoshop.

In short, Photoshop Camera’s use of AI has serious potential to raise the table stakes for the next generation of smartphone cameras — and bring tomorrow’s low-end cameras closer to the performance of today’s higher-end ones. While the free app is still in beta for iOS and Android, expect it to become widely available through Apple’s and Google’s app stores in the near future.


Author: Jeremy Horwitz
Source: VentureBeat
