
Results From This Blind Smartphone Camera Test Will Surprise You

Marques Brownlee, also known as MKBHD, hosted his third annual blind smartphone camera test, a 17-minute video featuring 20 new smartphones. The phones were grouped into a bracket that hid their names while the public voted for their favorites.

It’s important to note right from the start that this “test” is the furthest thing from scientific.

“This isn’t a scientific test at all,” Brownlee says. “In fact, it’s kind of the opposite of a scientific test.”

The point of this playoff-style bracket isn’t to objectively claim one camera is better than another; rather, it serves as a case study of what people think makes a photo “good.” By the end, after more than 10 million total votes were cast, Brownlee was able to draw some interesting conclusions from the information he gathered.

The concept of the test is simple: Brownlee put together a seeded bracket (seeding was determined by his team and, in the end, honestly did not seem to matter very much) and associated each camera in the test with a letter. Here is the bracket as voters saw it before the first images were posted:

Those who would be voting had no idea which phones were up against each other in each round, which was the point. Brownlee wanted to see how people would vote based purely on the images themselves and nothing else. In each round, every smartphone was placed in the exact same position and photographed the exact same subject under the exact same circumstances. That way, the only differences in how a photo looked came down purely to how each smartphone is programmed to capture an image. Brownlee went so far as to not even tell the camera where to focus, leaving that up to the smartphone as well.

All the images that Brownlee took may look simple and unchallenging, but that was the point. The idea was to create scenes that could easily happen in everyday life while also integrating challenging aspects into each image that may not be immediately noticeable. In one image there might be a wide mix of shadows and highlights to test each camera’s dynamic range, while in another there might be a lot of texture and competing colors to show how each camera handled them.

The photos were posted to both Twitter and Instagram Stories, as both allow for polls. The first round used this photo:

The second round used this image:

The third round used this photo:

And the final round used this photo:

After the polls for all rounds closed, Brownlee revealed which camera each letter corresponded to:

At the beginning of his video, Brownlee makes two important notes: First, each year he has done this test, no camera brand has ever repeated a win. Second, the iPhone has never once made it out of the first round.

As you can see in the finished bracket with the new winner, both those notes remained true after this year’s test:

So, naturally, the next question would be, “Is the Asus Zenfone 7 Pro the best smartphone camera?” The answer is probably, “Well, not necessarily.” As he stated at the beginning, this wasn’t a scientific test, and the results weren’t necessarily intended to crown one device the best camera.

What it did do was indicate what the general population of voters finds most attractive about a given image. Brownlee makes several interesting notes about the images he and his team chose to take, and why he thinks the iPhone in particular continues to struggle in this competition.

“White balance has been a major, key new factor from our understanding from the smartphone bracket this year,” he says. “And I would go so far as to say that it looks like this has been the reason that the iPhone has lost in the first round every single time.”

White balance appears to be a huge factor in what makes an image appealing, and Apple’s iPhones seem to lean more toward blue than other cameras on the market. This can make warmer tones (such as Brownlee’s own skin tone, as he points out) appear oddly hued.

Brownlee says that what he and his team deduced is that when content and brightness are the same, people will choose the photo with slightly better color saturation. There is, of course, a limit to this, as oversaturation can make images look terrible, but white balance also has a direct impact on how the eye perceives the saturation of certain tones. The cooler the white balance, the more saturated blues will look; the warmer the white balance, the more saturated warm colors like orange and red will appear.
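To make that interaction concrete, here is a minimal sketch, assuming white balance can be approximated by simple red/blue channel gains; the gain values and sample colors are illustrative assumptions, not measurements from the video.

```python
# A minimal sketch of the white-balance/saturation interaction described above.
# This is an illustration, not Brownlee's methodology: white balance is
# approximated by scaling the red and blue channels, and the gains and
# sample colors are arbitrary assumptions chosen to show the direction
# of the effect.
import colorsys

def shift_white_balance(rgb, red_gain, blue_gain):
    """Warm (red_gain > 1) or cool (blue_gain > 1) an RGB color in the 0..1 range."""
    r, g, b = rgb
    return (min(r * red_gain, 1.0), g, min(b * blue_gain, 1.0))

def saturation(rgb):
    """HSV saturation of an RGB color."""
    return colorsys.rgb_to_hsv(*rgb)[1]

samples = {
    "warm skin-like tone": (0.85, 0.65, 0.55),
    "sky blue": (0.45, 0.65, 0.90),
}

for name, color in samples.items():
    warmed = shift_white_balance(color, red_gain=1.08, blue_gain=0.92)
    cooled = shift_white_balance(color, red_gain=0.92, blue_gain=1.08)
    print(f"{name}: neutral S={saturation(color):.2f}, "
          f"warmed S={saturation(warmed):.2f}, cooled S={saturation(cooled):.2f}")
```

Running this, the warm tone’s saturation rises when the color is warmed and falls when it is cooled, while the blue behaves the opposite way, which is the effect Brownlee describes.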

So while the iPhone 12 Pro Max photo has a lot of detail and sharpness (the iPhone was phone “M”), it had the coolest color balance of the group, and its boosted exposure blew out the sky in the background. That combination led it to once again lose to smartphones that handled color and dynamic range better.

So why do some phones consistently have such a cool tint to their white balance? While his guess is as good as anyone’s, since he doesn’t know for sure, the answer he comes up with may date back to the origins of photography and how even film was designed.

“On photos of people with fairer skin tones, which is most people, it doesn’t affect the skin tone look quite as much,” Brownlee speculates. “You can get away with it. And also, blue skies will look more blue than they would if you were biasing warm.”

Even more interesting (or frustrating, depending on how you look at it), Brownlee discovered that it’s not just what your smartphone captures that matters, but also how the service hosting that photo processes it. In the final photo showdown, the image of the two pumpkins looked different on Twitter than on Instagram, leading to a pretty notable disparity in the voting between the two platforms.
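As a rough illustration of how much platform re-processing can matter, the sketch below simulates two photo hosts by re-saving the same JPEG at different quality settings with Pillow and measuring how far the two copies drift apart. This is a guess at one possible mechanism, not something the video confirms; real services also resize images, strip color profiles, and run their own pipelines, and the filename here is hypothetical.

```python
# Hypothetical illustration: simulate two photo hosts that re-compress the
# same upload differently, then measure how much the results diverge.
# Only one possible source of platform-to-platform differences is modeled.
from PIL import Image, ImageChops  # requires the Pillow package

original = Image.open("pumpkins.jpg").convert("RGB")  # hypothetical source photo

# "Host A" and "Host B" re-encode the upload at different JPEG qualities.
original.save("host_a.jpg", quality=90)
original.save("host_b.jpg", quality=60)

host_a = Image.open("host_a.jpg")
host_b = Image.open("host_b.jpg")

# Per-channel (min, max) pixel differences between the two re-encoded copies.
diff = ImageChops.difference(host_a, host_b)
print("R/G/B difference extrema:", diff.getextrema())
```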

Watching Brownlee’s full conclusion and discussion of this year’s results is definitely worth the time. While crowning a single smartphone the winner is interesting, even more so is the philosophical discussion at the end. How the general public sees images, and which elements of a photo people tend to value, is helpful information for any photographer.

For more from Marques Brownlee, make sure to subscribe to his YouTube channel.


Author: Jaron Schneider
Source: Petapixel
