
Historical footnote to technology of the future: three moments with the Canon EOS R3 that changed my opinion of Eye Control

The EOS R3’s Eye Control system works on a similar principle to the one Canon used in the late ’90s and early 2000s, but it isn’t really trying to select individual, discrete AF points

To me, Eye Control has always been some great ‘what if’ that inevitably gets mentioned when camera geeks of a certain age spend too much time together. I went straight from a manual focus SLR to digital cameras, so I’ve never felt those pangs of nostalgia myself: it just sounded like a failed technology that even Canon seemed happy to forget about.

But in the past months there have been three distinct moments that have made me think that, rather than being an evolutionary dead-end, the Eye Control concept (and maybe technologies similar to it) might come to be seen as a critical feature of cameras in the future.

Recognition of necessity

The first moment came when Canon, in the first of its drip-feeds of R3 specs, announced that it would be reviving the Eye Control idea. My initial thought was ‘well, technology has moved on a long way since the early 2000s, they can probably get it to work better,’ but the more I thought about it, the more I wondered whether something like it has become necessary.

Back in the era of the EOS 5 and EOS-3, Eye Control was an interesting idea, but not really an essential one. The EOS 5 had just five AF points to select from, and even in its most ambitious implementation, the EOS-3’s system could move between 45 autofocus points. The new EOS R3 has 1053.

1998’s EOS-3 had the most sophisticated version of Eye Control at the time, with 45 AF points available. The new R3 has over 23 times as many AF points to select from.

The move to mirrorless has seen AF arrays extend across almost the whole field of view, and the number of selectable AF points stretch into the hundreds and beyond. The joysticks and button presses we’ve become used to simply aren’t up to the job. Even Canon’s own infrared swipeable Smart Controller struggles.

Intuitive to the point of invisibility

The second pang of recognition came when we were shooting football (soccer) with the EOS R3. I’ve shot sports with high-end cameras before, but I wouldn’t consider myself particularly adept at it. I can sometimes set up the camera to do what I want, and I can sometimes focus enough on the action to anticipate what might happen next, but I’ve only had fleeting periods where those two have overlapped.

The Eye Control interface seems initially chaotic and overwhelming: there’s an AF box, which is typically white; grey indicators around any potential subjects the camera has recognized, which will occasionally turn white if they’re very close to the AF point; and finally there’s the Eye Control target itself, dancing around the screen. Frankly, it seems too much.

As soon as I consciously noticed the Eye Control point was following the ball, I started looking at the control point instead, but having recognized that it was outlining the ball, my eye continued to follow the indicator/ball combination I’d created.

But once I’d started shooting, I found myself just concentrating on following and predicting the movement in the game. Then, for a brief moment, I became aware again of the Eye Control indicator, perfectly matching the size, position and movement of the football as it rolled from one player’s feet to the next.

In that sudden moment of noticing the Eye Control point and what it was doing, I realized that I hadn’t been paying it any attention before: I’d just been watching the players and the camera had been focusing on the player I’d been looking at. And, while none of this helped me position myself in the right place on the sidelines for those decisive moments in the game, it did mean I came back with better action shots (and more of them) than I’m used to getting.

At its best, Eye Control isn’t just about selecting an AF point more quickly; it’s about doing so in a way that’s almost subconscious. I was setting focus without ever having to consciously think about focus, giving me one less thing to think about and leaving me more able to concentrate on the game.

Will we really need Eye Control?

Of course there’s a chance I’m wrong. One of the key technologies that appears to make Eye Control work so well is the R3’s subject recognition, which means you don’t need to place your AF point perfectly on the subject to get the camera to track it. And, if this works well, how much will sports photographers need to move their AF point at all?

If you can anticipate roughly where in the frame you want your subject to be, then maybe you’ll be able to pre-place your AF point with a joystick and initiate tracking when it’s over them as you try to follow their movement.

If subject tracking becomes both quick to initiate and utterly dependable, then a feature that rapidly re-positions your AF point may be less pressing, after all.

How will I live without it?

The moment that really sold me on the idea of Eye Control came just a few hours later. We’d finished up at the sports field and headed down to the zoo to get some more shots. Alongside shooting the EOS R3, I was meant to be capturing some behind-the-scenes photos for social media.

A perfect moment presented itself: Chris and Carey, long-lensed cameras in hand, were walking into a patch of light beneath an arch of trees. Both were gesticulating at one another, animatedly discussing some camera feature or other. Perfect, I thought, raising the camera to my eye.


I looked determinedly at Chris and waited for him to take one step further forward. But, just as I was about to hit the shutter, I saw that the AF point was stubbornly glued to the lower right corner of the viewfinder. Had I messed up the calibration? Was there a blind spot that had pushed the AF target beyond a point from which I couldn’t coax it back? What was I doing wrong? It was only when Chris stepped fully into shadow, and the moment had passed, that I noticed there was no Eye Control indicator to be seen. And why would there be? The EOS R6 I was holding doesn’t have Eye Control AF.

In that third moment of realization, I found that I’d become so used to a feature, after just a few hours’ shooting, that it had become something I want in my next camera. From historical footnote to technology of the future in no time at all. Now we just have to see how reliable it is.

Read our entire in-depth coverage of the Canon EOS R3


Author:
Richard Butler
Source: Dpreview
