
US Air Force’s T-38 trainer could soon dogfight with augmented reality adversaries

WASHINGTON — In the future, when U.S. Air Force fighter pilots face off in aerial combat training missions, they could be dogfighting the video game version of Chinese and Russian warplanes at a fraction of the cost of using real jets like the F-22 Raptor.

At least that’s the pitch the California-based company known as Red 6 is making to the service.

For the past three years, Red 6 has been working with the Air Force to mature its Airborne Tactical Augmented Reality System, or ATARS, which allows pilots flying real fighters to see projections of other aircraft through their helmet visor.

Now the company is on the brink of finalizing a Small Business Innovation Research Phase III contract with the Air Force that will allow it to integrate its technology with a Northrop T-38 Talon — the supersonic jet the service uses to train fighter pilots, according to its founder and CEO, Dan Robinson.

“We’re working in partnership with the test community at Holloman Air Force Base to do that, and then we’re going to be working hand in hand with them to evolve that and make sure that it’s ruggedized,” he told Defense News in a March 17 interview.

The company is also using internal funding to network multiple ATARS together so that more than one aircraft can train as a group against a larger set of adversaries. The hope is to demonstrate that capability for the Navy and Air Force sometime this year, Robinson said.

Unlike virtual reality, where everything the user sees through a headset is simulated, augmented reality superimposes simulated images over the real world.

The ATARS system comprises a custom augmented reality headset designed to be worn with a standard HGU-55 helmet used by F-15 and F-16 pilots. It also includes hardware and software responsible for tracking the pilot’s head in space and displaying information, all driven by a game engine.
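At its core, that pipeline is a familiar graphics problem: take the pilot's tracked head pose, transform a virtual adversary's world position into the head's frame of reference, and project it onto the visor display each frame. The sketch below illustrates that idea only; it is not Red 6's code, and the coordinate conventions, field-of-view value and function names are assumptions for illustration.

```python
# Illustrative sketch (not Red 6's implementation): projecting a virtual
# adversary into a pilot's view from a tracked head pose.
# Coordinate conventions, FOV and names are assumptions.
import numpy as np

def look_at_rotation(forward, up=np.array([0.0, 0.0, 1.0])):
    """Build a world-to-camera rotation whose rows are the camera axes."""
    f = forward / np.linalg.norm(forward)          # view direction
    r = np.cross(f, up); r /= np.linalg.norm(r)    # right
    u = np.cross(r, f)                             # true up
    return np.vstack([r, u, -f])                   # camera looks down -z

def project_to_display(target_world, head_pos, head_rot, fov_deg=40.0):
    """Return normalized display coordinates of a world-space target,
    or None if it falls outside the headset's field of view."""
    cam = head_rot @ (target_world - head_pos)     # into the head/camera frame
    if cam[2] >= 0:                                # behind the pilot
        return None
    half = np.tan(np.radians(fov_deg / 2))
    x, y = cam[0] / -cam[2], cam[1] / -cam[2]      # perspective divide
    if abs(x) > half or abs(y) > half:             # outside the visor FOV
        return None
    return x / half, y / half                      # -1..1 display coordinates

# Example: an adversary roughly 2 km ahead and slightly above the pilot.
head_pos = np.array([0.0, 0.0, 3000.0])            # meters
head_rot = look_at_rotation(np.array([1.0, 0.0, 0.0]))
print(project_to_display(np.array([2000.0, 50.0, 3100.0]), head_pos, head_rot))
```

In practice the hard part is not the projection math but doing it with head tracking that stays accurate outdoors, in sunlight and under g — the conditions Robinson says earlier AR systems could not handle.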

“It’s minimally intrusive. It’s designed to be platform agnostic,” Robinson said. “It will incorporate into any aircraft, and there will be slight adjustments to determine antenna positioning and things like that.”

If Red 6 proves that its technology works, it could solve one of the Air Force’s longstanding requirements: the need for cost-effective “red air” training that gives fighter pilots experience in close-quarters aerial combat.

In 2019, the Air Force awarded a contract to seven companies that provide red air training services: Air USA, Airborne Tactical Advantage Company, Blue Air Training, Coastal Defense, Draken International, Tactical Air Support and Top Aces.

These companies typically buy up used third-generation subsonic fighters and hire retired military pilots to pose as aggressors in training missions. However, to train against more advanced threats, the Air Force has to use its own aircraft as aggressors.

Currently, there is no way for the Air Force to virtually represent an adversary once a threat gets within visual range, or about 10 nautical miles.

Using tools like threat emitters that replicate the radio frequencies emitted by missiles, artillery and other aircraft is sufficient to train pilots beyond visual range, Robinson said earlier this week at an event held by the Mitchell Institute for Aerospace Studies. But with augmented reality, the Air Force would be able to virtually project — for example — a combat-representative Russian Su-57 that the pilot must dogfight.

“So far, augmented reality hasn’t worked outdoors or in dynamic environments,” Robinson said. “It does now.”

The path forward

Although there is no current program of record for the solution Red 6 could provide to the military, the company has some notable former Air Force leaders on its side.

In February, Red 6 appointed Mike Holmes, the former four-star general who led Air Combat Command, as chairman of its board. The company added former acquisition executive Will Roper to its advisory board earlier this month.

Red 6 has won buy-in from the Air Force's AFWERX innovation hub, which awarded it Small Business Innovation Research Phase I and II contracts, and in June 2020 received an investment of undisclosed value from Lockheed Martin's venture capital arm.

Under the SBIR Phase III contract, Robinson plans to work with the Air Force to solidify a business model for providing the technology as well as finalize its cost. But to secure that funding, Red 6 will need to give lawmakers proof that the technology works, said Robert “Otis” Winkler, a professional staff member for the Senate Armed Services Committee.

“Moving it into a training environment where you’re actually projecting images that aren’t necessarily there is something that we’re going to have to prove to folks as far as latency, as far as the ability to make a difference in training,” he said during the Mitchell Institute event.

However, advancements like the F-35 helmet-mounted display — which overlays imagery from the jet’s sensors into a single picture projected onto the helmet — support the idea that pilots can wear highly sophisticated optical systems while conducting normal flight operations.

“I think when people actually experience this, it becomes hard to argue,” he said. “I mean, if you look at what our kids are doing when it comes to augmented reality, in the video games that they are playing, that’s kind of the expectation when folks show up in the military. And I think most of them are disappointed.”

As the company figures out how to link multiple ATARS systems together across large distances, it will need to overcome the challenge of latency, Robinson said.

That is a problem companies like Red 6 may be able to solve by borrowing from large, global multiplayer games such as Fortnite, which use techniques such as pose estimation and predictive analysis to ensure all users share the same experience, Robinson said.

“On multiplayer games … the movements are much more nuanced and rapid — so for example, a soldier crouching behind a desk,” he said. “What we’re talking about is fighter combat, where … the realities of the physical limitations of the airplane are a little more predictive.”
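One common way games mask network latency is dead reckoning: each client extrapolates an opponent's last reported position and velocity forward to the current moment, then corrects when the next update arrives. The snippet below is a rough illustration of that general technique as it might apply to networked ATARS units; the constant-velocity model, clamp value and names are assumptions, not details Red 6 has disclosed.

```python
# Illustrative sketch of latency masking via dead reckoning: extrapolate an
# adversary's pose forward from the last network update so every networked
# headset renders a consistent picture. Parameters are illustrative only.
from dataclasses import dataclass
import numpy as np

@dataclass
class PoseUpdate:
    timestamp: float        # seconds, sender clock
    position: np.ndarray    # meters, shared world frame
    velocity: np.ndarray    # meters per second

def predict_position(last: PoseUpdate, now: float, max_extrapolation=0.5):
    """Extrapolate position to 'now', clamping how far ahead we predict
    so a dropped link cannot fly the virtual jet off on its own."""
    dt = min(now - last.timestamp, max_extrapolation)
    return last.position + last.velocity * max(dt, 0.0)

# Example: a 150-millisecond-old update for a jet flying 250 m/s due north.
update = PoseUpdate(timestamp=10.00,
                    position=np.array([0.0, 0.0, 3000.0]),
                    velocity=np.array([0.0, 250.0, 0.0]))
print(predict_position(update, now=10.15))   # ~37.5 m farther north
```

As Robinson notes, the relatively smooth, physics-bound flight paths of fighter aircraft make this kind of prediction easier than tracking the abrupt movements of a character in a shooter.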


Author: Valerie Insinna
Source: DefenseNews
