Army may swap AI bill of materials for simpler ‘baseball cards’

The U.S. Army is revising its artificial intelligence bill of materials effort following meetings with defense contractors.

The service last year floated the idea of an AI BOM, modeled on existing software bills of materials: comprehensive lists of the components and dependencies that make up programs and other digital goods. Such transparency practices are championed by the Cybersecurity and Infrastructure Security Agency and other organizations.
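For readers unfamiliar with the software analogue, a software bill of materials is typically a machine-readable inventory. The sketch below loosely follows the open CycloneDX convention and is purely illustrative; the Army has not published a format, and the component entries are hypothetical examples.

```python
# Illustrative sketch of a minimal software bill of materials (SBOM),
# loosely following the open CycloneDX JSON convention.
# The component entries are hypothetical, not Army data.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "library",
            "name": "numpy",          # third-party dependency shipped with the software
            "version": "1.26.4",
            "licenses": [{"license": {"id": "BSD-3-Clause"}}],
        },
        {
            "type": "library",
            "name": "example-vision-model",  # hypothetical in-house component
            "version": "2.0.1",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```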

The Army is now pivoting to an “AI summary card,” according to Young Bang, the principal deputy assistant secretary for acquisition, logistics and technology. He likened it to a baseball card with useful information available at a glance.

“It’s got certain stats about the algorithm, its intended usage and those types of things,” Bang told reporters at a Pentagon briefing April 22. “It’s not as detailed or necessarily threatening to industry about intellectual property.”
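The Army has not released a template for the summary card, but Bang's description, stats about the algorithm and its intended usage, echoes the "model card" idea already common in the machine learning community. The sketch below is a hypothetical illustration of the at-a-glance fields such a card might carry; none of the field names or values come from the Army.

```python
# Hypothetical illustration of an "AI summary card" -- a baseball-card-style
# snapshot of a model. Field names and values are invented for this example
# and do not reflect any Army template.
from dataclasses import dataclass, field

@dataclass
class AISummaryCard:
    model_name: str
    intended_use: str
    model_type: str                      # e.g., image classifier, language model
    training_data_summary: str           # high-level provenance, not the raw data
    evaluation_metrics: dict = field(default_factory=dict)
    known_limitations: list = field(default_factory=list)

card = AISummaryCard(
    model_name="example-target-classifier",
    intended_use="Flag candidate objects in sensor imagery for human review",
    model_type="Image classifier",
    training_data_summary="Labeled commercial satellite imagery, 2019-2023",
    evaluation_metrics={"accuracy": 0.92, "false_positive_rate": 0.04},
    known_limitations=["Degraded performance in heavy cloud cover"],
)

print(card.model_name, "-", card.intended_use)
```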

The Department of Defense is spending billions of dollars on AI, autonomy and machine learning as leaders demand quicker decision-making, longer and more-remote intelligence collection and a reduction of human risk on increasingly high-tech battlefields.

More than 685 AI-related projects are underway across the department, with at least 230 being handled by the Army, according to a Government Accountability Office tally. The technology is expected to play a key role in the XM30 Mechanized Infantry Combat Vehicle, formerly the Optionally Manned Fighting Vehicle, and the Tactical Intelligence Targeting Access Node, or TITAN.

The goal of an AI BOM or summary card is not to reverse engineer private-sector products or put a company out of business, Bang said.

Rather, it would offer greater understanding of an algorithm’s ins and outs — ultimately fostering trust in something that could inform life-or-death decisions.

“We know innovation’s happening in the open-source environment. We also know who’s contributing to the open source,” Bharat Patel, a project lead with the Army’s Program Executive Office for Intelligence, Electronic Warfare and Sensors, told reporters. “So it goes back to how was that original model trained, who touched that model, could there have been poisons or anything?”

Additional meetings with industry are planned, according to the Army.


Author: Colin Demarest
Source: DefenseNews
