AI & RoboticsNews

Facebook’s new technique helps AI systems forget irrelevant information



Facebook says it has developed an AI technique that enables machine learning models to only retain certain information while forgetting the rest. The company claims that the operation, Expire-Span, can predict information most relevant to a task at hand, allowing AI systems to process information at larger scales.

AI models memorize information indiscriminately — unlike human memory. Mimicking the ability to forget (or not) at the software level is challenging, but a worthwhile endeavor in machine learning. Intuitively, if a system can remember only five things, those five things should be the most important ones. But while state-of-the-art architectures can focus on parts of their input selectively, they still store and compare against everything they have seen, leading them to struggle with large quantities of information like books or videos and incurring high computing costs.
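The cost problem above comes from how self-attention scales. This is a generic illustration (not Facebook's code): every token is scored against every other token, so work grows quadratically with context length.

```python
# Illustration of the scaling problem behind the paragraph above:
# a standard self-attention layer computes one score per pair of
# tokens, so compute and memory grow quadratically with sequence length.

def attention_score_count(seq_len: int) -> int:
    """Number of pairwise attention scores for one head on one sequence."""
    return seq_len * seq_len

# Doubling the context quadruples the work:
print(attention_score_count(1_000))  # → 1000000
print(attention_score_count(2_000))  # → 4000000
```

This is why simply storing everything becomes impractical for book- or video-length inputs.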

This can contribute to other problems like catastrophic forgetting (also called catastrophic interference), a phenomenon where AI systems fail to recall what they’ve learned from a training dataset. The result is that the systems have to be constantly reminded of the knowledge they’ve gained or risk becoming “stuck” with their most recent “memories.”

Several proposed solutions to the problem focus on compression. Historical information is compressed into smaller chunks, letting the model extend further into the past. The drawback, however, is “blurry” versions of memory that can affect the accuracy of the model’s predictions.
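The compression idea can be sketched in a few lines. This is a hedged toy version of the general approach described above, not Expire-Span and not any specific paper's method; the function name, chunk size, and mean-pooling choice are all illustrative.

```python
# Toy sketch of memory compression: once the memory exceeds a budget,
# the oldest entries are averaged into a single "blurry" summary vector.

def compress_memory(memory, budget, chunk=4):
    """memory: list of feature vectors (lists of floats), newest last."""
    while len(memory) > budget:
        old, memory = memory[:chunk], memory[chunk:]
        dim = len(old[0])
        summary = [sum(v[i] for v in old) / len(old) for i in range(dim)]
        memory = [summary] + memory  # fine detail in `old` is lost here
    return memory

mem = [[float(i)] for i in range(8)]  # 8 one-dimensional memories
print(compress_memory(mem, budget=5))
# → [[1.5], [4.0], [5.0], [6.0], [7.0]]
```

The averaging step is exactly where the "blurriness" comes from: the four oldest memories survive only as their mean.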

Facebook Expire-Span

Facebook’s alternative is Expire-Span, which gradually forgets irrelevant information. Expire-Span works by first predicting which information is most important for the task at hand, based on context. It then assigns each piece of information an expiration date; when the date passes, the information is deleted from the system.
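The predict-then-expire loop can be sketched as follows. This is a minimal illustration of the mechanism described above, not Facebook's implementation: the `predict_span` heuristic stands in for the learned predictor, and all names and numbers are ours.

```python
# Hedged sketch of Expire-Span's expire-and-delete loop. In the real
# system the span is predicted by a learned model from context; here a
# trivial stand-in heuristic marks uppercase items as "important."

def predict_span(item: str) -> int:
    """Hypothetical importance model: longer spans for 'important' items."""
    return 10 if item.isupper() else 2

class ExpiringMemory:
    def __init__(self):
        self.items = []  # list of (item, expiry_time) pairs

    def store(self, item: str, now: int) -> None:
        self.items.append((item, now + predict_span(item)))

    def tick(self, now: int) -> None:
        # Drop anything whose expiration date has passed.
        self.items = [(x, t) for (x, t) in self.items if t > now]

mem = ExpiringMemory()
mem.store("THE-YELLOW-DOOR", now=0)  # important: kept for 10 steps
mem.store("floor texture", now=0)    # irrelevant: kept for 2 steps
mem.tick(now=5)
print([x for x, _ in mem.items])     # → ['THE-YELLOW-DOOR']
```

The key design point is that deletion is decided per item at storage time, so the memory never has to be re-scanned for relevance later.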

Facebook says that Expire-Span achieves leading results on a benchmark for character-level language modeling and improves efficiency across long-context workloads in language modeling, reinforcement learning, object collision, and algorithmic tasks.

The importance of forgetting

It’s believed that without forgetting, humans would have basically no memory at all. If we remembered everything, we’d likely be inefficient because our brains would be swamped with superfluous memories.

Research suggests that one form of forgetting, intrinsic forgetting, involves a subset of cells in the brain that degrade the physical traces of memories, called engrams. These cells reverse the structural changes that created the engram, which would otherwise be preserved through a consolidation process.

New memories are formed through neurogenesis, which can complicate the challenge of retrieving prior memories. It’s theorized that neurogenesis damages the older engrams or makes it harder to isolate the old memories from newer ones.

Expire-Span attempts to induce intrinsic forgetting in AI and capture the neurogenesis process in software form.

Expire-Span

Normally, AI systems tasked with, for example, finding a yellow door in a hallway may memorize information like the color of the other doors, the length of the hallway, and the texture of the floor. With Expire-Span, the model can forget unnecessary information processed on the way to the door and remember only the bits essential to the task, like the color of the sought-after door.

To calculate the expiration dates of words, images, video frames, and other information, Expire-Span determines how long the information is preserved as a memory each time a new piece of data is presented. This gradual decay is key to retaining important information without blurring it, Facebook says. Expire-Span essentially makes predictions based on context learned from data and influenced by its surrounding memories.
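The gradual decay described above can be sketched with a simple ramp function, loosely following the publicly described Expire-Span mechanism: rather than deleting a memory abruptly, its attention weight is faded out linearly over a ramp of steps approaching its expiration span. Variable names and numbers here are ours.

```python
# Hedged sketch of Expire-Span's gradual decay: a memory's weight is
# 1.0 while fresh and ramps linearly down to 0.0 as its age approaches
# its predicted span, instead of being cut off abruptly.

def decay_mask(age: float, span: float, ramp: float) -> float:
    """1.0 while fresh, linearly falling to 0.0 over `ramp` steps."""
    return max(0.0, min(1.0, (span - age) / ramp))

# A memory with span 8 and ramp 4 fades out between ages 4 and 8:
print([decay_mask(a, span=8, ramp=4) for a in (0, 5, 6, 7, 8)])
# → [1.0, 0.75, 0.5, 0.25, 0.0]
```

Because the ramp is a smooth function of the predicted span, the model can learn the span by gradient descent, which is what makes the decay trainable rather than a fixed rule.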

For example, if an AI system is training to perform a word prediction task, it’s possible with Expire-Span to teach the system to remember rare words like names but forget filler words like “the,” “and,” and “of.” By looking at previous, relevant content, Expire-Span predicts if something can be forgotten or not.
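The word-prediction example above can be made concrete with a toy heuristic. In the real system the span is learned from context; here we substitute a hypothetical frequency rule (filler words get a short span, rare words a long one) purely for illustration.

```python
# Toy illustration of the rare-vs-filler example: common filler words
# receive a short span and expire quickly, while rare words (like names)
# receive a long span and survive to the end of the sequence.

FILLER = {"the", "and", "of", "a", "to"}

def span_for(word: str) -> int:
    # Illustrative stand-in for the learned span predictor.
    return 1 if word.lower() in FILLER else 50

sentence = "the name of Ada and the name of Alan".split()
retained = [w for i, w in enumerate(sentence)
            if i + span_for(w) >= len(sentence)]  # still alive at the end
print(retained)  # → ['name', 'Ada', 'name', 'Alan']
```

Only the content-bearing words survive to the end of the sequence; the filler words expire almost immediately after being read.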

Facebook Expire-Span

Facebook says that Expire-Span can scale to tens of thousands of pieces of information while retaining fewer than a thousand of them. As a next step, the plan is to investigate how the underlying techniques might be used to incorporate different types of memories into AI systems.

“While this is currently research, we could see the Expire-Span method used in future real-world applications that might benefit from AI that forgets nonessential information,” Facebook wrote in a blog post. “Theoretically, one day, Expire-Span could empower people to more easily retain information they find most important for these types of long-range tasks and memories.”



Author: Kyle Wiggers
Source: VentureBeat

