Google AI’s ALBERT claims top spot in multiple NLP performance benchmarks

Researchers from Google AI (formerly Google Research) and the Toyota Technological Institute at Chicago have created ALBERT, an AI model that achieves state-of-the-art results exceeding human performance. ALBERT now claims first place on major NLP leaderboards for benchmarks like GLUE and SQuAD 2.0, and achieves a high score on RACE.

On the Stanford Question Answering Dataset (SQuAD 2.0) benchmark, ALBERT achieves a score of 92.2; on the General Language Understanding Evaluation (GLUE) benchmark, it achieves a score of 89.4; and on the ReAding Comprehension from English Examinations (RACE) benchmark, it scores 89.4%.

ALBERT is a version of the Transformer-based BERT that “uses parameter reduction techniques to lower memory consumption and increase the training speed of BERT,” according to a paper published Wednesday on OpenReview.net. The paper was published alongside other papers accepted for publication as part of the International Conference on Learning Representations (ICLR), which will take place in April 2020 in Addis Ababa, Ethiopia. ICLR will be the first international AI community conference held in Africa.

“Our proposed methods lead to models that scale much better compared to the original BERT. We also use a self-supervised loss that focuses on modeling inter-sentence coherence, and show it consistently helps downstream tasks with multi-sentence inputs,” the paper reads.
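
One of the parameter-reduction techniques the ALBERT paper describes is factorized embedding parameterization: splitting the large vocabulary embedding matrix into two smaller matrices. Below is a minimal PyTorch sketch of that idea; the dimensions are illustrative, not the paper's exact configuration.

```python
# Sketch of factorized embedding parameterization: instead of one
# V x H embedding matrix, use a small V x E lookup followed by an
# E x H projection, cutting parameters when E << H.
import torch.nn as nn

vocab_size, embed_dim, hidden_dim = 30000, 128, 768  # E << H

factorized = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),  # V x E lookup
    nn.Linear(embed_dim, hidden_dim),     # E x H projection
)

full = nn.Embedding(vocab_size, hidden_dim)  # baseline V x H matrix

def n_params(module):
    return sum(p.numel() for p in module.parameters())

print(n_params(factorized), "vs", n_params(full))  # ~3.9M vs ~23M
```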

ALBERT is the latest derivative of BERT to claim a top spot in major benchmark tests. In late July, Facebook AI Research introduced RoBERTa, a model that achieved state-of-the-art results on benchmarks including GLUE, and in May, Microsoft AI researchers introduced the Multi-Task Deep Neural Network (MT-DNN), a model that achieved top marks in 7 of 9 GLUE benchmark tasks.

Each of these models outpaces average human performance on the benchmarks.

In other Transformer-related news, Hugging Face, a startup whose PyTorch library makes it easy to use major Transformer models like BERT, OpenAI’s GPT-2, and Google’s XLNet, today made that library available for TensorFlow. PyTorch-Transformers has seen more than 500,000 Pip installs since the start of the year, Hugging Face CEO Clément Delangue told VentureBeat.
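
For readers unfamiliar with the library, here is a minimal usage sketch. It assumes the package's current name, transformers (the successor to pytorch-transformers), the published bert-base-uncased checkpoint, and a recent release of the library rather than the exact 2019 version.

```python
# Minimal sketch: loading a pretrained BERT model and tokenizer with
# Hugging Face's library. Assumes `pip install transformers torch` and
# a recent transformers release (the package was renamed from
# pytorch-transformers to transformers).
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through the encoder.
inputs = tokenizer("ALBERT tops the GLUE leaderboard.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```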

More to come.


Author: Khari Johnson
Source: VentureBeat
