Facebook research suggests pretrained AI models can be easily adapted to new languages
November 14, 2019
Multilingual masked language modeling, in which an AI model is trained to predict deliberately hidden ("masked") words in text drawn from several languages, is a technique that has been used to great effect. In March, a team introduced an architecture that jointly learns sentence representations for 93 languages belonging to more than 30 different families. But most previous work in language modeling has investigated cross-lingual transfer…
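For readers unfamiliar with the objective, the sketch below illustrates masked language modeling in PyTorch: a fraction of the input tokens is hidden, and the model is trained to recover them from the surrounding context. Everything here (the vocabulary size, mask token id, toy encoder, and hyperparameters) is an illustrative assumption, not a detail of the research described in this article.

```python
# A minimal sketch of the masked language modeling (MLM) objective.
# All names and numbers are illustrative, not from the paper covered above.
import torch
import torch.nn as nn

VOCAB_SIZE = 250_000   # a large shared vocabulary covering many languages (assumed)
MASK_ID = 4            # id of a special [MASK] token (assumed)
MASK_PROB = 0.15       # standard BERT-style masking rate

def mask_tokens(input_ids: torch.Tensor):
    """Hide ~15% of tokens; labels are -100 (ignored by the loss)
    everywhere except at the masked positions."""
    mask = torch.rand(input_ids.shape) < MASK_PROB
    labels = input_ids.clone()
    labels[~mask] = -100            # only masked positions contribute to the loss
    masked = input_ids.clone()
    masked[mask] = MASK_ID
    return masked, labels

# A toy transformer encoder standing in for a real multilingual model.
embed = nn.Embedding(VOCAB_SIZE, 256)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=4, batch_first=True),
    num_layers=2,
)
lm_head = nn.Linear(256, VOCAB_SIZE)
loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

# One training step. In multilingual MLM, a batch can mix sentences from
# different languages; the shared vocabulary and shared weights are what
# let representations transfer across them.
batch = torch.randint(5, VOCAB_SIZE, (8, 128))   # stand-in for real token ids
masked, labels = mask_tokens(batch)
logits = lm_head(encoder(embed(masked)))
loss = loss_fn(logits.view(-1, VOCAB_SIZE), labels.view(-1))
loss.backward()
```

Because the loss is computed only at masked positions, the model must use context from the unmasked tokens to fill the gaps, which is how it learns language structure without labeled data.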