BETO for Spanish

Explore how to use BETO, the Spanish BERT model, for masked word prediction. Discover its two variants, its performance advantages over multilingual BERT, and how to use it with the Hugging Face Transformers library.

BETO is a pre-trained BERT model for the Spanish language from the Universidad de Chile. It is trained on the masked language modeling (MLM) task with Whole Word Masking (WWM), and its configuration is the same as the standard BERT-base model.
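Because BETO is a standard BERT-base MLM, masked word prediction works through the usual fill-mask pipeline. The sketch below assumes the checkpoint ID dccuchile/bert-base-spanish-wwm-uncased, which is the ID the DCC UChile team publishes on the Hugging Face Hub; verify it before use.

```python
# A minimal sketch of masked word prediction with BETO, assuming the
# Hub checkpoint ID "dccuchile/bert-base-spanish-wwm-uncased".
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="dccuchile/bert-base-spanish-wwm-uncased",
)

# BERT-style models mark the position to predict with [MASK].
for prediction in fill_mask("Quiero una taza de [MASK]."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.4f}")
```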

Variants of BETO

The researchers behind BETO provide two variants of the model (a loading sketch follows the list):

  • BETO-cased, trained on cased text.

  • BETO-uncased, trained on uncased (lowercased) text.
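Either variant can also be loaded explicitly with AutoTokenizer and AutoModelForMaskedLM. This is a sketch under the assumption that the Hub IDs below match the published checkpoints:

```python
# A sketch of loading either BETO variant directly. The Hub IDs below
# are assumed to match the published checkpoints; verify them first.
from transformers import AutoTokenizer, AutoModelForMaskedLM

CASED = "dccuchile/bert-base-spanish-wwm-cased"
UNCASED = "dccuchile/bert-base-spanish-wwm-uncased"

tokenizer = AutoTokenizer.from_pretrained(CASED)
model = AutoModelForMaskedLM.from_pretrained(CASED)

# Score the masked position by hand instead of using the pipeline.
inputs = tokenizer("Madrid es la [MASK] de España.", return_tensors="pt")
logits = model(**inputs).logits

# Locate the [MASK] position and take the five highest-scoring tokens.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
top_ids = logits[0, mask_pos].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```

Pick the cased checkpoint when capitalization carries signal (e.g. named entities); the uncased one lowercases input during tokenization.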

Performance of BETO