
BERTje for Dutch

Explore how BERTje, a monolingual BERT model trained on Dutch corpora, supports natural language processing tasks and integrates with the Hugging Face libraries. Understand its training data, its pre-training objectives including whole word masking and sentence order prediction, and how to apply BERTje in Dutch-language applications.

BERTje is a pre-trained monolingual BERT model for the Dutch language, developed at the University of Groningen. It is pre-trained with the masked language modeling (MLM) and sentence order prediction (SOP) objectives, using whole word masking (WWM).
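As a quick illustration of the Hugging Face integration, the sketch below loads a BERTje checkpoint and predicts a masked Dutch word. The model ID GroNLP/bert-base-dutch-cased is the checkpoint published by the Groningen group on the Hugging Face Hub, and the example sentence is our own choice, not from the BERTje paper.

```python
# Minimal sketch: load BERTje with Hugging Face transformers and run
# masked-word prediction, exercising the MLM objective it was trained on.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Assumed checkpoint name on the Hugging Face Hub.
model_id = "GroNLP/bert-base-dutch-cased"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# The fill-mask pipeline predicts candidates for the [MASK] token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Example Dutch sentence: "Amsterdam is de [MASK] van Nederland."
# ("Amsterdam is the [MASK] of the Netherlands.")
for prediction in fill_mask("Amsterdam is de [MASK] van Nederland."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```

The same checkpoint can be loaded with AutoModel for feature extraction or fine-tuned on downstream Dutch tasks, following the standard transformers workflow.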

The BERTje model is trained on several Dutch corpora, including ...