Quiz: Applying BERT to Other Languages
Take a short quiz to test your understanding of the multilingual BERT model.
Technical Quiz
Question 1 of 8

What is the primary motivation behind oversampling and undersampling in multilingual BERT?

A. To prioritize high-resource languages for training.
B. To exclude low-resource languages from training.
C. To maintain a balanced data distribution in each language.
D. To randomly shuffle the training data to achieve balance.
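For context on this question: the released multilingual BERT balances its training languages by exponentially smoothing the sampling distribution rather than sampling each language in proportion to its raw corpus size (the official release notes cite an exponent of 0.7), which undersamples high-resource languages and oversamples low-resource ones. The Python sketch below illustrates the idea only; the corpus sizes and the helper name smoothed_sampling_probs are hypothetical.

    # A minimal sketch of exponentially smoothed language sampling,
    # assuming an exponent s = 0.7 as in the multilingual BERT release notes.
    # The sentence counts below are made-up illustrative values.

    def smoothed_sampling_probs(sentence_counts, s=0.7):
        """Exponentiate each language's data share by s (< 1), then renormalize.

        s < 1 shrinks the gap between languages: high-resource languages
        are undersampled, low-resource ones are oversampled.
        """
        total = sum(sentence_counts.values())
        raw = {lang: count / total for lang, count in sentence_counts.items()}
        smoothed = {lang: p ** s for lang, p in raw.items()}
        z = sum(smoothed.values())
        return {lang: p / z for lang, p in smoothed.items()}

    counts = {"en": 1_000_000, "hi": 10_000}  # hypothetical corpus sizes
    print(smoothed_sampling_probs(counts))
    # English drops from a raw ~99% share to roughly 96%, while Hindi
    # rises from ~1% to roughly 4%, giving a more balanced distribution.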
...