Quiz: BERT Variants—Based on Knowledge Distillation
Take a short quiz to test your understanding of different BERT variants based on knowledge distillation.
Technical Quiz
1.
(Select all that apply; see the sketch after the options.) When a softmax temperature is applied to the output layer during knowledge distillation, what does a higher temperature value result in?
A.
Sharper probability distributions
B.
Smoother probability distributions
C.
Confident probability distributions
D.
Uncertain probability distributions
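For reference, here is a minimal NumPy sketch of temperature scaling, the operation the question refers to: the logits are divided by a temperature T before the softmax is taken. The logit values below are arbitrary examples, not taken from the course.

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Softmax with temperature: divide the logits by T before normalizing."""
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()  # subtract the max for numerical stability
    exps = np.exp(scaled)
    return exps / exps.sum()

# Arbitrary example logits for illustration
logits = [4.0, 2.0, 0.5]
for T in (1.0, 2.0, 5.0):
    probs = softmax_with_temperature(logits, T)
    print(f"T={T}: {np.round(probs, 3)}")
```

Running the loop with increasing values of T shows the probability mass spreading more evenly across the classes, which is why distillation uses a temperature greater than 1 to expose the teacher's "dark knowledge" about non-target classes.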
...