Quiz

Technical Quiz
1. Why do we use word embeddings?

A. They’re used to represent a sequence of words as a single continuous vector.

B. Word embeddings are easier to look up than regular tokenized IDs.

C. They make training models for NLP tasks quicker and more efficient.

D. They offer a more meaningful way to capture the connections between vocabulary words.


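If the options above feel abstract, here is a minimal sketch of what an embedding layer does, assuming PyTorch's `nn.Embedding` (the toy vocabulary, IDs, and dimensions are hypothetical, chosen only for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical toy vocabulary mapping words to integer token IDs.
vocab = {"king": 0, "queen": 1, "banana": 2}

# An embedding layer: a lookup table from token IDs to dense vectors.
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

ids = torch.tensor([vocab["king"], vocab["queen"], vocab["banana"]])
vectors = embedding(ids)  # shape: (3, 8) -- one continuous vector per token

# Cosine similarity between embedding vectors. After training on real text,
# related words (e.g., "king" and "queen") tend to score higher than
# unrelated ones; these untrained vectors are random, so scores here
# are arbitrary.
cos = nn.functional.cosine_similarity
print(cos(vectors[0], vectors[1], dim=0))  # king vs. queen
print(cos(vectors[0], vectors[2], dim=0))  # king vs. banana
```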