eLife. 2021 Aug 2;10:e68066. doi: 10.7554/eLife.68066

Table 2. Example of a language model.

This model has seen three sentences, each occurring with a different probability. Each row gives the predicted probability distribution over the next word, given the current word; for example, /I/ predicts /eat/ with a probability of 1, but after /eat/ the distribution is wider.

        I     eat   very  nice  cake
I       0     1     0     0     0
eat     0     0     0.2   0.3   0.5
very    0     0     0     1     0
nice    0     0     0     0     1
cake    0     0     0     0     0
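To make the table concrete, the transition probabilities can be written out as a lookup from the current word to a distribution over the next word. The following is a minimal sketch in Python; the names transition_probs and next_word_distribution are illustrative and not part of the original article.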
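# Bigram-style transition table from Table 2, as a Python dictionary.
# Each entry maps the current word to the predicted probability
# distribution over the next word (zero-probability entries omitted).
transition_probs = {
    "I":    {"eat": 1.0},
    "eat":  {"very": 0.2, "nice": 0.3, "cake": 0.5},
    "very": {"nice": 1.0},
    "nice": {"cake": 1.0},
    "cake": {},  # no continuation predicted after /cake/
}

def next_word_distribution(word):
    """Return the model's predicted distribution over the next word."""
    return transition_probs.get(word, {})

# After /I/ the model predicts /eat/ with probability 1 ...
print(next_word_distribution("I"))    # {'eat': 1.0}
# ... but after /eat/ the distribution is wider.
print(next_word_distribution("eat"))  # {'very': 0.2, 'nice': 0.3, 'cake': 0.5}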