Table 2. Example of a language model.
This model has been trained on three sentences that occur with different probabilities. Each row gives the predicted probability distribution over the next word, e.g., /I/ predicts /eat/ with probability 1, while after /eat/ the distribution is broader.
| Current word | I | eat | very | nice | cake |
|---|---|---|---|---|---|
| I | 0 | 1 | 0 | 0 | 0 |
| eat | 0 | 0 | 0.2 | 0.3 | 0.5 |
| very | 0 | 0 | 0 | 1 | 0 |
| nice | 0 | 0 | 0 | 0 | 1 |
| cake | 0 | 0 | 0 | 0 | 0 |
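
The table can be read as a simple bigram model: look up the row of the current word and sample the next word from that row's distribution. Below is a minimal sketch of this idea in Python; the dictionary mirrors Table 2, and the function and variable names (`next_word_probs`, `sample_next`, `generate`) are illustrative, not part of any particular library.

```python
import random

# Bigram table from Table 2: P(next word | current word).
next_word_probs = {
    "I":    {"eat": 1.0},
    "eat":  {"very": 0.2, "nice": 0.3, "cake": 0.5},
    "very": {"nice": 1.0},
    "nice": {"cake": 1.0},
    "cake": {},  # no continuation: end of the sentence
}

def sample_next(word):
    """Sample the next word from the current word's row, or return None at the end."""
    dist = next_word_probs.get(word, {})
    if not dist:
        return None
    words = list(dist.keys())
    weights = list(dist.values())
    return random.choices(words, weights=weights, k=1)[0]

def generate(start="I"):
    """Generate a sentence by repeatedly sampling the next word until no row applies."""
    sentence = [start]
    nxt = sample_next(sentence[-1])
    while nxt is not None:
        sentence.append(nxt)
        nxt = sample_next(sentence[-1])
    return sentence

print(" ".join(generate()))  # e.g. "I eat nice cake" or "I eat cake"
```

Running the sketch from /I/ always produces /eat/ next (probability 1), after which roughly half the generated sentences continue directly with /cake/, matching the wider distribution in the /eat/ row.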