1. Training Stage
We build an n-gram model: the next word is predicted from the previous n - 1 words within a sentence (here a single word, i.e. a bigram).

The 'cognitive' theatre
Words are arranged in two vertical lists beside two poles. Edges show the bigram flows (word → next word) as a visible shadow of the higher-order n-gram model.
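A minimal sketch of what this training stage amounts to, assuming whitespace tokenization and counting only within each sentence; the names train_ngram and counts are illustrative, not taken from the demo:

```python
from collections import Counter, defaultdict

def train_ngram(sentences, n=2):
    """Count (context -> next word) transitions within each sentence."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.split()                 # toy whitespace tokenization
        for i in range(len(words) - (n - 1)):
            context = tuple(words[i:i + n - 1])  # the preceding n - 1 words
            counts[context][words[i + n - 1]] += 1
    return counts
```

With n = 2 each context is a single word, so these counts are exactly the bigram (word → next word) flows drawn between the two poles.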
2. Performance & Narration
Ask the toy model to continue your prompt; the monitor explains how it chooses each next word under the current n-gram order, animating the context in yellow and the chosen word in green.

Inner narration:
The model is an n-gram model: for order n, it uses the last n - 1 words of context (within a sentence) to choose the next word. The two-pole graph always shows the immediate word-to-word flows.
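As a sketch of the choice the narration describes, assuming the counts built by the training sketch above and a simple most-frequent-continuation rule (the demo itself may sample differently; next_word and the example sentences are illustrative):

```python
def next_word(counts, prompt_words, n=2):
    """Pick the next word from the last n - 1 words of the prompt."""
    context = tuple(prompt_words[-(n - 1):]) if n > 1 else ()
    options = counts.get(context)
    if not options:
        return None                      # unseen context: nothing to predict
    return options.most_common(1)[0][0]  # most frequent continuation

counts = train_ngram(["the cat sat on the mat", "the cat ran home"])
print(next_word(counts, ["she", "said", "the"]))  # context ('the',) -> 'cat'
```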