From predicting the weather to generating Shakespeare, discover the "memoryless" math that powers modern Computational Linguistics.
Andrey Markov (1856–1922)
"The future depends on the present, not the past."
Imagine a system that "forgets" its history. It doesn't matter how it arrived at its current state; all that matters is where it is right now.
Developed by Russian mathematician Andrey Andreyevich Markov, this groundbreaking model is used in economics, physics, and notably, Computational Linguistics. Whether it's predicting stock volatility or the next word in a sentence, Markov models provide the foundation.
Picture a "Magical Box" that changes form based only on what it is right now, never on how it got there.
Click the button to see the Markov Property in action!
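The Markov Property can be sketched in a few lines of Python. The weather states and transition probabilities below are invented for illustration; the only thing the next step ever looks at is the current state:

```python
import random

# Hypothetical transition probabilities: each state maps to
# {next_state: probability}. Rows sum to 1.
TRANSITIONS = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using ONLY the current one (Markov Property)."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def walk(start, n, seed=0):
    """Generate a chain of n steps from a starting state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(walk("Sunny", 5, seed=1))
```

Note that `step` never sees the history list: deleting everything but the last element would change nothing, which is exactly the "memoryless" idea.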
Sometimes the state is hidden (like treasure in a room), and we only see an observation (like the color of a box a guard carries).
Imagine playing a game with word cards. One side has the word (Observable), the other has the Part of Speech (Hidden).
The happy puppy runs
HMMs calculate the likelihood of a sequence. An adjective often precedes a noun, helping the computer guess that "happy" is an adjective.
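Here is a minimal sketch of that likelihood calculation. All probabilities are made up for illustration; a real tagger would estimate them from a corpus. The likelihood of a (words, tags) pair is the start probability times a chain of transition and emission probabilities:

```python
# Toy HMM for POS tagging (all numbers invented for illustration).
START = {"DET": 0.6, "ADJ": 0.2, "NOUN": 0.1, "VERB": 0.1}
TRANS = {  # P(tag_i | tag_{i-1})
    "DET":  {"ADJ": 0.4, "NOUN": 0.5, "DET": 0.05, "VERB": 0.05},
    "ADJ":  {"NOUN": 0.8, "ADJ": 0.1, "DET": 0.05, "VERB": 0.05},
    "NOUN": {"VERB": 0.6, "NOUN": 0.2, "DET": 0.1,  "ADJ": 0.1},
    "VERB": {"DET": 0.4, "NOUN": 0.3, "ADJ": 0.2,  "VERB": 0.1},
}
EMIT = {  # P(word | tag)
    "DET":  {"the": 0.7},
    "ADJ":  {"happy": 0.3},
    "NOUN": {"puppy": 0.1},
    "VERB": {"runs": 0.2},
}

def sequence_likelihood(words, tags):
    """P(words, tags) = P(t1) P(w1|t1) * prod_i P(ti|ti-1) P(wi|ti)."""
    p = START[tags[0]] * EMIT[tags[0]].get(words[0], 0.0)
    for prev, tag, word in zip(tags, tags[1:], words[1:]):
        p *= TRANS[prev][tag] * EMIT[tag].get(word, 0.0)
    return p

words = ["the", "happy", "puppy", "runs"]
print(sequence_likelihood(words, ["DET", "ADJ", "NOUN", "VERB"]))
```

Because the ADJ row gives a high probability to a following NOUN, the tagging DET-ADJ-NOUN-VERB scores well, while taggings that never emit "happy" from their tag score zero.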
Scenario: You see a guard carrying a RED BOX. Where is the treasure hidden?
This detective (the Forward-Backward algorithm) calculates the probability of being in a specific state at a specific time, given the entire sequence of observations.
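A compact Forward-Backward sketch for the treasure scenario. The two rooms, box colors, and all probabilities are invented; the point is that the posterior for each time step combines evidence from the whole observation sequence, before and after:

```python
# Toy HMM: the treasure is in RoomA or RoomB (hidden); the guard's
# box color is what we observe. All numbers are illustrative.
STATES = ["RoomA", "RoomB"]
START = {"RoomA": 0.5, "RoomB": 0.5}
TRANS = {"RoomA": {"RoomA": 0.7, "RoomB": 0.3},
         "RoomB": {"RoomA": 0.3, "RoomB": 0.7}}
EMIT = {"RoomA": {"red": 0.9, "blue": 0.1},
        "RoomB": {"red": 0.2, "blue": 0.8}}

def forward_backward(obs):
    """Posterior P(state_t | all observations), for every t."""
    n = len(obs)
    # Forward pass: alpha[t][s] = P(obs[0..t], state_t = s)
    alpha = [{s: START[s] * EMIT[s][obs[0]] for s in STATES}]
    for t in range(1, n):
        alpha.append({s: EMIT[s][obs[t]] *
                      sum(alpha[t - 1][p] * TRANS[p][s] for p in STATES)
                      for s in STATES})
    # Backward pass: beta[t][s] = P(obs[t+1..] | state_t = s)
    beta = [dict.fromkeys(STATES, 1.0) for _ in range(n)]
    for t in range(n - 2, -1, -1):
        beta[t] = {s: sum(TRANS[s][q] * EMIT[q][obs[t + 1]] * beta[t + 1][q]
                          for q in STATES) for s in STATES}
    # Combine and normalise into per-step posteriors.
    posteriors = []
    for t in range(n):
        joint = {s: alpha[t][s] * beta[t][s] for s in STATES}
        z = sum(joint.values())
        posteriors.append({s: joint[s] / z for s in STATES})
    return posteriors

for p in forward_backward(["red", "red", "blue"]):
    print(p)
```

Seeing a red box twice pulls the early posteriors toward RoomA, even though the final blue box points toward RoomB at the end.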
Unlike the previous detective, the Viterbi algorithm cares only about the single most likely path of hidden states.
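A Viterbi sketch over the same kind of toy treasure HMM (states and numbers invented for illustration). Instead of summing over all paths, each step keeps only the best path into each state, plus a back-pointer so the winning path can be traced out at the end:

```python
# Toy HMM: hidden rooms, observed box colors (numbers illustrative).
STATES = ["RoomA", "RoomB"]
START = {"RoomA": 0.5, "RoomB": 0.5}
TRANS = {"RoomA": {"RoomA": 0.7, "RoomB": 0.3},
         "RoomB": {"RoomA": 0.3, "RoomB": 0.7}}
EMIT = {"RoomA": {"red": 0.9, "blue": 0.1},
        "RoomB": {"red": 0.2, "blue": 0.8}}

def viterbi(obs):
    """Single most likely hidden-state path for the observations."""
    # delta[s]: probability of the best path ending in state s.
    delta = {s: START[s] * EMIT[s][obs[0]] for s in STATES}
    back = []  # back[t][s]: best predecessor of s at step t
    for o in obs[1:]:
        ptr, nxt = {}, {}
        for s in STATES:
            best = max(STATES, key=lambda p: delta[p] * TRANS[p][s])
            ptr[s] = best
            nxt[s] = delta[best] * TRANS[best][s] * EMIT[s][o]
        back.append(ptr)
        delta = nxt
    # Trace back from the best final state.
    path = [max(delta, key=delta.get)]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi(["red", "red", "blue"]))
```

The `max` at each step is the whole difference from Forward-Backward, which uses `sum` in the same place: one best path versus all paths combined.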
Turning sound waves (observed) into written words (hidden). HMMs handle accents, speed, and noise.
Spotting specific entities like names, cities, or companies in a wall of text using context patterns.
Learning the style of Shakespeare to generate new plays. Predicting the next word based on current state.
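That generation idea fits in a few lines: learn which words follow which from a corpus, then repeatedly sample a successor of the current word. The tiny one-line "corpus" below is a stand-in for a real collection of plays:

```python
import random
from collections import defaultdict

# Tiny stand-in corpus; a real model would train on full plays.
corpus = "to be or not to be that is the question".split()

# Bigram table: word -> list of words observed to follow it.
follows = defaultdict(list)
for cur, nxt in zip(corpus, corpus[1:]):
    follows[cur].append(nxt)

def generate(start, n, seed=0):
    """Emit up to n words, each chosen only from the current word."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(n):
        options = follows.get(words[-1])
        if not options:  # dead end: this word has no observed successor
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(generate("to", 6))
```

Because successors are sampled with their observed frequencies, common continuations dominate the output, which is why such models "sound like" their training text while still only ever looking one word back.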