
NLP · AI
Why GPT Can't Count: Visualizing the BPE Tokenization Trap
Interactive visualization of Byte Pair Encoding (BPE) tokenization. Discover why GPT thinks 9.11 > 9.9 and watch the tokenizer chop your text in real-time.
December 5, 2024 · 8 min read
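The "9.11 > 9.9" trap comes down to how BPE carves numbers into tokens. The sketch below is a toy greedy longest-match tokenizer over a tiny hypothetical vocabulary (real BPE learns merges from byte-pair frequencies); it only illustrates how "11" can become a single opaque token, so the model never sees digits it could compare place by place.

```python
# Toy greedy longest-match tokenizer over a tiny, hypothetical vocabulary.
# Real BPE merges frequent byte pairs; this only illustrates the effect.
VOCAB = {"9", ".", "1", "11"}

def tokenize(text: str) -> list[str]:
    """Greedily match the longest vocabulary entry at each position."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try longest substring first
            if text[i:j] in VOCAB:
                tokens.append(text[i:j])
                i = j
                break
        else:
            raise ValueError(f"no token covers {text[i]!r}")
    return tokens

print(tokenize("9.11"))  # ['9', '.', '11'] -- "11" is one token, not two digits
print(tokenize("9.9"))   # ['9', '.', '9']
```

Because the fractional parts surface as the unrelated tokens "11" and "9", "bigger token id / longer token" heuristics learned from text can beat actual numeric order.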
Natural Language Processing · 4 posts


Interactive visualization of RNN vs Transformer architecture. See why RNNs forget over long sequences and how Transformers sidestep the vanishing gradient problem.

Interactive visualization of self-attention in transformers. See how LLMs decide which words matter using Query, Key, Value.

Interactive visualization of Word2Vec and word embeddings. See why "good" and "great" were strangers in one-hot encoding but neighbors in vector space.