Cross-entropy
Entropy is the expected negative log-probability of an outcome, averaged over all outcomes of a distribution. Cross-entropy is a similar calculation, taking the log-probabilities from one distribution but averaging them over a different distribution. These concepts are a good excuse for reading Claude Shannon's classic paper, A Mathematical Theory of Communication, and cross-entropy in particular is the most popular loss function for language model training.
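As a brief sketch of the definitions above (standard notation, not taken from the source): for a discrete distribution p, entropy averages -log p(x) under p itself, while cross-entropy averages -log q(x) from a second distribution q under p.

$$
H(p) = -\sum_x p(x)\,\log p(x), \qquad H(p, q) = -\sum_x p(x)\,\log q(x).
$$

When q equals p, the cross-entropy reduces to the entropy; in language model training, p is typically the one-hot (empirical) distribution over the observed next token and q is the model's predicted distribution, so minimizing cross-entropy pushes the model's probabilities toward the data.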