Cross-Entropy Loss Demo

True Label: Cat (class 0)
Class   y (one-hot)   Predicted probability
Cat     1             0.70
Dog     0             0.20
Bird    0             0.10
Because the label is one-hot, the cross-entropy sum reduces to the negative log-probability of the correct class: L = -log(P(correct class)) = -log(0.70) ≈ 0.36
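As a quick check, here is a minimal Python sketch of the same calculation. The variable names and the class ordering [Cat, Dog, Bird] are illustrative assumptions, not part of the demo.

```python
import math

# Predicted probabilities from the model (e.g. softmax output), in class order [Cat, Dog, Bird].
probs = [0.70, 0.20, 0.10]

# One-hot encoded true label: Cat is class 0.
y_true = [1, 0, 0]

# Cross-entropy: L = -sum_i y_i * log(p_i).
# With a one-hot label, only the correct class contributes, so this
# reduces to -log(probability assigned to the correct class).
loss = -sum(y * math.log(p) for y, p in zip(y_true, probs))

print(f"Cross-entropy loss: {loss:.2f}")  # prints 0.36
```

The same value falls out of the reduced form directly: -math.log(0.70) ≈ 0.357, which rounds to 0.36.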