Intelligence.Log
2022-10-11
Extracted: 1 item. Sources: YouTube.
YT
We take the 2-layer MLP (with BatchNorm) from the previous video and backpropagate through it manually, without using PyTorch autograd's loss.backward().
👁 335.4k Views | Andrej Karpathy
"This video demonstrates manual backpropagation through a complete 2-layer MLP with BatchNorm, covering gradients from cross entropy loss through embedding tables. It builds intuitive understanding of gradient flow at the tensor level, beyond scalar implementations like micrograd, while reinforcing core deep learning concepts."
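The summary above mentions gradients flowing from the cross-entropy loss backward through the network. A minimal sketch of that first backward step, computing dloss/dlogits by hand and checking it numerically, might look like the following (a NumPy illustration under assumed shapes, not the video's actual code):

```python
import numpy as np

# Tiny assumed setup: batch of 4 examples, 5 classes.
np.random.seed(0)
logits = np.random.randn(4, 5)
targets = np.array([1, 3, 0, 2])

def cross_entropy(logits, targets):
    # Forward pass: numerically stable softmax, then mean negative log-likelihood.
    shifted = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    return probs, -np.log(probs[np.arange(len(targets)), targets]).mean()

probs, loss = cross_entropy(logits, targets)

# Manual backward: for mean cross-entropy over softmax,
# dloss/dlogits = (probs - one_hot(targets)) / batch_size.
dlogits = probs.copy()
dlogits[np.arange(len(targets)), targets] -= 1.0
dlogits /= len(targets)

# Sanity check against a finite-difference estimate for one entry.
i, j, eps = 2, 3, 1e-5
perturbed = logits.copy()
perturbed[i, j] += eps
_, loss_plus = cross_entropy(perturbed, targets)
numeric = (loss_plus - loss) / eps
print(abs(numeric - dlogits[i, j]) < 1e-4)  # gradients should agree closely
```

In the video this analytic gradient is the starting point; the same check-by-comparison pattern is then repeated layer by layer back through BatchNorm and the embedding table.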
-- END OF LOG --
[STATS] 1 item · Filter applied
Powered by Horizon + DeepSeek