#concept

backpropagation >> the process of repeatedly running a forward pass (compute predictions and the error) and a backward pass (compute gradients and update the weights) on a model in order to minimize the error, until a stopping criterion is reached.
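A minimal sketch of that loop at the smallest possible scale: one scalar parameter, one training example, and the gradient written out by hand via the chain rule. This is not Karpathy's micrograd API, just the same forward/backward/update idea; the values (`x`, `y`, `lr`, the stopping threshold) are arbitrary choices for illustration.

```python
x, y = 2.0, 6.0   # one training example: we want w * x == y, so w should reach 3
w = 0.0           # scalar parameter to learn
lr = 0.1          # learning rate

for step in range(100):
    # forward pass: compute the prediction and the error (squared loss)
    pred = w * x
    loss = (pred - y) ** 2
    # backward pass: d(loss)/dw by the chain rule
    grad = 2 * (pred - y) * x
    # update: step the parameter against its gradient
    w -= lr * grad
    # stopping criterion: error small enough
    if loss < 1e-8:
        break

print(w)  # converges toward 3.0
```

In micrograd the `grad` line is what the `Value` graph automates: each operation records how to propagate gradients backward, so `loss.backward()` fills in `w.grad` without the chain rule being written by hand.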

References

  1. Build micrograd, a scalar-based neural network with Karpathy

Notes

Definitions of backpropagation vary: some cover the full loop of repeated forward and backward passes, some only the gradient computation within the backward pass, and some just the backward pass itself. There doesn't seem to be a clear consensus.