#concept-pamphlet
What is micrograd? Why is it called micrograd? A small repo by Andrej Karpathy that shows, in a very simple way, how neural networks are trained under the hood. It is a scalar-valued autograd engine, i.e. an automatic gradient/differentiation calculator. It leaves out the efficiency machinery of real frameworks, such as batching operations into matrices/tensors.
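To make "scalar-valued autograd" concrete, here is a minimal sketch in the spirit of micrograd's `Value` class. This is a simplified, assumption-laden sketch rather than micrograd's actual code: only `+` and `*` are implemented, every quantity is a single scalar, and each operation records how to pass gradients back to its inputs.

```python
class Value:
    """A scalar that remembers the ops that produced it, so gradients can flow back."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # how to push out.grad onto the parents
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad        # d(a+b)/da = 1
            other.grad += out.grad       # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # visit children before parents, then run the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0                  # d(out)/d(out) = 1
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(-3.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)   # dc/da = b + 1 = -2.0, dc/db = a = 2.0
```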
References
Notes
- you can even treat tanh as a single atomic operation in backpropagation, since its local derivative (1 - tanh(x)^2) is known in closed form; more generally, you can backpropagate through any composite of differentiable functions via the chain rule (see the sketch below)
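A quick sketch of that point (the names `dtanh` and `dtanh_composite` are illustrative, not micrograd functions): whether you differentiate tanh as one atomic op, or expand it as tanh(x) = (e^{2x} - 1)/(e^{2x} + 1) and chain-rule through exp and the quotient, you get the same gradient.

```python
import math

def dtanh(x):
    # tanh as a single op: local derivative in closed form
    t = math.tanh(x)
    return 1.0 - t * t

def dtanh_composite(x):
    # tanh expanded into exp, -, +, /: chain rule through each piece
    e = math.exp(2 * x)              # inner node e = exp(2x), de/dx = 2e
    den = e + 1                      # tanh(x) = (e - 1) / (e + 1)
    # quotient rule: d/de[(e-1)/(e+1)] = ((e+1) - (e-1)) / (e+1)^2 = 2 / den^2
    return (2.0 / den**2) * (2 * e)  # multiply local derivatives along the chain

x = 0.7
print(dtanh(x))            # ~0.6347
print(dtanh_composite(x))  # same value, reached via the chain rule
```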