My learnings from Karpathy 1-3

Andrej Karpathy has put together one of the most awesome video series on NNs, and I just finished watching the first three lectures. I wanted to put out some of my ramblings on them. These are my notes, so that I internalize the material well and also so that I do not forget :) I have become a bit forgetful of late (blame it on the forties! :| ).

Lecture 1: Backpropagation is the core of any modern NN.
Derivative of a function: if you slightly bump up x by a small h, how does f respond? (f(x + h) - f(x)) / h.
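Here is a minimal sketch of that idea in Python (my own illustration, not code lifted from the lecture; the function f is just an arbitrary example):

```python
def f(x):
    return 3 * x**2 - 4 * x + 5  # an arbitrary example function

def numerical_derivative(f, x, h=1e-5):
    # forward-difference approximation: slope = (f(x + h) - f(x)) / h
    return (f(x + h) - f(x)) / h

print(numerical_derivative(f, 3.0))  # ~14.0, since f'(x) = 6x - 4 and f'(3) = 14
```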

Lecture 2: Bigram Model.
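A rough sketch of what I take away from the bigram idea (my own toy example, assuming a small word list; not the lecture's actual code): count how often each character follows another, then turn the counts into next-character probabilities.

```python
from collections import Counter

words = ["emma", "olivia", "ava"]  # toy dataset

counts = Counter()
for w in words:
    chars = ["<S>"] + list(w) + ["<E>"]  # add start/end markers
    for c1, c2 in zip(chars, chars[1:]):
        counts[(c1, c2)] += 1

def next_char_probs(prev):
    # normalize the counts of characters that follow `prev` into probabilities
    following = {c2: n for (c1, c2), n in counts.items() if c1 == prev}
    total = sum(following.values())
    return {c: n / total for c, n in following.items()}

print(next_char_probs("a"))  # e.g. {'<E>': 0.75, 'v': 0.25} for the toy words
```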


Lecture 3: Basic Language Model.
