Follow the full discussion on Reddit.
So I built a neural-network-type thing that uses Hebbian learning rules instead of backprop/gradient descent, and from what I can tell it's learning MNIST in under 150 generations with only ~7500 neurons. Is this good? My issue is that I'm not really sure how to test it against similar networks. Can anyone help me figure out whether what I have is decent, and how to benchmark it against other SOTA methods?
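For context, a Hebbian update is purely local: each weight changes based only on its pre- and post-synaptic activity, with no backpropagated error signal. Below is a minimal sketch of one such rule (Oja's rule) in NumPy on MNIST-shaped inputs. This is not the poster's network, whose architecture and exact rule aren't described in the post; the layer size, learning rate, and use of random data are illustrative assumptions.

```python
# Minimal sketch of a Hebbian-style update (Oja's rule) -- NOT the poster's
# actual method. Illustrates a local, gradient-free weight update on
# MNIST-shaped inputs (784 features). All sizes/rates are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 784   # flattened 28x28 MNIST image
n_units = 64     # hypothetical number of Hebbian units
lr = 0.01        # hypothetical learning rate

W = rng.normal(scale=0.01, size=(n_units, n_inputs))

def hebbian_step(W, x, lr):
    """One Oja's-rule update: dW_ij = lr * y_i * (x_j - y_i * W_ij)."""
    y = W @ x                                          # post-synaptic activations
    W += lr * (np.outer(y, x) - (y ** 2)[:, None] * W) # local update, no backprop
    return W

# Random vectors stand in for MNIST images here; real inputs would be
# loaded from the actual dataset instead.
for _ in range(100):
    x = rng.random(n_inputs)
    W = hebbian_step(W, x, lr)
```

The point of the sketch is only that the update for each weight depends on quantities available at that connection, which is what distinguishes Hebbian-style learning from gradient-descent training.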