Comments
Follow the full discussion on Reddit.
Transformers heavily rely on MLPs. Presumably, the facts that LLMs can recall are stored in these MLP layers (see, e.g., the ROME paper). These MLPs account for most of a model's parameters.
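
To make the "huge" claim concrete, here is a minimal sketch of the position-wise feed-forward (MLP) block in a standard Transformer layer, comparing its parameter count to the attention projections. The dimensions (d_model=768 with a 4x hidden expansion) are assumptions following the common GPT-2-style convention, not taken from the comment above.

```python
import torch.nn as nn

# Assumed GPT-2-style dimensions for illustration.
d_model = 768
d_ff = 4 * d_model  # hidden width of the MLP, conventionally 4x the model width

# The per-layer MLP block: up-projection, nonlinearity, down-projection.
mlp = nn.Sequential(
    nn.Linear(d_model, d_ff),
    nn.GELU(),
    nn.Linear(d_ff, d_model),
)

# Rough per-layer comparison: the MLP holds about 8 * d_model^2 weights,
# versus about 4 * d_model^2 for the Q/K/V/output attention projections.
mlp_params = sum(p.numel() for p in mlp.parameters())
attn_params = 4 * d_model * d_model  # biases omitted for simplicity
print(f"MLP params per layer:       {mlp_params:,}")
print(f"Attention params per layer: {attn_params:,}")
```

With these assumed sizes, the MLP block carries roughly twice as many weights per layer as the attention projections, which is consistent with the intuition that recalled facts are stored there.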