Follow the full discussion on Reddit.
I used Google's Trax library to train a Reformer model with a 65k-token context length on Wikipedia. This post includes the code, takeaways, model weights, and samples from the model, as well as a repo and Colab notebook so others can fine-tune it.
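For readers who want a sense of what that looks like in Trax, here is a minimal sketch of instantiating a Reformer language model with a long context window. It is not the author's exact training setup (see the linked repo and Colab for that); the hyperparameters below are illustrative assumptions, and keyword names may vary slightly between Trax versions.

```python
import trax

# Illustrative sketch only -- hyperparameters are assumptions, not the
# released model's configuration.
model = trax.models.ReformerLM(
    vocab_size=32000,   # assumed subword vocabulary size for Wikipedia text
    d_model=512,
    d_ff=2048,
    n_layers=6,
    n_heads=8,
    max_len=65536,      # the ~65k context length that motivates Reformer's
                        # LSH attention and reversible layers
    mode='train',
)
```

The long `max_len` is the point of using Reformer here: its LSH attention and reversible residual layers keep memory use manageable at context lengths where a standard Transformer would not fit on a single accelerator.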