Pushing CIFAR10 SOTA using ResNet family of models

Follow the full discussion on Reddit.
The goal of this project is to first replicate the ResNet SOTA results on CIFAR10 and then apply several recently published improvements to push this state of the art as far as possible. With these updates, I was able to achieve an error rate of 6.90% on the CIFAR10 test set using a 20-layer ResNet that consists of a mere 0.27M parameters. For comparison, the original ResNet20 had an error rate of 8.75%, so the performance of this 20-layer model is comparable to that of the original ResNet56, which was reported to have an error rate of 6.97%. Additionally, after including ResNeXt blocks, I was able to further reduce the error rate to 5.32%.
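For reference, here is a minimal sketch of the baseline 20-layer CIFAR-style ResNet from He et al. (three stages of three basic blocks with 16/32/64 channels, roughly 0.27M parameters). This is not the author's exact code; the class names and layer choices are illustrative assumptions. A ResNeXt-style variant would replace the 3x3 convolutions with grouped convolutions (cardinality > 1).

```python
# Minimal CIFAR-style ResNet-20 sketch (PyTorch); names are illustrative,
# not taken from the project repository.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BasicBlock(nn.Module):
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        # Projection shortcut when the spatial size or channel count changes.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + self.shortcut(x))


class ResNet20(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1, bias=False),
            nn.BatchNorm2d(16),
            nn.ReLU(inplace=True),
        )
        self.stage1 = self._make_stage(16, 16, stride=1)  # 32x32 feature maps
        self.stage2 = self._make_stage(16, 32, stride=2)  # 16x16
        self.stage3 = self._make_stage(32, 64, stride=2)  # 8x8
        self.fc = nn.Linear(64, num_classes)

    @staticmethod
    def _make_stage(in_ch, out_ch, stride, blocks=3):
        layers = [BasicBlock(in_ch, out_ch, stride)]
        layers += [BasicBlock(out_ch, out_ch) for _ in range(blocks - 1)]
        return nn.Sequential(*layers)

    def forward(self, x):
        out = self.stage3(self.stage2(self.stage1(self.stem(x))))
        out = F.adaptive_avg_pool2d(out, 1).flatten(1)
        return self.fc(out)


if __name__ == "__main__":
    model = ResNet20()
    n_params = sum(p.numel() for p in model.parameters())
    print(f"parameters: {n_params / 1e6:.2f}M")  # ~0.27M, matching the figure above
```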

