Order of layers in MLP

Follow the full discussion on Reddit.
There have been lots of debates about using dropout and batch normalization together. I don't have much experience combining the two, and my machine isn't powerful enough to experiment without losing a lot of time, so my question is: what order should the layers be in? To me, the most logical ordering is {...} -> (dense layer) -> (batch normalization) -> (activation) -> (dropout) -> {...}, though many seem to disagree.
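
For concreteness, here is a minimal sketch of that proposed ordering, assuming PyTorch; the library choice and the layer sizes are my own illustration, not from the discussion:

    import torch.nn as nn

    # Sketch of the proposed block ordering:
    # dense -> batch norm -> activation -> dropout
    # Sizes (784 -> 256 -> 10) are arbitrary, chosen only for illustration.
    model = nn.Sequential(
        nn.Linear(784, 256),   # dense layer
        nn.BatchNorm1d(256),   # batch normalization, before the activation
        nn.ReLU(),             # activation
        nn.Dropout(p=0.5),     # dropout, after the activation
        nn.Linear(256, 10),    # output layer
    )

The usual counter-proposal in these debates swaps the middle two steps, applying the activation before batch normalization (dense -> activation -> batch norm -> dropout).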
