MAX: Open Deep Learning models on Docker containers

Follow the full discussion on Reddit.
Hello! I work for an open-source team at IBM. For the past year we have been working on a project called the Model Asset eXchange (MAX). The goal of this project is to standardize DL model deployment and consumption: to make it easier to integrate DL models into web apps and services, or to deploy them on any cloud platform.

So far we have around 25 models in the project. Most of the underlying models are SOTA open-source models from various sources and model zoos (TensorFlow, PyTorch, Google Research, IBM Research, etc.). The value this project adds is a standardized interface to every model: a REST API, containerization, and inference-time optimizations such as loading the model graph just once and serving every API call from it. Each model has its own GitHub repo, and for convenience we have also hosted the Docker containers on a public endpoint for people to try out. Where possible we have also extended deployment to other channels, such as Node-RED (npm), CodePen, and demo web apps.

I would like your feedback/suggestions, and of course welcome issues/pull requests on the underlying GitHub repos as well!
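The standardized REST interface described above can be exercised with a few lines of client code. This is a minimal sketch, assuming a MAX model container is running locally (e.g. started with `docker run -p 5000:5000` and one of the project's images) and exposes its inference route at `/model/predict`, as the MAX model repos document; the exact JSON payload shape varies per model, so the `{"text": [...]}` body here is an illustrative assumption and each model's own API docs should be checked.

```python
import json
from urllib import request

# Assumed local endpoint of a MAX model container, e.g. started with:
#   docker run -p 5000:5000 <max-model-image>
BASE_URL = "http://localhost:5000"


def predict_url(base_url=BASE_URL):
    # All MAX models serve inference from the same standardized route.
    return base_url + "/model/predict"


def predict(texts, base_url=BASE_URL):
    """POST a JSON payload to the model's REST endpoint and return the parsed result.

    The {"text": [...]} shape is a per-model assumption; image models, for
    example, take a multipart file upload instead.
    """
    payload = json.dumps({"text": texts}).encode("utf-8")
    req = request.Request(
        predict_url(base_url),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Requires a running container; prints the model's JSON prediction.
    print(predict(["example input sentence"]))
```

Because every model answers on the same route with a JSON response, the same client code works across all ~25 models by swapping the container image, which is the point of the standardization.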
