Serverless machine learning
Machine learning (ML) workloads have traditionally run on virtual machines (VMs), or on services that orchestrate fleets of VMs. Engineers who have worked in serverless environments and know their benefits often ask whether the same serverless model can be applied to ML.
While looking for an answer, I came across this website, a community for building ML systems using Python and MLOps best practices.
It offers an interesting perspective on serverless machine learning. In short, here is an excerpt from the same link:
Serverless machine learning solves the problem of how to build and operate supervised machine learning systems in Python without having to first learn how to install, configure, and operate complex computational and data storage infrastructure.
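To make the idea concrete, here is a minimal sketch of what a serverless prediction service might look like: a stateless handler function in the style of a cloud function entry point. Everything here is illustrative, not taken from the course — the hard-coded weights stand in for a real trained model, and the `event`/`context` signature mimics the shape most serverless platforms use.

```python
import json

# Hypothetical "model": hard-coded linear weights standing in for a real
# trained model that a production system would load from object storage
# or a model registry once per container cold start.
WEIGHTS = [0.4, 0.6]
BIAS = -0.2


def load_model():
    # Placeholder for fetching and deserializing the model artifact.
    return WEIGHTS, BIAS


MODEL = load_model()  # loaded once at cold start, reused across invocations


def handler(event, context=None):
    """Stateless prediction function.

    `event` carries the request payload; `context` mimics the runtime
    metadata object serverless platforms typically pass in (unused here).
    """
    features = json.loads(event["body"])["features"]
    weights, bias = MODEL
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": score}),
    }
```

A call such as `handler({"body": json.dumps({"features": [1.0, 2.0]})})` returns a JSON response with the computed score. The key property is that the function holds no per-request state: the platform can scale it to zero or to many instances, which is what removes the infrastructure-operation burden the excerpt describes.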
Through this site, the community has shared several useful resources on serverless ML:
- Blogs
- MLApps - a collection of projects, prediction services, tools, tutorials, and examples built with serverless ML tools and concepts.
- Serverless Machine Learning Course - a free online course that gets you comfortable with serverless ML by building a prediction service. The course is also available on this YouTube channel.
I hope these resources help you build your knowledge of serverless machine learning!