TensorFlow Model Serving using KServe: A Step-by-Step Guide

To set up KServe you need a Kubernetes cluster, so we will use Minikube to run one locally. Minikube is a tool that runs a single-node Kubernetes cluster on your local machine. It is designed as a lightweight, easy-to-use option for developers who want to experiment with Kubernetes, develop applications, or test deployments locally without needing access to a full-scale Kubernetes cluster.
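The local setup can be sketched roughly as follows. This assumes KServe's quick-install script at a specific release branch and the public flowers sample model from the KServe documentation; check the KServe docs for the current version and install method before running:

```shell
# Start a local single-node cluster (these resource flags are a reasonable
# minimum for KServe and its dependencies).
minikube start --cpus=4 --memory=8192

# Install KServe plus its dependencies (Istio, Knative Serving, cert-manager)
# using the quick-install script; the release branch here is an assumption,
# pin it to the version you actually want.
curl -s "https://raw.githubusercontent.com/kserve/kserve/release-0.11/hack/quick_install.sh" | bash

# Deploy a TensorFlow InferenceService using the public sample model
# referenced in the KServe documentation.
cat <<EOF | kubectl apply -f -
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: flowers-sample
spec:
  predictor:
    model:
      modelFormat:
        name: tensorflow
      storageUri: "gs://kfserving-examples/models/tensorflow/flowers"
EOF

# Wait until the service reports READY=True before sending predictions.
kubectl get inferenceservice flowers-sample
```

Once the InferenceService is ready, predictions go through the cluster's ingress gateway as ordinary HTTP requests.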

Machine Learning Model Deployment using Docker Container

Model deployment is the next, and a very important, step once you have finished model training and development. There are many ways to deploy a model depending on the type of serving required: batch serving, online serving, real-time serving, or streaming-based serving. In this article I explain one deployment mechanism, online serving through APIs: how to package models in a Docker container and run them in production efficiently and reliably.
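Before containerizing anything, the serving piece itself is just an HTTP endpoint that accepts features and returns a prediction. A minimal sketch using only the Python standard library (the linear "model" and the request shape are placeholders for illustration; a real service would load your trained model and typically use a framework such as FastAPI or Flask inside the container):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    # Hypothetical model: a fixed linear scorer standing in for a real
    # trained model that you would load from disk at startup.
    weights = [0.4, 0.6]
    return sum(w * x for w, x in zip(weights, features))

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body, e.g. {"features": [1.0, 2.0]}.
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep request logging quiet for this sketch.
        pass

# To serve inside the container, bind the port you EXPOSE in the Dockerfile:
# HTTPServer(("0.0.0.0", 8080), PredictHandler).serve_forever()
```

The Docker part then reduces to copying this script and the model artifact into an image, exposing the port, and running the server as the container's entrypoint.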

Feature Store in Machine Learning

A feature store in machine learning stores features in both online and offline stores for model training and serving. It ensures consistency between the data used for model training and the data used during online serving to models. In other words, it guarantees that you are serving the same data to models during training and prediction, eliminating training-serving skew. Feast is a popular open source feature store.
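To make the consistency guarantee concrete, here is a toy in-memory sketch (this is an illustration of the concept, not Feast's actual API): the offline store keeps full feature history for building training data with point-in-time correct lookups, while the online store is materialized from that same history, so training and serving reads agree.

```python
class ToyFeatureStore:
    """Toy illustration of offline/online feature stores; not Feast's API."""

    def __init__(self):
        self._offline = {}  # entity_id -> [(event_timestamp, features), ...]
        self._online = {}   # entity_id -> latest materialized features

    def ingest(self, entity_id, event_timestamp, features):
        # The offline store keeps full history for training datasets.
        self._offline.setdefault(entity_id, []).append((event_timestamp, features))
        # "Materialization": the online store keeps only the latest value.
        history = self._offline[entity_id]
        self._online[entity_id] = max(history, key=lambda row: row[0])[1]

    def get_historical_features(self, entity_id, as_of):
        # Point-in-time correct lookup for building training data: the
        # latest features known at `as_of`, never values from the future.
        past = [row for row in self._offline.get(entity_id, []) if row[0] <= as_of]
        return max(past, key=lambda row: row[0])[1] if past else None

    def get_online_features(self, entity_id):
        # Low-latency lookup used at prediction time.
        return self._online.get(entity_id)
```

Because both reads are derived from the same ingested history, a training row built "as of" the latest timestamp is identical to what the model sees online, which is exactly the skew a feature store eliminates.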
