Applications sending prediction requests to online models often must first contact a feature store to construct the complete model input (the feature vector). This integration work is both time consuming and error prone, and it is not needed when you serve models using KServe on Hopsworks.
In this talk, we will walk you through how to develop and deploy online models for the open-source KServe and Hopsworks platforms in a manner that requires little operational experience, so you can easily put models into production.
We will show you how we can automatically generate a transformer script for KServe that joins application-supplied features with feature-store-supplied features and applies online feature transformations before calling the model. We will also show how you can close the data-for-AI flywheel by logging KServe predictions back to the feature store, where they can be turned into training data.
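To make the transformer pattern concrete, here is a minimal illustrative sketch in plain Python of what such a script does: merge application-supplied features with features looked up in the feature store, apply an online transformation, and log the prediction for later use as training data. The function names, feature names, and the in-memory `fetch_from_feature_store` lookup are all hypothetical stand-ins, not the generated Hopsworks code; a real KServe transformer would implement this logic in its preprocess/postprocess hooks against the online feature store.

```python
def fetch_from_feature_store(entity_id):
    # Hypothetical lookup; a real transformer would query the
    # Hopsworks online feature store by primary key.
    store = {42: {"avg_spend_30d": 120.5, "txn_count_7d": 14}}
    return store[entity_id]

def scale(value, mean, std):
    # Example of an online feature transformation applied at serving time.
    return (value - mean) / std

def preprocess(request):
    """Build the full feature vector: application-supplied features
    joined with feature-store-supplied features, then transformed."""
    features = dict(request["features"])                   # app-supplied part
    features.update(fetch_from_feature_store(request["entity_id"]))
    features["avg_spend_30d"] = scale(features["avg_spend_30d"], 100.0, 50.0)
    return features

def postprocess(request, prediction, prediction_log):
    """Log the prediction so it can later be joined with outcomes
    in the feature store and turned into training data."""
    prediction_log.append({"entity_id": request["entity_id"],
                           "prediction": prediction})
    return {"prediction": prediction}

request = {"entity_id": 42, "features": {"basket_size": 3}}
vector = preprocess(request)
# vector now holds basket_size, txn_count_7d, and the scaled avg_spend_30d
```

The key design point this sketch illustrates is that the application only needs to send its entity id and the few features it knows at request time; everything else is assembled server-side.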
Attend this webinar to watch a live demo of deploying production-quality online models without extensive MLOps expertise.
The Feature Store is the essential part of AI infrastructure that helps organisations bring modern enterprise data to analytical and operational ML systems. It is the simplest, most powerful way to get your models to production. From anywhere to anywhere. From months to minutes.