Are feature stores still relevant in the era of LLMs? They have gone out of fashion, but the problems they solve in building AI systems still need to be solved - for both LLM and ML systems:
- How do I provide historical and contextual data to hosted models?
- How do I centralize and govern all my data for AI - including both historical data for training and production data for inference?
- How do I create point-in-time consistent training and inference data from time-series data sources? (See the sketch after this list.)
- How do I ensure high data quality for training and inference?
- How do I monitor models and features in production?
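To make the point-in-time problem concrete, here is a minimal sketch of a point-in-time correct join using pandas' `merge_asof`, assuming an event/label table and a time-versioned feature table (the column names and data are illustrative). A feature store automates this kind of join across many feature groups, but the core idea is the same: for each event, pick the latest feature value that was known at that time, so no future information leaks into training data.

```python
import pandas as pd

# Events we want to predict on, each with the timestamp it occurred at.
events = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "event_ts": pd.to_datetime(["2024-01-05", "2024-02-01", "2024-01-20"]),
    "label": [0, 1, 0],
})

# Feature values as they changed over time for each customer.
features = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "feature_ts": pd.to_datetime(["2024-01-01", "2024-01-15", "2024-01-10"]),
    "avg_spend_30d": [42.0, 55.0, 13.0],
})

# merge_asof picks, for each event, the latest feature row with
# feature_ts <= event_ts -- i.e. no leakage of future information.
training_df = pd.merge_asof(
    events.sort_values("event_ts"),
    features.sort_values("feature_ts"),
    left_on="event_ts",
    right_on="feature_ts",
    by="customer_id",
    direction="backward",
)
print(training_df)
```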
We will look at how feature stores are evolving to become the engine for RAG and MCP servers. We will also look at how the highest value-creating AI systems today are real-time AI systems, and how you can build real-time AI systems with feature stores.
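As a rough sketch of the "engine for RAG" idea, the snippet below shows how precomputed features retrieved from an online store could be injected into an LLM prompt as context. `get_online_features`, `ONLINE_STORE`, and the feature names are hypothetical placeholders, not any specific feature store's API; in a real system the lookup would be a low-latency call to your feature store's online serving layer.

```python
# Hypothetical sketch: serving precomputed features as LLM context.
# ONLINE_STORE stands in for a low-latency online feature store lookup.
ONLINE_STORE = {
    ("customer", 1): {"avg_spend_30d": 55.0, "days_since_last_order": 3},
}

def get_online_features(entity: str, key: int) -> dict:
    """Illustrative stand-in for an online feature store read."""
    return ONLINE_STORE.get((entity, key), {})

def build_prompt(customer_id: int, question: str) -> str:
    # Retrieve fresh, precomputed context for this customer and
    # template it into the prompt sent to a hosted LLM.
    features = get_online_features("customer", customer_id)
    context = "\n".join(f"- {name}: {value}" for name, value in features.items())
    return (
        "You are a support assistant. Use the customer context below.\n"
        f"Customer context:\n{context}\n\n"
        f"Question: {question}"
    )

print(build_prompt(1, "Why was my order delayed?"))
```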