Hopsworks Feature Store with Airflow

Airflow can orchestrate feature, training, and batch inference pipelines as Python programs that read from and write to the Hopsworks Feature Store. Hopsworks also provides an Airflow operator for running Python or Spark jobs on Hopsworks.
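As a minimal sketch of the pattern described above: an Airflow DAG whose tasks trigger jobs registered on Hopsworks via the `hopsworks` Python client. The job names (`feature-pipeline`, `batch-inference-pipeline`) are hypothetical placeholders, and the exact client calls (`get_jobs_api`, `job.run`, the shape of the returned execution) are assumptions to be checked against the Hopsworks API documentation rather than taken from this page.

```python
# Sketch: an Airflow DAG orchestrating Hopsworks pipelines.
# Assumptions (not from this page): the `hopsworks` client is installed,
# HOPSWORKS_API_KEY is set in the environment, and jobs with these names
# are already registered in the Hopsworks project.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_hopsworks_job(job_name: str) -> None:
    """Log in to Hopsworks and run a pre-registered job to completion."""
    import hopsworks  # imported inside the task so the scheduler doesn't need it

    project = hopsworks.login()        # reads HOPSWORKS_API_KEY from the environment
    jobs_api = project.get_jobs_api()  # assumed client method; verify against docs
    job = jobs_api.get_job(job_name)
    execution = job.run(await_termination=True)
    if not execution.success:          # assumed attribute on the execution object
        raise RuntimeError(f"Hopsworks job {job_name!r} failed")


with DAG(
    dag_id="hopsworks_ml_pipelines",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    features = PythonOperator(
        task_id="feature_pipeline",
        python_callable=run_hopsworks_job,
        op_args=["feature-pipeline"],
    )
    inference = PythonOperator(
        task_id="batch_inference_pipeline",
        python_callable=run_hopsworks_job,
        op_args=["batch-inference-pipeline"],
    )
    # Batch inference reads the features the upstream task wrote to the
    # Feature Store, so it only runs after the feature pipeline succeeds.
    features >> inference
```

The same structure extends to a training pipeline task scheduled between the two, or to replacing `PythonOperator` with the Hopsworks-provided Airflow operator mentioned above.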

Hopsworks Integrations

Airflow can be used as the orchestration engine for all your machine learning pipelines (feature pipelines, training pipelines, and batch inference pipelines).

GitHub: Hopsworks & Airflow

Other integrations

AWS Sagemaker
Spark (EMR, Databricks, Cloudera, HDInsight, DataProc)
Azure Synapse

© Hopsworks 2024. All rights reserved. Various trademarks held by their respective owners.
