openknowledge / mlops

MLOps showcase, in which we will go through the various phases of a Machine Learning project.
Apache License 2.0

Decouple Prediction Request #19

Open DJCordhose opened 1 year ago

DJCordhose commented 1 year ago

Getting a prediction currently only works synchronously, so if the prediction service is not available, the request simply fails.

The prediction service might be offline, though, either because it is simply down or because no reliable model is available.

It might make sense to decouple the request from the actual prediction using Kafka or similar streaming / queuing / messaging services.
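As a rough illustration of what such a decoupling could look like, here is a minimal sketch using `kafka-python`. The broker address, the topic name `prediction-requests`, and the `predict()` stand-in for the actual prediction service are all assumptions for the sake of the example, not part of the current codebase.

```python
"""Sketch: decouple prediction requests from the prediction itself via Kafka.

Assumptions (not part of this repo): a broker on localhost:9092, a topic
named 'prediction-requests', and a placeholder predict() function.
"""
import json

from kafka import KafkaConsumer, KafkaProducer


def submit_prediction_request(features: dict) -> None:
    """Client side: publish the request and return immediately,
    regardless of whether the prediction service is up."""
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("prediction-requests", value=features)
    producer.flush()


def predict(features: dict) -> float:
    """Placeholder for the real prediction service call."""
    raise NotImplementedError


def run_prediction_worker() -> None:
    """Worker side: consume queued requests whenever a reliable model
    and the prediction service are available."""
    consumer = KafkaConsumer(
        "prediction-requests",
        bootstrap_servers="localhost:9092",
        group_id="prediction-workers",
        auto_offset_reset="earliest",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    )
    for message in consumer:
        result = predict(message.value)
        # The result would then be published to a reply topic or stored
        # wherever the caller expects to pick it up later.
        print(result)
```

With this kind of setup, a request is never lost when the service is down; it just waits in the topic until a worker with a usable model picks it up. Any other queue (RabbitMQ, Redis Streams, etc.) would work the same way.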