Open jlewi opened 5 years ago
We'd like fairing to support batch prediction.
See also #38 support deploying models.
We'd like to be able to fire off a batch predict job from a notebook using fairing.
As with online predict (#38), there are likely two cases.
In case #1, we can probably use our existing Beam transform for doing batch predict with a saved model.
In case #2, the user probably needs to write a batch_predict method that can then be invoked.
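As a rough illustration of what a user-supplied batch_predict method might look like, here is a minimal sketch. The function name and signature are hypothetical; fairing's actual interface for invoking user code is not specified in this issue.

```python
# Hypothetical sketch of a user-written batch_predict method.
# The signature (model_fn, inputs) is illustrative, not fairing's API.
from typing import Callable, Iterable, List


def batch_predict(model_fn: Callable[[float], float],
                  inputs: Iterable[float]) -> List[float]:
    """Run the model over a batch of inputs and collect predictions."""
    return [model_fn(x) for x in inputs]


# Toy usage: a stand-in "model" that doubles its input.
predictions = batch_predict(lambda x: 2 * x, [1.0, 2.0, 3.0])
```

fairing would then be responsible for packaging this method and running it remotely, as it already does for training code.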
We should consider whether to use Beam to parallelize the computation or just fire off a bunch of K8s Jobs.
/area engprod /kind feature