brnaguiar / mlops-next-watch

MLOps project that recommends movies to watch implementing Data Engineering and MLOps best practices.

Use lighter runtimes when doing real-time inferences #20

Closed brnaguiar closed 1 year ago

brnaguiar commented 1 year ago

Won't do: this is a batch-inference project, so real-time serving is out of scope. That said, we use Spark Pipelines with custom Transformers, so serializing the Pipeline for lighter runtime environments would be trivial if it were ever needed.
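
For context, a minimal sketch (not the project's actual code) of what that serialization looks like: a Spark ML Pipeline whose custom Transformer mixes in `DefaultParamsReadable`/`DefaultParamsWritable` can be saved and reloaded as a single bundle. The `ColumnCaster` class, column names, and save path below are illustrative assumptions, not code from this repo.

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F
from pyspark.ml import Pipeline, PipelineModel, Transformer
from pyspark.ml.feature import StringIndexer
from pyspark.ml.param.shared import HasInputCol, HasOutputCol
from pyspark.ml.util import DefaultParamsReadable, DefaultParamsWritable


class ColumnCaster(Transformer, HasInputCol, HasOutputCol,
                   DefaultParamsReadable, DefaultParamsWritable):
    """Illustrative custom Transformer: casts a string column to double.

    Mixing in DefaultParamsReadable/Writable is what lets this stage
    persist and reload together with the rest of the Pipeline."""

    def __init__(self, inputCol=None, outputCol=None):
        super().__init__()
        if inputCol is not None:
            self._set(inputCol=inputCol)
        if outputCol is not None:
            self._set(outputCol=outputCol)

    def _transform(self, df):
        return df.withColumn(
            self.getOutputCol(), F.col(self.getInputCol()).cast("double")
        )


spark = SparkSession.builder.master("local[1]").getOrCreate()
ratings = spark.createDataFrame(
    [("u1", "4"), ("u2", "5")], ["user", "raw_rating"]
)

pipeline = Pipeline(stages=[
    ColumnCaster(inputCol="raw_rating", outputCol="rating"),
    StringIndexer(inputCol="user", outputCol="user_idx"),
])

model = pipeline.fit(ratings)
model.write().overwrite().save("/tmp/nw-pipeline")  # serialize the fitted pipeline

# Any process that can import ColumnCaster can reload the whole bundle:
reloaded = PipelineModel.load("/tmp/nw-pipeline")
reloaded.transform(ratings).show()
```

Because the custom stage is persisted with the same `save`/`load` mechanism as the built-in stages, the fitted pipeline stays portable as one artifact; whether it is reloaded inside the batch job or exported elsewhere, the serialization step itself is the easy part.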