combust / mleap

MLeap: Deploy ML Pipelines to Production
https://combust.github.io/mleap-docs/
Apache License 2.0

Will batch prediction be supported? #545

Open yaochitc opened 5 years ago

yaochitc commented 5 years ago

It seems this is not supported currently. Since predicting row by row is slow, I'm wondering whether this feature will be added.
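
For reference, this is roughly what I mean by scoring row by row: one transform call per single-row frame. It is only a minimal sketch; the bundle path, feature columns, and the `prediction` output field are placeholders rather than my actual pipeline.

```scala
import ml.combust.bundle.BundleFile
import ml.combust.mleap.runtime.MleapSupport._
import ml.combust.mleap.core.types._
import ml.combust.mleap.runtime.frame.{DefaultLeapFrame, Row}
import resource._

object RowByRowScoring extends App {
  // Load a serialized MLeap bundle (placeholder path).
  val pipeline = (for (bf <- managed(BundleFile("jar:file:/tmp/ranking-model.zip")))
    yield bf.loadMleapBundle().get).opt.get.root

  // Hypothetical two-feature schema, just for illustration.
  val schema = StructType(
    StructField("feature_a", ScalarType.Double),
    StructField("feature_b", ScalarType.Double)).get

  // Candidate items to score.
  val items = Seq(Row(0.1, 0.2), Row(0.3, 0.4), Row(0.5, 0.6))

  // One transform call per item: every call pays the full pipeline overhead.
  val scores = items.map { row =>
    val frame = DefaultLeapFrame(schema, Seq(row))
    pipeline.transform(frame).get.select("prediction").get.dataset.head
  }
  scores.foreach(println)
}
```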

ancasarb commented 5 years ago

Could you please provide a bit more detail on your use case? In which scenario have you found row-by-row prediction slow? Thank you!

yaochitc commented 5 years ago

I'm using MLeap with a deep-learning-based ranking model built with BigDL, and I need to score hundreds of items for each user. If the scoring for one user is executed as a single batch, the prediction time is much shorter.
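
For concreteness, this is the shape of what I am after: all of one user's candidates packed into a single multi-row LeapFrame and scored with one transform call. Again just a sketch with a placeholder path, schema, and output field; the open question for me is whether the transformers would execute such a frame as a true batch internally instead of iterating row by row.

```scala
import ml.combust.bundle.BundleFile
import ml.combust.mleap.runtime.MleapSupport._
import ml.combust.mleap.core.types._
import ml.combust.mleap.runtime.frame.{DefaultLeapFrame, Row}
import resource._

object PerUserBatchScoring extends App {
  // Load the same placeholder bundle as in the earlier sketch.
  val pipeline = (for (bf <- managed(BundleFile("jar:file:/tmp/ranking-model.zip")))
    yield bf.loadMleapBundle().get).opt.get.root

  // Hypothetical schema for one user's candidate items.
  val schema = StructType(
    StructField("feature_a", ScalarType.Double),
    StructField("feature_b", ScalarType.Double)).get

  // Hundreds of candidate items for a single user, all in one frame.
  val candidates = (1 to 300).map(i => Row(i * 0.01, i * 0.02))
  val batchFrame = DefaultLeapFrame(schema, candidates)

  // A single transform call per user; whether the underlying model (BigDL
  // in my case) then scores these rows as one batch is the question.
  val scored = pipeline.transform(batchFrame).get
  val scores = scored.select("prediction").get.dataset.map(_.getDouble(0))
  println(s"Scored ${scores.size} candidates in one call")
}
```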