Open withsmilo opened 2 years ago

Hi, BentoML team. This is a new suggestion for Yatai. In general, when serving an ML model, the input provided to the model and the output returned by the model are stored in external storage and used for debugging or replay. This capability is required by most ML services that need "feedback", and it would be good if BentoML / Yatai provided it by default.

@withsmilo We are in the design phase of a model monitoring solution that offers APIs for logging features and inference results, plus configuration for shipping the logs to a destination of your choice. If possible, we can get on a call to walk through our design with you and verify that it meets your requirements.

@ssheng Really great news! Thanks!
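To make the request concrete, here is a minimal sketch of the kind of record such a feature would capture: one JSON line per inference, pairing the input features with the model output, so the log can later be replayed or inspected for debugging. The `log_inference` helper, the JSON-lines layout, and all field names are illustrative assumptions, not BentoML's or Yatai's actual API.

```python
import json
import tempfile
import uuid
from datetime import datetime, timezone
from pathlib import Path

def log_inference(log_dir: Path, features: dict, prediction: dict) -> Path:
    """Append one inference record (input features + model output) as a JSON line.

    Hypothetical helper for illustration; a real solution would ship these
    records to a configurable destination (object storage, a queue, etc.).
    """
    record = {
        "request_id": str(uuid.uuid4()),               # lets feedback be joined back later
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "features": features,                          # model input
        "prediction": prediction,                      # model output
    }
    log_file = log_dir / "inference.jsonl"
    with log_file.open("a") as f:
        f.write(json.dumps(record) + "\n")
    return log_file

# Example: log one request/response pair, then read it back as for a replay.
log_dir = Path(tempfile.mkdtemp())
path = log_inference(log_dir, {"age": 42, "income": 55000}, {"label": "approved"})
records = [json.loads(line) for line in path.read_text().splitlines()]
```

Because each record carries a `request_id`, downstream feedback (e.g. the true label) can be joined to the logged features and prediction, which is exactly the loop the suggestion asks for.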