Closed nitsky closed 2 years ago
Right now it is possible to make predictions from the CLI over stdin/stdout with `tangram predict`, but it would be nice to add an HTTP server for serving predictions which could be started like so: `tangram serve --model <MODEL_PATH>`.