awslabs / multi-model-server

Multi Model Server is a tool for serving neural net models for inference
Apache License 2.0

define inference proto #951

Open lxning opened 3 years ago

lxning commented 3 years ago

This feature requires:

  1. defining the inference messages in a proto file (see the sketch after this list)
  2. applying protobuf encoding/decoding in the inference endpoints
  3. adding test cases
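
A minimal sketch of what the proto definitions could look like, assuming proto3 and request/response messages modeled on the existing REST inference endpoint. All message and field names below are illustrative assumptions, not a committed design:

```proto
// Sketch only: names and fields are assumptions for discussion.
syntax = "proto3";

package mms.inference;

// Hypothetical inference request: identifies the target model and
// carries the raw input payload plus its content type.
message InferenceRequest {
  string model_name = 1;     // name of the registered model
  string model_version = 2;  // optional version; empty means default
  string content_type = 3;   // e.g. "application/json", "image/jpeg"
  bytes input = 4;           // raw request body
}

// Hypothetical inference response: status plus the serialized prediction.
message InferenceResponse {
  int32 status_code = 1;     // HTTP-style status of the prediction
  string content_type = 2;   // content type of the output payload
  bytes output = 3;          // serialized prediction result
}
```

Item 2 would then wire these messages into the inference endpoints, encoding incoming payloads into `InferenceRequest` and decoding `InferenceResponse` back into the response body.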