awslabs / multi-model-server

Multi Model Server is a tool for serving neural net models for inference
Apache License 2.0

support protobuf #957

Open lxning opened 4 years ago

lxning commented 4 years ago

Before or while filing an issue, please feel free to join our Slack channel to get in touch with the development team, ask questions, find out what's cooking, and more!

Issue #, if available:

951

Description of changes:

  1. define inference message proto

    • add inference.proto
    • update gradle to support protobuf build
  2. support protobuf encode/decode in inference channel

    • java code changes in worker data flow (ie. wlm dir)
    • java code changes in http handler (ie. http dir)
    • java code changes in tools (ie. util dir)
  3. add unit test cases

    • testPingProto
    • testPredictionsProto
    • testPredictionsModelNotFoundProto
  4. fix mms-ci-build issues

    • ci/Dockerfile.python3.6
    • mms/tests/unit_tests/test_beckend_metric.py
    • mms/tests/unit_tests/test_worker_service.py
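
Item 1 above defines the inference message proto. The PR carries the actual `inference.proto`; purely for orientation, a request/response schema for model inference typically looks like the following sketch (all message and field names here are illustrative assumptions, not the PR's real definitions):

```protobuf
// Hypothetical sketch only -- message and field names are assumptions,
// not the contents of the PR's inference.proto.
syntax = "proto3";

package com.amazonaws.ml.mms;

message InferenceRequest {
  string model_name = 1;           // target model to run inference against
  map<string, bytes> headers = 2;  // request metadata (e.g. content-type)
  bytes payload = 3;               // raw input bytes (tensor, image, text)
}

message InferenceResponse {
  int32 code = 1;                  // HTTP-style status code
  string message = 2;              // human-readable status
  bytes prediction = 3;            // serialized model output
}
```

With a schema like this, the `protoc` compiler (wired into the Gradle build per the second bullet of item 1) generates the Java classes used by the worker data flow and HTTP handlers.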

Testing done:

Ran unit tests locally.
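
As background for the encode/decode work in item 2: protobuf's wire format is built from base-128 varints and length-delimited (wire type 2) fields, which is what the inference channel must frame and parse. A minimal pure-Python sketch of that wire-level encoding (illustrative only; the PR's actual implementation is in Java and uses generated protobuf classes):

```python
def encode_varint(value):
    """Encode a non-negative int as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # continuation bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def decode_varint(data, pos=0):
    """Decode a varint from data starting at pos; return (value, new_pos)."""
    result = 0
    shift = 0
    while True:
        b = data[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not (b & 0x80):  # continuation bit clear: last byte
            return result, pos
        shift += 7

def encode_length_delimited(field_number, payload):
    """Encode a length-delimited field (wire type 2): tag, length, bytes."""
    tag = (field_number << 3) | 2
    return encode_varint(tag) + encode_varint(len(payload)) + payload
```

For example, `encode_varint(300)` yields `b'\xac\x02'`, and a `bytes` payload in field 1 is framed as `tag 0x0A`, a varint length, then the raw bytes.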

To run CI tests on your changes, refer to README.md.

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.