awslabs / multi-model-server

Multi Model Server is a tool for serving neural net models for inference
Apache License 2.0

Config value can have environment var at any position #841

Closed vdantu closed 5 years ago

vdantu commented 5 years ago

Before or while filing an issue, please feel free to join our Slack channel to get in touch with the development team, ask questions, find out what's cooking, and more!

Issue #, if available:

With these changes, MMS can accept a config like the following:

$ echo $INFERENCE_PORT
8080

$ cat config.properties
inference_address=http://0.0.0.0:$$INFERENCE_PORT$$

MMS will resolve the inference address to http://0.0.0.0:8080.
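
As a rough illustration (not the actual MMS implementation), this kind of substitution can be done with a small regex pass over each config value; the $$VAR$$ token format and the fallback of leaving unset variables untouched are assumptions for this sketch:

import os
import re

# Match $$VAR$$ tokens anywhere in a config value (token format assumed).
_TOKEN = re.compile(r"\$\$([A-Za-z_][A-Za-z0-9_]*)\$\$")

def resolve_env_vars(value):
    # Replace each $$VAR$$ with the environment variable's value;
    # leave the token as-is if the variable is not set (assumed fallback).
    return _TOKEN.sub(lambda m: os.environ.get(m.group(1), m.group(0)), value)

# Example: with INFERENCE_PORT=8080 in the environment,
# "http://0.0.0.0:$$INFERENCE_PORT$$" becomes "http://0.0.0.0:8080".
print(resolve_env_vars("http://0.0.0.0:$$INFERENCE_PORT$$"))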

Description of changes:

Testing done:

To run CI tests on your changes, refer to README.md.

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.