neuralmagic / deepsparse

Sparsity-aware deep learning inference runtime for CPUs
https://neuralmagic.com/deepsparse/

Update server docs to use v2 infer endpoints #1643

Closed · mgoin closed this 7 months ago

mgoin commented 7 months ago

Many doc files still used 'http://0.0.0.0:5543/predict' as the server inference endpoint. These should be updated to the v2 endpoint format: 'http://0.0.0.0:5543/v2/models/TASK_NAME/infer'.
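
For reference, a minimal sketch of a client request against the new v2 endpoint. The task name in the path and the payload shape are assumptions for illustration (here a text-classification pipeline); the exact request schema depends on the task being served, so consult the server docs for the pipeline in question.

```python
import requests

# Assumed setup: a DeepSparse Server running locally, serving a
# pipeline under the hypothetical task name "text_classification".
url = "http://0.0.0.0:5543/v2/models/text_classification/infer"

# Payload shape is illustrative; each task defines its own request schema.
payload = {"sequences": "DeepSparse Server now exposes v2 infer endpoints."}

response = requests.post(url, json=payload)
print(response.json())
```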