rhassan91 opened 8 months ago
Also have this problem; would love to know if streaming responses are somehow supported.
Solved it using custom containers.
Would it be possible to explain or share the base configuration of the custom container?
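Not the commenter's actual setup, but a minimal sketch of what a custom-container approach could look like: a FastAPI app baked into the image, with the scoring route returning a `StreamingResponse`. The file name `app.py`, the route paths, and the liveness endpoint are assumptions here; the deployment YAML would need matching `liveness_route`/`readiness_route` and port settings.

```python
# app.py -- hypothetical server inside a custom container image.
# Assumes FastAPI and uvicorn are installed in the image.
import time

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.get("/")
def liveness():
    # Route the endpoint's liveness/readiness probes would hit.
    return {"status": "healthy"}

@app.post("/score")
def score():
    def token_stream():
        # Yield chunks incrementally; StreamingResponse flushes each one.
        for i in range(5):
            yield f"token {i}\n"
            time.sleep(0.5)
    return StreamingResponse(token_stream(), media_type="text/plain")
```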
Hi, I deployed the following scoring script on a managed online endpoint on Azure ML using v2, but it fails to return a streaming response. However, running the Azure ML inference server locally does return a streaming response. Isn't the local deployment meant to replicate the deployed version's behavior? Is the Azure ML inference server capable of streaming responses?
```python
from azureml.contrib.services.aml_response import AMLResponse
import time

def init():
    pass

def run(request):
    # Bodies were elided in the original post; this is a minimal
    # reconstruction of a generator-based streaming attempt.
    def generate():
        for i in range(3):
            yield f"chunk {i}\n"
            time.sleep(1)
    return AMLResponse(generate(), 200)
```
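For reference, one way to check whether chunks actually arrive incrementally when testing locally; the URL below assumes the inference server's default port 5001, so adjust it for a deployed endpoint (and add the auth header the endpoint requires):

```python
# Quick client-side streaming check (port and empty payload are assumptions).
import requests

with requests.post("http://127.0.0.1:5001/score", json={}, stream=True) as resp:
    for chunk in resp.iter_content(chunk_size=None):
        # Chunks printing one at a time indicates a true streaming response;
        # everything arriving at once indicates the response was buffered.
        print(chunk.decode(), end="", flush=True)
```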