Hi! Is it possible for ThreadedStreamer to return a dictionary instead of a list? I'd like the function batch_prediction to return a dictionary, since my model does multitask training and I have results for multiple tasks, not just one.
When I try it, I get the following error:
Exception in thread thread_worker:
Traceback (most recent call last):
  File "/home/cgarcia/miniconda3/envs/PUC/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/home/cgarcia/miniconda3/envs/PUC/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "/home/cgarcia/miniconda3/envs/PUC/lib/python3.8/site-packages/service_streamer/service_streamer.py", line 154, in run_forever
    handled = self._run_once()
  File "/home/cgarcia/miniconda3/envs/PUC/lib/python3.8/site-packages/service_streamer/service_streamer.py", line 189, in _run_once
    self._send_response(client_id, task_id, request_id, model_outputs[i])
KeyError: 0
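For context, here is a minimal, self-contained sketch of the kind of batch_prediction I mean (the task names and toy outputs are made up, not my real model). Indexing its dict result positionally reproduces the KeyError: 0 above:

```python
# A minimal sketch (not the real model): a multitask batch_prediction
# that returns ONE dict for the whole batch, with a list of per-sample
# results under each (made-up) task name.
def batch_prediction(batch):
    return {
        "task_a": [len(x) for x in batch],     # stand-in for task A outputs
        "task_b": [x.upper() for x in batch],  # stand-in for task B outputs
    }

outputs = batch_prediction(["foo", "ba"])

# The streamer fetches each client's result positionally
# (model_outputs[i] in the traceback), so a dict keyed by task name
# raises KeyError: 0 when indexed with 0:
try:
    outputs[0]
except KeyError as err:
    print("KeyError:", err)  # -> KeyError: 0

# One possible workaround (a guess, not confirmed by the maintainers):
# return one dict PER SAMPLE, so the result is still a list the streamer
# can index positionally, while each element carries all task outputs.
def batch_prediction_per_sample(batch):
    return [{"task_a": len(x), "task_b": x.upper()} for x in batch]
```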
Thank you!