Open gsrujana opened 5 years ago
Hi, how can I change the batch size and the number of streams added for inference? Please let me know if there is a way to customize this. Also, what is the current batch size for inference? Thanks!

Hi, the only supported batch size at the moment is one. We can add support for a configurable batch size, but it is not currently a priority. The only way to process multiple streams right now is to add multiple elements to the pipeline, one per stream.
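For anyone else hitting this: assuming the project follows the usual GStreamer element model (the thread mentions "adding multiple elements to the pipeline"), the suggested workaround could be sketched roughly like this. The element name `inference` and the file sources are placeholders, not the project's actual plugin names — substitute the real element name from this repository.

```sh
# Hypothetical sketch: two independent streams, each routed through its
# own instance of the inference element (batch size 1 per instance).
# "inference" is a placeholder element name, not confirmed by the project.
gst-launch-1.0 \
  filesrc location=stream0.mp4 ! decodebin ! videoconvert ! inference ! fakesink \
  filesrc location=stream1.mp4 ! decodebin ! videoconvert ! inference ! fakesink
```

Each branch runs inference independently, so there is no cross-stream batching; throughput scales with the number of element instances rather than with batch size.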