Open Huy0110 opened 8 months ago
Currently, I don't see support for running multiple streams through batched inference. If I have many streams to process, batching their frames through a single model should be faster and more efficient than creating a separate YOLOv8 object for each stream.
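For context, here is a minimal sketch of the pattern I have in mind: collect one frame from each live stream, run a single batched call, and route results back by source. The `FakeStream` class and `collect_batch` helper are hypothetical stand-ins (e.g. for `cv2.VideoCapture` readers); the only Ultralytics-specific assumption is that `model(batch)` accepts a list of images and processes them as one batch.

```python
def collect_batch(streams):
    """Read the next frame from every stream; skip streams that are exhausted.

    Returns the frame batch plus a parallel list of stream ids, so each
    result in the batched output can be routed back to its source.
    """
    batch, sources = [], []
    for sid, stream in streams.items():
        frame = stream.read()  # assumed to return None when the stream ends
        if frame is not None:
            batch.append(frame)
            sources.append(sid)
    return batch, sources


class FakeStream:
    """Stub standing in for a real video capture object (hypothetical)."""

    def __init__(self, frames):
        self.frames = list(frames)

    def read(self):
        return self.frames.pop(0) if self.frames else None


streams = {
    "cam0": FakeStream(["frame0a", "frame0b"]),
    "cam1": FakeStream(["frame1a"]),
}

batch, sources = collect_batch(streams)
print(batch)    # one frame per live stream, ready for a single model call
print(sources)  # stream ids aligned with the batch entries
# A real pipeline would then do something like:
#   results = model(batch)          # one batched YOLOv8 inference call
#   for sid, res in zip(sources, results): handle(sid, res)
```

The point is that one model instance services all streams per iteration, instead of N model objects each running inference on a batch of one.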