RidgeRun / gst-inference

A GStreamer Deep Learning Inference Framework
GNU Lesser General Public License v2.1

Custom batch size and multiple stream input. #159

Open gsrujana opened 5 years ago

gsrujana commented 5 years ago

Hi, how can I change the batch size and the number of input streams used for inference? Please let me know if there is a way to customize these. What is the current batch size for inference? Thanks!

migueltaylor commented 5 years ago

Hi, the only supported batch size at the moment is one. We could add support for a configurable batch size, but it is not currently a priority. The only way to process multiple streams right now is to add a separate inference element to the pipeline for each stream.
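
In case it helps, here is a rough sketch of what that could look like with gst-launch-1.0: two independent capture branches, each feeding its own inference element instance. The camera devices, the tinyyolov2 element choice, the model path, and the backend/layer properties are placeholders based on typical gst-inference examples; adjust them to your model and GstInference version.

```
# Hypothetical sketch: one inference element per stream (no batching).
# Replace devices, model file, and backend layer names with your own.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! videoconvert ! tee name=t0 \
    t0. ! videoscale ! queue ! net0.sink_model \
    t0. ! queue ! net0.sink_bypass \
  tinyyolov2 name=net0 model-location=graph_tinyyolov2.pb backend=tensorflow \
    backend::input-layer=input/Placeholder backend::output-layer=add_8 \
  net0.src_bypass ! videoconvert ! fakesink \
  v4l2src device=/dev/video1 ! videoconvert ! tee name=t1 \
    t1. ! videoscale ! queue ! net1.sink_model \
    t1. ! queue ! net1.sink_bypass \
  tinyyolov2 name=net1 model-location=graph_tinyyolov2.pb backend=tensorflow \
    backend::input-layer=input/Placeholder backend::output-layer=add_8 \
  net1.src_bypass ! videoconvert ! fakesink
```

Each net0/net1 instance runs its own copy of the model, so memory and compute scale with the number of streams; that is the trade-off until batched inference is supported.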