SeanChenxy opened this issue 5 months ago
Yes, sure. We are working on this; please stay tuned.
@SeanChenxy can you please share your CLI inference command? It looks like mine does not work properly:
!python /content/VideoLLaMA2/videollama2/serve/cli.py --model-path DAMO-NLP-SG/VideoLLaMA2-7B-16F --image-file /content/test.mp4
Hi, I used the inference code provided in the README.
Hi, thanks for sharing your codes. Is there any way for multi-round chat in CLI inference?
Hello, I'm a PhD student from ZJU. I also use VideoLLaMA2 in my own research. We created a WeChat group to discuss VideoLLaMA2 issues and help each other; would you like to join us? Please contact me: WeChat number == LiangMeng19357260600, phone number == +86 19357260600, e-mail == liangmeng89@zju.edu.cn.
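A minimal sketch of how multi-round chat could be layered on top of single-turn CLI inference: accumulate prior question/answer pairs and prepend them to each new query. The `infer_fn` callback stands in for whatever single-turn inference call the repo provides (e.g. a wrapper around the README's inference code); the history handling itself is generic and not part of VideoLLaMA2's API.

```python
# Hypothetical multi-round chat loop; the inference backend is abstracted
# behind infer_fn, which is an assumption, not VideoLLaMA2's actual API.

def build_prompt(history, question):
    """Concatenate prior turns so each query carries conversation context."""
    turns = [f"USER: {q}\nASSISTANT: {a}" for q, a in history]
    turns.append(f"USER: {question}\nASSISTANT:")
    return "\n".join(turns)

def chat_loop(infer_fn, questions):
    """Run several rounds, feeding accumulated history into each call."""
    history, answers = [], []
    for q in questions:
        prompt = build_prompt(history, q)
        answer = infer_fn(prompt)  # single-turn call on the full transcript
        history.append((q, answer))
        answers.append(answer)
    return answers
```

In a real CLI session, `infer_fn` would keep the preprocessed video tensor fixed across rounds and only vary the text prompt, so the model sees the same video with a growing conversation transcript.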