DAMO-NLP-SG / VideoLLaMA2

VideoLLaMA 2: Advancing Spatial-Temporal Modeling and Audio Understanding in Video-LLMs

Multi-round chat in CLI mode #14

Open SeanChenxy opened 5 months ago

SeanChenxy commented 5 months ago

Hi, thanks for sharing your code. Is there any way to do multi-round chat in CLI inference?

lixin4ever commented 5 months ago

Yes, sure. We are working on this; please stay tuned.

DenisSergeevitch commented 5 months ago

@SeanChenxy, could you please share your CLI inference command? Mine does not seem to work properly: !python /content/VideoLLaMA2/videollama2/serve/cli.py --model-path DAMO-NLP-SG/VideoLLaMA2-7B-16F --image-file /content/test.mp4

SeanChenxy commented 5 months ago

> @SeanChenxy, could you please share your CLI inference command? Mine does not seem to work properly: !python /content/VideoLLaMA2/videollama2/serve/cli.py --model-path DAMO-NLP-SG/VideoLLaMA2-7B-16F --image-file /content/test.mp4

Hi, I used the inference code provided in the README.
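
Until official multi-round support lands, here is a minimal sketch of how multi-round chat could be approximated on top of the single-turn `model_init` / `mm_infer` helpers shown in the README. Folding earlier turns back into the instruction string is my own assumption, not an official feature, and the exact helper signatures and processor layout may differ between repo versions.

```python
# Minimal multi-round chat sketch built on the single-turn helpers from the README.
# Assumptions (not an official API): model_init/mm_infer work as in the README example,
# processor is a dict keyed by modality, and the model tolerates the running dialogue
# being folded into one instruction string.
import sys
sys.path.append('./')

from videollama2 import model_init, mm_infer
from videollama2.utils import disable_torch_init


def multi_round_chat(model_path, video_path):
    disable_torch_init()
    model, processor, tokenizer = model_init(model_path)

    modal = 'video'
    video_tensor = processor[modal](video_path)  # preprocess the video once, reuse every turn

    history = []  # list of (question, answer) pairs
    while True:
        question = input('USER: ').strip()
        if question.lower() in ('exit', 'quit', ''):
            break

        # Fold previous turns into the instruction so the model sees the dialogue context.
        context = ''.join(f'USER: {q}\nASSISTANT: {a}\n' for q, a in history)
        instruct = context + f'USER: {question}\nASSISTANT:'

        answer = mm_infer(video_tensor, instruct, model=model,
                          tokenizer=tokenizer, do_sample=False, modal=modal)
        print(f'ASSISTANT: {answer}')
        history.append((question, answer))


if __name__ == '__main__':
    # Paths taken from this thread; replace with your own model and video.
    multi_round_chat('DAMO-NLP-SG/VideoLLaMA2-7B-16F', '/content/test.mp4')
```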

LiangMeng89 commented 2 days ago

> Hi, thanks for sharing your code. Is there any way to do multi-round chat in CLI inference?

Hello, I'm a PhD student at ZJU. I also use VideoLLaMA2 in my own research. We have created a WeChat group to discuss VideoLLaMA2 issues and help each other; would you like to join us? Please contact me: WeChat: LiangMeng19357260600, phone: +86 19357260600, e-mail: liangmeng89@zju.edu.cn.