mit-han-lab / TinyChatEngine

TinyChatEngine: On-Device LLM Inference Library
https://mit-han-lab.github.io/TinyChatEngine/
MIT License

How to run VILA with multi-image understanding? #115

Open yg1988 opened 2 months ago

yg1988 commented 2 months ago

The VILA bash script seems to invoke the chat executable with only a single image parameter:

./chat VILA_2.7B INT4 5 $image_path

How can I use VILA's multi-image understanding by providing the model with more than one image path?
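One possible direction (not an answer from the maintainers, and not TinyChatEngine's actual API): since the stock ./chat entry point appears to take a single positional image path, multi-image input would likely require a small change to the argument parsing in the chat frontend. The sketch below is purely hypothetical; it only shows how extra trailing command-line arguments could be collected as additional image paths before being handed to whatever image-encoding routine the engine uses. The meaning of the numeric argument (e.g. a thread count) is an assumption here.

```cpp
// Hypothetical sketch only -- not TinyChatEngine's actual chat.cc.
// Shows one way a positional CLI such as
//   ./chat VILA_2.7B INT4 5 img1.png img2.png ...
// could accept several trailing image paths instead of a single one.
#include <iostream>
#include <string>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 5) {
        std::cerr << "usage: " << argv[0]
                  << " <model> <precision> <num> <image_path> [more_image_paths...]\n";
        return 1;
    }

    const std::string model     = argv[1];            // e.g. VILA_2.7B
    const std::string precision = argv[2];            // e.g. INT4
    const int numeric_arg       = std::stoi(argv[3]); // third positional arg (assumed thread count)

    // Collect every remaining positional argument as an image path.
    std::vector<std::string> image_paths(argv + 4, argv + argc);

    std::cout << "model=" << model << " precision=" << precision
              << " num=" << numeric_arg << "\n";
    for (const auto& path : image_paths) {
        // In a real integration, each path would be encoded by the vision
        // tower and its embeddings interleaved into the prompt here.
        std::cout << "image: " << path << "\n";
    }
    return 0;
}
```

With a change along these lines, an invocation like ./chat VILA_2.7B INT4 5 img1.png img2.png would pass both images to the frontend; whether the current VILA demo in TinyChatEngine already supports this is exactly what this issue is asking.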