Closed: geweihgg closed this issue 2 months ago
I find that testing models via the API is not consistent with testing models directly, especially regarding the prompt. The API does not support custom prompts at the moment. Will VLMEvalKit support custom prompts for API models?
Testing via API: https://github.com/open-compass/VLMEvalKit/blob/main/vlmeval/inference.py#L33
Testing a model directly: https://github.com/open-compass/VLMEvalKit/blob/main/vlmeval/inference.py#L111
Hi @geweihgg, we have added custom prompt support for APIs now. Please have another try.
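For reference, here is a minimal sketch of how an API wrapper could plug into the same custom-prompt hooks that local models use in vlmeval/inference.py (`use_custom_prompt` / `build_prompt`). The base class `GPT4V`, the import path, the dataset-name check, and the `image_path` field are assumptions made for illustration; please check the current VLMEvalKit code for the exact class names and hooks.

```python
# Sketch only: assumes API wrappers expose the same use_custom_prompt /
# build_prompt interface that inference.py checks for local models.
from vlmeval.api import GPT4V  # assumed import path


class CustomPromptGPT4V(GPT4V):
    def use_custom_prompt(self, dataset):
        # Opt in to custom prompt building only for multiple-choice benchmarks
        # (the 'MMBench' check here is just an example condition).
        return dataset is not None and 'MMBench' in dataset

    def build_prompt(self, line, dataset=None):
        # `line` is one row of the benchmark TSV; compose the same style of
        # prompt that the local-model code path would produce.
        question = line['question']
        options = {c: line[c] for c in 'ABCD' if c in line and isinstance(line[c], str)}
        prompt = question + '\n' + '\n'.join(f'{c}. {v}' for c, v in options.items())
        prompt += '\nAnswer with the option letter.'
        # VLMEvalKit messages are lists of {type, value} dicts (interleaved image/text).
        return [
            dict(type='image', value=line['image_path']),  # 'image_path' is an assumed field name
            dict(type='text', value=prompt),
        ]
```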