-
### Question
Table 1 in this paper reports a SeedBench accuracy of 60.5 for vanilla LLaVA-1.5-7B. However, when I tried to reproduce it, I got 66.2, which is higher than that.
_No respo…
-
Thanks for your contribution.
I have already downloaded the VideoChatGPT dataset to a directory with `huggingface-cli download lmms-lab/VideoChatGPT --repo-type dataset --local-dir .`, and I have `expo…
-
I am trying to reproduce some tasks' results on llava-1.6-mistral-7b, but found a large gap on AI2D, ChartQA, and InfoVQA. The lmms-eval version I use is `0.2.0`.
My script:
```shell
python3 -m accele…
-
Got a crash in LMMS with the Reverse Delay (revdelay_1605.so).
`idelay_samples` can be 0, which is invalid as the divisor of a modulus (%) operation and crashes the plugin.
https://github.com/swh/ladspa/blob/02bda232041380c28464149457…
-
Across different LMMs, the max new tokens value differs.
I believe we should have a consistent MAX_NEW_TOKENS across the project, set to 512 or 1024.
If it makes sense, I can create a PR to modify al…
-
https://lmms.io/forum/viewtopic.php?t=5871
In the first reply "Some, like Synth1 and DVSGuitar can be found from the Tested VST database at https://lmms.io/documentation/Tested_VSTs Google should f…
-
[Error importing reka, flash-attention when evaluating llava]
When I try to evaluate llava-v1.5-7b, I use this command:
python -m accelerate.commands.launch \
--num_processes=1 \
-m lmm…
-
### System Information
Windows 10, Intel(R) Core(TM) i7-6770HQ CPU @ 2.60GHz 2.59 GHz
### LMMS Version(s)
gd703f3915
### Most Recent Working Version
_No response_
### Bug Summary
…
-
### System Information
Apple M1 Pro 14.6.1 (23G93)
### LMMS Version(s)
lmms-1.3.0-alpha.1.690+gd703f3915-mac14.6-arm64.dmg
### Most Recent Working Version
_No response_
### Bug Summary
Hello,
…
-
Thank you for this nice tool.
I was wondering if susie_rss() could be safely applied to GWAS summary statistics (SS) derived from a linear mixed-effects model, the SS being the fixed effects of SNPs …