lmstudio-ai / lmstudio-bug-tracker

Bug tracking for the LM Studio desktop application

Failed legacy chat migration due to flash attention in config #180

Open · SilentMrDave opened this issue 3 weeks ago

SilentMrDave commented 3 weeks ago

I encountered an issue during the legacy chat migration process. The root cause appears to be that the migration logic cannot handle chat configurations in which Flash Attention was previously enabled.

Specifically, the presence of the following lines in your chat's configuration file triggers the failure:

```json
"flash_attn": true,
"cache_type_v": "f16",
"cache_type_k": "f16"
```

As a temporary workaround, you can remove these lines from your chat's configuration file. This will allow the migration to proceed successfully.
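For anyone with many affected chats, here is a minimal sketch in Python that applies the same workaround automatically by stripping the three keys from each chat JSON file. Note that `CHATS_DIR` is an assumption (the report does not say where LM Studio stores its legacy chats), so point it at your own chat directory, and back the files up before running it.

```python
import json
from pathlib import Path

# Keys whose presence reportedly breaks the legacy chat migration.
BROKEN_KEYS = ("flash_attn", "cache_type_v", "cache_type_k")

# Hypothetical location of the legacy chat files; this path is an
# assumption, not confirmed by the bug report. Adjust it to wherever
# your LM Studio installation keeps its chat JSON.
CHATS_DIR = Path.home() / ".cache" / "lm-studio" / "conversations"


def strip_flash_attn_keys(path: Path) -> bool:
    """Remove the offending keys from one chat file.

    Returns True if the file was modified.
    """
    data = json.loads(path.read_text(encoding="utf-8"))
    removed = False
    # Walk every dict in the document so nested config blocks are covered.
    stack = [data]
    while stack:
        node = stack.pop()
        if isinstance(node, dict):
            for key in BROKEN_KEYS:
                if key in node:
                    del node[key]
                    removed = True
            stack.extend(node.values())
        elif isinstance(node, list):
            stack.extend(node)
    if removed:
        path.write_text(json.dumps(data, indent=2), encoding="utf-8")
    return removed


if __name__ == "__main__":
    for chat_file in CHATS_DIR.glob("*.json"):
        if strip_flash_attn_keys(chat_file):
            print(f"cleaned {chat_file}")
```

The script walks the whole JSON document rather than only the top level, in case the flash attention settings sit inside a nested config object; if your files keep them at the top level, a simple `dict.pop` on each key would suffice.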