I have this exact error also. I ended up downgrading back to v0.20.
Yeah, it's a regression. I overlooked the case where the streaming generator is used without stop conditions. I'll have a fix out soon. In the meantime, if you call `generator.set_stop_conditions([])`, it should initialize properly and work as before.
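To illustrate why the workaround helps, here is a minimal Python stand-in (not the real exllamav2 API, just a hypothetical sketch of the regression described above): the generator only initializes its stop-condition state inside `set_stop_conditions()`, so streaming without ever calling it fails, while passing an empty list initializes the state with no stop strings.

```python
class StreamingGenerator:
    """Hypothetical mock of a streaming generator (not the exllamav2 API)."""

    def __init__(self):
        # Left uninitialized in the constructor, as in the regression.
        self.stop_strings = None

    def set_stop_conditions(self, conditions):
        # Initializes internal state; an empty list means "no stop strings".
        self.stop_strings = list(conditions)

    def stream(self, tokens):
        if self.stop_strings is None:
            raise AttributeError("stop conditions were never initialized")
        for t in tokens:
            yield t  # no stop string matched; emit every token

gen = StreamingGenerator()
gen.set_stop_conditions([])  # the workaround: pass an empty list
out = list(gen.stream(["Hello", ",", " world"]))
print(out)
```

Without the `set_stop_conditions([])` call, `stream()` raises on its first use; with it, streaming proceeds normally.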
Thank you very much for sharing the temporary fix.
I tried the new version, and it appears to have resolved the issue above. Thank you for your work; everything now works as one would expect.
While updating my guide (https://github.com/nktice/AMD-AI/blob/main/ROCm6.0.md), I'm now getting this error, so I thought I'd write. With ExLlamaV2 0.0.21, the model loads fine, but when I try a query, I get this error. [I checked another loader: Oobabooga TGW's built-in Exllamav2_HF works fine and answers queries.]