Closed: dotXem closed this issue 10 months ago
Hi @dotXem! Thanks for reporting the issue - I can confirm that I'm able to repro it with the steps you've provided. Let me get back to you with a root cause and fix soon! Apologies that this didn't work as expected out of the box.
@dotXem I've found the issue and I've updated the notebook(s) on the Ludwig README including the one you're trying - are you able to give it a quick run through to see if the issue is fixed?
For context, the way we were forcing UTF-8 as the default encoding didn't play nicely with torch 2.1, and it wasn't the recommended approach. I've updated the notebooks to use the preferred method and it seems to work well now.
This is what I changed:
Current:
import locale; locale.getpreferredencoding = lambda: "UTF-8"
New:
import locale; locale.setlocale(locale.LC_ALL, 'en_US.UTF-8')
Let me know how it goes!
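Putting it together, the setup cell in the notebook now looks roughly like this (assuming a Colab-style Linux image where the en_US.UTF-8 locale is available; the prints at the end are mine, just as a quick sanity check):

import locale

# Set the locale explicitly instead of monkey-patching
# locale.getpreferredencoding, which is what was tripping up the torch 2.1 import.
locale.setlocale(locale.LC_ALL, 'en_US.UTF-8')

# The import that previously crashed in the notebook's third cell.
import torch

print(locale.getpreferredencoding())  # expect 'UTF-8' with the locale set above
print(torch.__version__)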
It's working! Thanks for the quick fix!
Describe the bug
The demo Colab notebook for fine-tuning Llama-2-7b crashes at the third runnable cell when trying to import torch.
To Reproduce
Expected behavior
It should work!
Environment (please complete the following information):
(not sure if relevant)
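For reference, a minimal sketch of the pattern that seems to trigger the crash, based on the diagnosis in this thread (the torch 2.1 pin and the exact failure mode are assumptions on my part; my guess is that something on torch's import path calls locale.getpreferredencoding(False), which the zero-argument lambda can't handle):

# Old notebook setup cell: monkey-patch the preferred encoding.
import locale
locale.getpreferredencoding = lambda: "UTF-8"

# With torch 2.1 installed, this import failed in the notebook's third cell,
# presumably because the patched function gets called with an argument during import.
import torch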