Open lastmjs opened 2 weeks ago
The script is designed to print the inference results to stdout and everything else to stderr, in case you want to pipe them separately.
There might be something wrong with your environment; do regular prints in a script not work for you?
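The stdout/stderr split described above can be sketched like this (a minimal illustration, not litgpt's actual code; the strings are made up):

```python
import sys

# Inference results go to stdout, everything else to stderr,
# so the two streams can be piped or redirected separately,
# e.g.  python script.py > results.txt 2> logs.txt
print("inference result")                      # stdout
print("loading model...", file=sys.stderr)     # stderr
```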
I've just started playing with litgpt. I first tried some basic inference on the CPU only, I believe, since I hadn't fully restarted my computer to get the GPU drivers working. Inference output would print in the console.
Once I restarted my computer, I don't believe I was able to get inference printing again. Other logging would print, but not inference. I finally cloned the repo and figured out that `file=sys.stderr` needs to be set in order for the inference to actually print to my terminal in `litgpt/generate/base.py`.
I am using Ubuntu 22.04, and I've tried multiple versions of Python, with and without a virtual env.
This is where I fixed the issue by adding `file=sys.stderr`: https://github.com/Lightning-AI/litgpt/blob/main/litgpt/generate/base.py#L238
I am not sure why enabling my drivers and having GPU inference seemed to cause this problem.
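A quick way to check which stream output is actually going to is to capture stdout and stderr separately. This sketch runs a small child script (the printed strings are made up for illustration) and inspects each stream:

```python
import subprocess
import sys

# Run a child process and capture stdout and stderr separately,
# to see which stream each line is actually written to.
proc = subprocess.run(
    [sys.executable, "-c",
     "import sys; print('result'); print('log', file=sys.stderr)"],
    capture_output=True, text=True,
)
print("stdout:", repr(proc.stdout))  # the 'result' line
print("stderr:", repr(proc.stderr))  # the 'log' line
```

If the inference text shows up in neither stream, something else (buffering, the environment, or the terminal) is swallowing it.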