Lightning-AI / litgpt

Pretrain, finetune, deploy 20+ LLMs on your own data. Uses state-of-the-art techniques: flash attention, FSDP, 4-bit, LoRA, and more.
https://lightning.ai
Apache License 2.0
6.85k stars 726 forks

fabric.print only works on sys.stderr, does not print inference result #1384

Open lastmjs opened 2 weeks ago

lastmjs commented 2 weeks ago

I've just started playing with litgpt. I began with some basic inference on just the CPU, I believe, since I hadn't fully restarted my computer to get the drivers working. Inference results would print in the console.

Once I restarted my computer, I was no longer able to get inference output to print. Other logging would print, but not the inference results. I finally cloned the repo and figured out that file=sys.stderr needs to be set in litgpt/generate/base.py for the inference output to actually reach my terminal.

I am using Ubuntu 22.04 and I've tried with multiple versions of Python, with and without a virtual env.

This is where I fixed the issue by adding file=sys.stderr:

https://github.com/Lightning-AI/litgpt/blob/main/litgpt/generate/base.py#L238

I am not sure why enabling my drivers and having GPU inference seemed to cause this problem.
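For reference, the one-line change described above can be sketched as follows. This uses a hypothetical stand-in for the real lightning Fabric object, under the assumption that Fabric.print forwards its arguments to the built-in print():

```python
import sys


class FabricStub:
    """Minimal stand-in for lightning.fabric.Fabric (assumption: the real
    Fabric.print forwards *args/**kwargs to the built-in print())."""

    def print(self, *args, **kwargs):
        print(*args, **kwargs)


fabric = FabricStub()
decoded = "generated text"  # placeholder for the decoded model output

# The change described in the issue: route the result to stderr so it
# shows up in the terminal alongside the other log messages.
fabric.print(decoded, file=sys.stderr)
```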

carmocca commented 1 week ago

The script is designed to print the inference results to stdout and everything else to stderr, in case you want to pipe them separately.

There might be something wrong with your environment. Do regular prints in a script not work for you?
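The stdout/stderr split described above can be demonstrated with plain Python. The helper name below is hypothetical; it only mirrors the convention that results go to stdout and diagnostics to stderr, so the two streams can be piped apart (e.g. `... > result.txt 2> logs.txt`):

```python
import contextlib
import io
import sys


def generate_output(result: str, log_msg: str) -> None:
    # Hypothetical helper mirroring the script's convention:
    # diagnostics go to stderr, the inference result to stdout.
    print(log_msg, file=sys.stderr)  # logging -> stderr
    print(result)                    # result -> stdout (default)


# Capture each stream separately to show they really are distinct.
out, err = io.StringIO(), io.StringIO()
with contextlib.redirect_stdout(out), contextlib.redirect_stderr(err):
    generate_output("Hello, world!", "Loading model weights...")

print("stdout:", out.getvalue().strip())  # only the result
print("stderr:", err.getvalue().strip())  # only the log line
```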