farzadab closed this 3 months ago
OK, so the issue here was ddp_utils_test.py, not infer_test.py. That said, I think this change is worth keeping since it lets external PRs access the Llama 3 tokenizer, which would otherwise fail because HF_TOKEN isn't set for those runs.
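For context, a minimal sketch of the fallback being described: when the CI secret is missing (as on external PRs), resolve to a local tokenizer copy instead of the gated Hub repo. The env-var check is real Hugging Face convention, but the paths, function name, and local directory here are illustrative assumptions, not this repo's actual config.

```python
import os

def resolve_tokenizer_path(
    hub_id: str = "meta-llama/Meta-Llama-3-8B",     # gated Hub repo (assumed)
    local_dir: str = "assets/llama3_tokenizer",     # hypothetical vendored copy
) -> str:
    """Return the Hub id when authenticated, else a local fallback path."""
    if os.environ.get("HF_TOKEN"):
        # Secret available (trusted CI / local dev): use the gated Hub repo.
        return hub_id
    # External PR: secrets aren't exposed, so fall back to the checked-in copy.
    return local_dir

# e.g. AutoTokenizer.from_pretrained(resolve_tokenizer_path())
```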
Still seeing failures here even with the local tokenizer/processor. I think something is getting wedged during test shutdown; it's frustrating to debug since everything works fine locally 🙁