Closed claudiosv closed 1 year ago
|lmwrapper|claudio@ubuntu lmwrapper ±|llama|→ pytest -s -vv --runslow test/test_huggingface.py::test_code_llama_autoregressive test/test_huggingface.py::test_code_llama_infill test/test_huggingface.py::test_code_llama_conversation
============================================================== test session starts ===============================================================
platform linux -- Python 3.11.6, pytest-7.4.2, pluggy-1.3.0 -- /home/claudio/mambaforge/envs/lmwrapper/bin/python
cachedir: .pytest_cache
rootdir: /home/claudio/lmwrapper
plugins: cov-4.1.0
collected 4 items
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:03<00:00, 1.86s/it]
PASSED
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:04<00:00, 2.16s/it]
PASSED
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:03<00:00, 2.00s/it]
PASSED
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:04<00:00, 2.13s/it]
PASSED
=============================================================== 4 passed in 26.09s ===============================================================
Tested on the Houston machine. Works more or less out of the box! This blog post https://huggingface.co/blog/codellama uses `add_special_tokens=False`, so I added that as an option on LmPrompts.
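A minimal sketch of how such a per-prompt option might flow through to the tokenizer call. The class shape, field name, and default here are assumptions for illustration, not the actual lmwrapper API; the only grounded detail is that Hugging Face tokenizers accept an `add_special_tokens` keyword, as the blog post shows.

```python
from dataclasses import dataclass

@dataclass
class LmPrompt:
    # Hypothetical, simplified stand-in for lmwrapper's prompt object.
    text: str
    # Defaults to True to match Hugging Face tokenizers' own default.
    add_special_tokens: bool = True

def tokenize(prompt: LmPrompt, tokenizer) -> list[int]:
    # Forward the per-prompt flag to the Hugging Face tokenizer,
    # mirroring the `add_special_tokens=False` usage from the blog post.
    encoded = tokenizer(prompt.text, add_special_tokens=prompt.add_special_tokens)
    return encoded["input_ids"]
```

With a real tokenizer, setting `add_special_tokens=False` on the prompt skips the BOS/EOS tokens the model would otherwise prepend, which matters for Code Llama infilling prompts.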