state-spaces / mamba

Mamba SSM architecture

Error: running `python evals/lm_harness_eval.py ...` with no modifications #260

Open arelkeselbri opened 5 months ago

arelkeselbri commented 5 months ago

When running

python evals/lm_harness_eval.py --model mamba --model_args pretrained=state-spaces/mamba-130m --tasks lambada_openai,hellaswag,piqa,arc_easy,arc_challenge,winogrande --device cuda --batch_size 64

After a little while, I get:

2024-03-22:11:33:09,519 INFO     [evaluator.py:362] Running loglikelihood requests
Traceback (most recent call last):
  File "/home/marcelo/mamba/evals/lm_harness_eval.py", line 39, in <module>
    cli_evaluate()
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/__main__.py", line 342, in cli_evaluate
    results = evaluator.simple_evaluate(
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/utils.py", line 288, in _wrapper
    return fn(*args, **kwargs)
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/evaluator.py", line 234, in simple_evaluate
    results = evaluate(
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/utils.py", line 288, in _wrapper
    return fn(*args, **kwargs)
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/evaluator.py", line 373, in evaluate
    resps = getattr(lm, reqtype)(cloned_reqs)
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/api/model.py", line 325, in loglikelihood
    context_enc, continuation_enc = self._encode_pair(context, continuation)
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/api/model.py", line 301, in _encode_pair
    whole_enc = self.tok_encode(context + continuation)
  File "/home/marcelo/mamba/venv/lib/python3.10/site-packages/lm_eval/models/huggingface.py", line 669, in tok_encode
    add_special_tokens = False or self.add_bos_token
AttributeError: 'MambaEvalWrapper' object has no attribute 'add_bos_token'

How to fix it?
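
One possible stopgap, based only on the traceback above and not a maintainer-confirmed fix: newer lm-eval releases read self.add_bos_token inside HFLM.tok_encode, and MambaEvalWrapper in evals/lm_harness_eval.py does not appear to set that attribute, so supplying a default yourself should get past the AttributeError. Treat the snippet below as a sketch under that assumption.

# Workaround sketch (an assumption, not an official fix): newer lm_eval versions
# check `self.add_bos_token` while encoding, so give the wrapper a default.
# False matches a tokenizer that does not prepend a BOS token.
# Place this after the MambaEvalWrapper class definition in evals/lm_harness_eval.py.
if not hasattr(MambaEvalWrapper, "add_bos_token"):
    MambaEvalWrapper.add_bos_token = False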

ZonglinY commented 5 months ago

I have also encountered the same problem.

Huangmr0719 commented 4 months ago

Have you solved this problem? I have run into it as well.

arelkeselbri commented 4 months ago

I solved it by updating a few packages, among them setuptools, wheel, and packaging.
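
For reference, the upgrade inside the project's virtualenv would look something like the line below; the exact versions that resolved it were not given, so treat this as a sketch.

pip install --upgrade setuptools wheel packaging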

darwin406 commented 3 months ago

I'm having the same issue despite updating those packages. Has anyone solved it?

Sawyer117 commented 1 month ago

I suppose anyone encountering this error is running the old eval code from before. For me, the fix was to change the --model argument: instead of

python evals/lm_harness_eval.py --model mamba

use

python evals/lm_harness_eval.py --model mamba_ssm
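
Keeping the rest of the original flags unchanged, the corrected invocation from the first post would then be:

python evals/lm_harness_eval.py --model mamba_ssm --model_args pretrained=state-spaces/mamba-130m --tasks lambada_openai,hellaswag,piqa,arc_easy,arc_challenge,winogrande --device cuda --batch_size 64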

dragonBrother1 commented 1 week ago

I want to ask a question. When I execute this command: [image]

The error is as follows:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb5 in position 1: invalid start byte

dragonBrother1 commented 1 week ago

I have solved this problem.