hustvl / Vim

[ICML 2024] Vision Mamba: Efficient Visual Representation Learning with Bidirectional State Space Model
Apache License 2.0

Mamba Tokenizer loading from a hard-coded path #2

Closed · vaibhavBh-0 closed 4 months ago

vaibhavBh-0 commented 5 months ago

https://github.com/hustvl/Vim/blob/5c3d4ad4b56934740fe0cb8736f3381cbe8091df/mamba/benchmarks/benchmark_generation_mamba_simple.py#L36

Edit: Is the tokenizer used here different from the one below?

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
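
For reference, a minimal sketch of the suggested swap, assuming the hard-coded checkpoint at the linked line is just a local copy of the GPT-NeoX-20B tokenizer (which the thread implies but does not confirm):

```python
from transformers import AutoTokenizer

# Pull the tokenizer from the Hugging Face Hub instead of a
# machine-specific local path:
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# Quick sanity check that the tokenizer loads and encodes text.
ids = tokenizer("Vision Mamba", return_tensors="pt").input_ids
print(ids.shape)
```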

Unrealluver commented 5 months ago

Hi vaibhavBh-0,

You just need to install the mamba package included in this repo, which provides the bidirectional Mamba implementation. The benchmarking scripts are not used in this project. :)
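
For anyone landing here, a minimal sketch of using the bidirectional block once the repo's modified mamba-ssm is installed. The `bimamba_type="v2"` argument is an assumption based on this repo's model definitions, not part of upstream mamba-ssm; the dimensions are arbitrary:

```python
import torch
from mamba_ssm.modules.mamba_simple import Mamba

# bimamba_type is Vim's extension for bidirectional scanning
# (assumption: "v2" as used in the repo's models); upstream
# mamba-ssm's Mamba does not accept this argument.
block = Mamba(d_model=192, bimamba_type="v2").cuda()

# (batch, tokens, dim), e.g. ViT-style patch tokens plus a class token.
x = torch.randn(1, 197, 192).cuda()
y = block(x)      # output has the same shape as the input
print(y.shape)    # torch.Size([1, 197, 192])
```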