Closed vaibhavBh-0 closed 4 months ago
https://github.com/hustvl/Vim/blob/5c3d4ad4b56934740fe0cb8736f3381cbe8091df/mamba/benchmarks/benchmark_generation_mamba_simple.py#L36
Edit: Is the tokenizer used here different from the one below?
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
Hi @vaibhavBh-0,
You just need to install mamba, which provides the bidirectional Mamba implementation. The benchmarking scripts are not used here. :)