RWKV / rwkv.cpp

INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
MIT License

AttributeError: 'Tensor' object has no attribute 'untyped_storage' (PyTorch version) #80

Closed: moxiegushi closed this issue 1 year ago

moxiegushi commented 1 year ago

I'm running chat_with_bot.py with torch==1.13, but some errors occur:

```
Loading 20B tokenizer
System info: AVX = 1 | AVX2 = 1 | AVX512 = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |
Loading RWKV model
Processing 185 prompt tokens, may take a while
Traceback (most recent call last):
  File "rwkv/chat_with_bot.py", line 115, in <module>
    process_tokens(split_last_end_of_line(tokenizer.encode(init_prompt).ids))
  File "rwkv/chat_with_bot.py", line 81, in process_tokens
    logits, state = model.eval(_token, state, state, logits)
  File "/root/RWKVcpu/rwkv/rwkv_cpp_model.py", line 102, in eval
    state_out.untyped_storage().data_ptr(),
AttributeError: 'Tensor' object has no attribute 'untyped_storage'
```

Looks like there is a version conflict; I need help.

LoganDark commented 1 year ago

Yeah, that version of torch has been superseded; you need to properly install the most recent version of torch. Look at requirements.txt: it calls for the latest version of torch, not the old 1.13 version but the new 2.0 version.

(Technically it does not pin the 2.0 version specifically, but it does not call for the old, outdated version either, so you should be using the latest.)

The reason is that torch 2.0 was raising annoying deprecation warnings in the console for things that worked fine on 1.13, so we upgraded.

LoganDark commented 1 year ago

Technically it doesn't have to be this way, though, so I suppose we should change the code to be compatible with both versions.

saharNooby commented 1 year ago

> I suppose we should change the code to be compatible with both versions

Agree, let's keep this open. I or someone else will add something like `if hasattr(tensor, 'untyped_storage') ...`
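
For illustration, a minimal sketch of what such a compatibility shim could look like, assuming a hypothetical helper named get_data_ptr (this is not the actual rwkv_cpp_model.py code):

```python
import torch

def get_data_ptr(tensor: torch.Tensor) -> int:
    # Hypothetical compatibility helper: torch >= 2.0 exposes untyped_storage(),
    # while older releases such as 1.13 only provide storage().
    if hasattr(tensor, 'untyped_storage'):
        return tensor.untyped_storage().data_ptr()
    return tensor.storage().data_ptr()
```

Call sites such as state_out.untyped_storage().data_ptr() in eval() could then use get_data_ptr(state_out) instead, keeping both torch versions working.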

LoganDark commented 1 year ago

It could be as simple as (tensor.untyped_storage or tensor.storage)()
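
One caveat: on torch 1.13 the attribute lookup tensor.untyped_storage itself raises AttributeError before the or can fall through, so a getattr-based variant would likely be needed. A sketch under that assumption, reusing the hypothetical helper name from above:

```python
import torch

def get_data_ptr(tensor: torch.Tensor) -> int:
    # Hypothetical one-liner: use untyped_storage() when available (torch >= 2.0),
    # otherwise fall back to storage() (torch 1.13).
    return getattr(tensor, 'untyped_storage', tensor.storage)().data_ptr()
```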