Commit 8c221ae277d33fe8a4634f8c414e961ae6e149b4 introduces a regression that breaks the TORCH_SPLIT backend when loading the RWKV-4-Pile-7B-Instruct-test2-20230209.pth model. Switching to the TORCH backend works fine.
Code to reproduce (the model must be downloaded first, of course):
import torch
from rwkvstic.load import RWKV
from rwkvstic.agnostic.backends import TORCH_SPLIT
runtimedtype = torch.float32
dtype = torch.bfloat16  # options: torch.float32, torch.float64, torch.bfloat16
useGPU = True  # or False
model = RWKV(
    "RWKV-4-Pile-7B-Instruct-test2-20230209.pth",
    mode=TORCH_SPLIT,
    useGPU=useGPU,
    runtimedtype=runtimedtype,
    dtype=dtype,
)
model.loadContext(newctx='anything at all')  # this line crashes
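For reference, the same call path succeeds when the plain TORCH backend is substituted for TORCH_SPLIT. A minimal sketch (assuming TORCH is exported from the same backends module; all other arguments unchanged):

import torch
from rwkvstic.load import RWKV
from rwkvstic.agnostic.backends import TORCH

# Same checkpoint and dtypes as above; only the backend differs
model = RWKV("RWKV-4-Pile-7B-Instruct-test2-20230209.pth", mode=TORCH,
             useGPU=True, runtimedtype=torch.float32, dtype=torch.bfloat16)
model.loadContext(newctx='anything at all')  # no crash with TORCH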
Error message below:
❯ python basic.py
init RWKVOPS, from super
(160, 4096)
[?] Which devices would you like to use?:
[ ] cpu
> [X] cuda:0
[ ] cuda:1
100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████| 582/582 [00:01<00:00, 364.91it/s]
0%| | 0/301 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/home/XXXXXXX/chatbot/basic.py", line 78, in <module>
main()
File "/home/XXXXXXX/chatbot/basic.py", line 55, in main
model.loadContext(newctx=init_prompt)
File "/home/XXXXXXX/chatbot/packages/rwkvstic/src/rwkvstic/rwkvMaster.py", line 66, in loadContext
ctx, state = loadContext(
File "/home/XXXXXXX/chatbot/packages/rwkvstic/src/rwkvstic/rwkvMaster.py", line 16, in loadContext
o = model.forward([x[-1]], statex)
File "/home/XXXXXXX/chatbot/packages/rwkvstic/src/rwkvstic/agnostic/agnosticRwkvLeg.py", line 111, in forward
ops.processEmbed(ops.getIndex(self.emb, x)),
File "/home/XXXXXXX/chatbot/packages/rwkvstic/src/rwkvstic/agnostic/backends/base.py", line 69, in <lambda>
self.getIndex = lambda x, y: x[y[-1]]
IndexError: list index out of range
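For context, here is a minimal self-contained sketch of how that lambda can fail. The values below are hypothetical (in the library, x is the model's embedding table and y is the token list passed to forward); they only illustrate that an empty first argument reproduces the exact exception, which would be consistent with TORCH_SPLIT leaving the embedding table empty after the split:

# getIndex as defined in backends/base.py
getIndex = lambda x, y: x[y[-1]]

emb = []       # hypothetical: an embedding list left empty by the split
tokens = [44]  # forward() passes [x[-1]], a one-element token list

getIndex(emb, tokens)  # raises IndexError: list index out of range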