arrmansa/Basic-UI-for-GPT-J-6B-with-low-vram
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Loading the model requires 12 GB of free RAM.
Apache License 2.0 · 114 stars · 12 forks
Issues
#7 Where to do the pip install? Can you please make a bit elaborate readme.md (by indianninja2018, opened 1 year ago, 1 comment)
#6 Update README.md (by RichardScottOZ, opened 2 years ago, 0 comments)
#5 Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu (by kpx314159, closed 3 years ago, 0 comments)
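Issues #5 and #3 both report PyTorch's standard device-mismatch error, which is expected in a setup like this one where model layers are split between cuda:0 and the CPU. A minimal sketch of the usual fix (the model and tensor names here are illustrative, not code from this repository): move each input tensor to the device of the module that consumes it before the forward pass.

```python
import torch

# Minimal reproduction of the "same device" error pattern.
# When parts of a model live on cuda:0 and inputs are created on the CPU,
# the forward pass raises "Expected all tensors to be on the same device".
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(4, 2).to(device)  # stand-in for a GPU-resident layer

x = torch.randn(1, 4)       # created on the CPU by default
y = model(x.to(device))     # moving the input to the model's device avoids the mismatch
print(tuple(y.shape))
```

The same pattern applies to attention masks and past key/value tensors: anything handed to a GPU-resident layer needs an explicit `.to(device)`.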
#4 RuntimeError: where expected condition to be a boolean tensor, but got a tensor with dtype Float (by ebolam, opened 3 years ago, 0 comments)
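The error in issue #4 comes from `torch.where`, which requires its condition argument to be a boolean tensor; older code that passed a float mask breaks on newer PyTorch versions. A hedged sketch of the typical fix (the mask values below are illustrative, not taken from this repository): cast the condition with `.bool()`.

```python
import torch

# torch.where(condition, a, b) requires `condition` to have dtype bool.
# A float 0/1 mask, common in older attention code, now raises
# "where expected condition to be a boolean tensor".
mask = torch.tensor([1.0, 0.0, 1.0])   # float mask that would trigger the error
a = torch.tensor([10.0, 20.0, 30.0])
b = torch.zeros(3)

out = torch.where(mask.bool(), a, b)   # casting the condition fixes the RuntimeError
print(out.tolist())
```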
#3 Expected all tensors to be on same device (by ebolam, closed 3 years ago, 2 comments)
#2 The results are much worse than with original GPT-J-6B (by Lissanro, opened 3 years ago, 11 comments)
#1 Make it work with the latest version of transformers (by arrmansa, closed 3 years ago, 1 comment)