Closed: HysterLc closed this issue 7 months ago.
Sure, you can try. Currently supported: SD.Next, lshqqytiger/stable-diffusion-webui-directml.git, and the official ComfyUI with the code tweaked. Others have not been tested yet.
I have tested it with text generation. The model loads successfully, so it is very close, but generation fails with:

CUDA error: the requested functionality is not supported
current device: 0, in function ggml_cuda_op_mul_mat_cublas at D:\a\llama-cpp-python-cuBLAS-wheels\llama-cpp-python-cuBLAS-wheels\vendor\llama.cpp\ggml-cuda.cu:9729
cublasSgemm_v2(g_cublas_handles[id], CUBLAS_OP_T, CUBLAS_OP_N, row_diff, src1_ncols, ne10, &alpha, src0_ddf_i, ne00, src1_ddf1_i, ne10, &beta, dst_dd_i, ldc)
GGML_ASSERT: D:\a\llama-cpp-python-cuBLAS-wheels\llama-cpp-python-cuBLAS-wheels\vendor\llama.cpp\ggml-cuda.cu:255: !"CUDA error"

It seems another library, cuBLAS, is needed.
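For reference, a minimal sketch of the kind of text-generation test that hits this path (the model path and parameters below are placeholders, not taken from the thread): loading the model succeeds on its own, and the failing cublasSgemm_v2 call in the backtrace is only reached once the prompt is actually evaluated on the GPU.

```python
# Minimal llama-cpp-python smoke test; model path and parameters are illustrative only.
from llama_cpp import Llama

llm = Llama(
    model_path="models/example-7b.Q4_K_M.gguf",  # placeholder model file
    n_gpu_layers=-1,  # offload all layers; loading alone does not reach the cuBLAS GEMM
    n_ctx=2048,
)

# Generation is where ggml_cuda_op_mul_mat_cublas (and thus cublasSgemm_v2) runs,
# so this is the call that raised the "functionality is not supported" error above.
out = llm("Hello, my name is", max_tokens=32)
print(out["choices"][0]["text"])
```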
I replaced two libs with the ones from your repository and then it worked! Both text generation and Stable Diffusion.
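In case it helps anyone hitting the same cuBLAS error: the swap amounts to overwriting the CUDA/cuBLAS DLLs that llama-cpp-python loads with the ones shipped in that repository. A rough sketch follows; the file names and directories are assumptions for illustration (the thread does not say exactly which two files were replaced), so check the repository's own instructions for the real names.

```python
# Hypothetical sketch of the library swap; file names and paths are guesses,
# not the thread author's exact steps.
import shutil
from pathlib import Path

# Where the replacement libraries were downloaded to (assumption).
replacement_dir = Path(r"D:\downloads\replacement-libs")

# Where llama-cpp-python keeps its native libraries (typical location inside the
# installed package on Windows; adjust to your own environment).
target_dir = Path(r".venv\Lib\site-packages\llama_cpp\lib")

# The two DLLs to overwrite -- placeholder names, replace with the ones the
# repository actually provides.
for name in ("cublas64_12.dll", "nvcuda.dll"):
    src = replacement_dir / name
    dst = target_dir / name
    shutil.copy2(src, dst)  # overwrite the stock library with the replacement
    print(f"replaced {dst} with {src}")
```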
That's great. WebUI Forge for AMD is now also available; you may try it if you want to experience another version of SD.
Of course. I will try it next weekend.
Excuse me, how did you get it to work? I can also load the model correctly, but when generating it shows me a cuBLAS error. Which libraries did you replace? Thanks in advance.