JuliaAttic / CUBLAS.jl

Julia interface to CUBLAS

CUBLAS.gemm with CudaArrays is gone #34

Closed una-dinosauria closed 7 years ago

una-dinosauria commented 7 years ago

I have some code that uses CUBLAS.gemm with CudaArrays. After updating today I am getting this error:

ERROR: LoadError: error in running finalizer: CUDAdrv.CuError(code=201, meta=nothing)
MethodError: no method matching gemm(::Char, ::Char, ::Float32, ::CUDArt.CudaArray{Float32,2}, ::CUDArt.CudaArray{Float32,2})
Closest candidates are:
  gemm(::Char, ::Char, ::Float32, !Matched::CUDAdrv.CuArray{Float32,2}, !Matched::CUDAdrv.CuArray{Float32,2}) at /home/julieta/.julia/v0.6/CUBLAS/src/blas.jl:929
  gemm(::Char, ::Char, ::Float32, !Matched::Union{Base.ReshapedArray{Float32,2,A,MI} where MI<:Tuple{Vararg{Base.MultiplicativeInverses.SignedMultiplicativeInverse{Int64},N} where N} where A<:DenseArray, DenseArray{Float32,2}, SubArray{Float32,2,A,I,L} where L} where I<:Tuple{Vararg{Union{Base.AbstractCartesianIndex, Int64, Range{Int64}},N} where N} where A<:Union{Base.ReshapedArray{T,N,A,MI} where MI<:Tuple{Vararg{Base.MultiplicativeInverses.SignedMultiplicativeInverse{Int64},N} where N} where A<:DenseArray where N where T, DenseArray}, !Matched::Union{Base.ReshapedArray{Float32,2,A,MI} where MI<:Tuple{Vararg{Base.MultiplicativeInverses.SignedMultiplicativeInverse{Int64},N} where N} where A<:DenseArray, DenseArray{Float32,2}, SubArray{Float32,2,A,I,L} where L} where I<:Tuple{Vararg{Union{Base.AbstractCartesianIndex, Int64, Range{Int64}},N} where N} where A<:Union{Base.ReshapedArray{T,N,A,MI} where MI<:Tuple{Vararg{Base.MultiplicativeInverses.SignedMultiplicativeInverse{Int64},N} where N} where A<:DenseArray where N where T, DenseArray}) at linalg/blas.jl:1039
  gemm(::Char, ::Char, !Matched::Float64, !Matched::CUDAdrv.CuArray{Float64,2}, !Matched::CUDAdrv.CuArray{Float64,2}) at /home/julieta/.julia/v0.6/CUBLAS/src/blas.jl:929

Sorry, what happened to the old gemm? And what are these new CuArrays (not CudaArrays)?

MikeInnes commented 7 years ago

This is happening because we moved the backend CUDA library from CUDArt to CUDAdrv. That should be a lot more robust, but it may require a couple of small changes on your end.

In your case, using CUDAdrv.CuArray in place of CUDArt.CudaArray should be the only change needed, and things should then just work.
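Something along these lines should do it (an untested sketch; the upload/download constructors here are from memory, so treat them as an assumption):

```julia
using CUDAdrv, CUBLAS

# host matrices
A = rand(Float32, 4, 4)
B = rand(Float32, 4, 4)

# upload with CUDAdrv.CuArray where you previously used CUDArt.CudaArray
d_A = CuArray(A)
d_B = CuArray(B)

# the gemm call itself is unchanged; it now dispatches on CuArray
d_C = CUBLAS.gemm('N', 'N', 1f0, d_A, d_B)

# copy the result back to the host
C = Array(d_C)
```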

una-dinosauria commented 7 years ago

I see. Thanks for taking care of JuliaGPU; these packages are in dire need of some love.

I've seen that you're deprecating CUDArt too. I'm wondering, would there be interest in a guide for CUDAdrv like the one in the CUDArt README? (i.e. a quick "this is how you can call your CUDA functions with this package", such as this one: https://github.com/JuliaGPU/CUDArt.jl/blob/master/README.md#usage)

I spent today figuring that out and I think it could save others some time, but I understand this wouldn't be super useful if API changes are coming to CUDAdrv soon.
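For reference, what I ended up with today looks roughly like this (written from memory, so the exact argument forms may be slightly off, and the PTX file and kernel name are just placeholders):

```julia
using CUDAdrv

dev = CuDevice(0)
ctx = CuContext(dev)

# load a precompiled kernel ("vadd.ptx" / "kernel_vadd" are hypothetical names)
md   = CuModuleFile("vadd.ptx")
vadd = CuFunction(md, "kernel_vadd")

a = rand(Float32, 12)
b = rand(Float32, 12)
d_a = CuArray(a)
d_b = CuArray(b)
d_c = similar(d_a)

# launch the kernel; cudacall marshals the device pointers for us
cudacall(vadd, Tuple{Ptr{Cfloat}, Ptr{Cfloat}, Ptr{Cfloat}},
         d_a, d_b, d_c; threads=length(a))

c = Array(d_c)
```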

MikeInnes commented 7 years ago

Yeah, there's a lot of old and forgotten code around here, but also some useful pieces that we can build on going forward.

I don't think CUDAdrv is about to undergo huge API changes (cc @maleadt just in case), so any doc improvements would be really welcome.

I don't know your use case, but you might also be interested in CuArrays, which provides more general array functionality. Not all of BLAS is wrapped yet, but I'll happily add whatever is missing if you need it.
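To give a flavour, usage is roughly this (a quick sketch; the details may drift as the package evolves):

```julia
using CuArrays

# cu() uploads a host array and converts the element type to Float32
d_A = cu(rand(4, 4))
d_B = cu(rand(4, 4))

# ordinary array operations dispatch to CUBLAS / GPU kernels
d_C = d_A * d_B       # matrix multiply via gemm
d_D = d_C .+ 1f0      # broadcasting runs on the GPU

# bring results back to the host when you need them
C = Array(d_C)
```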