Closed: dlcole3 closed this pull request 2 years ago
I did not realize that having a test on CUDA.jl would cause the tests to error out (though this makes a lot of sense). While these are removed from the test, running `LQDynamicModel` on my own machine with `CuArray`s did work and resulted in `H` and `J` being returned as `CuArray`s.
Merging #27 (4eec098) into main (9c6aa9b) will increase coverage by 0.11%. The diff coverage is 98.87%.
```
@@            Coverage Diff             @@
##             main      #27      +/-  ##
==========================================
+ Coverage   97.60%   97.71%   +0.11%
==========================================
  Files           1        1
  Lines         668      701      +33
==========================================
+ Hits          652      685      +33
  Misses         16       16
```

| Impacted Files | Coverage Δ |
|---|---|
| src/DynamicNLPModels.jl | 97.71% <98.87%> (+0.11%) :arrow_up: |
Yes, we will need to set up a self-hosted runner. I'll set it up later today.

You can set up `runtests.jl` in a way that it runs the CUDA tests only if it detects an NVIDIA GPU. You can use `CUDA.has_cuda_gpu()` to check for one.
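A minimal sketch of that guard in `runtests.jl` (the testset name and its body are illustrative, not the package's actual suite):

```julia
using Test
using CUDA

# Run the GPU tests only when an NVIDIA GPU is actually present, so CI
# machines without CUDA hardware skip them instead of erroring out.
if CUDA.has_cuda_gpu()
    @testset "CUDA tests" begin
        M = CUDA.zeros(Float64, 4, 4)  # allocate a matrix on the GPU
        @test M isa CuArray
    end
else
    @info "No NVIDIA GPU detected; skipping CUDA tests."
end
```

On a machine without a GPU the branch falls through to the `@info` message, so the rest of the suite runs unaffected.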
Updated src code to handle non-default matrix types, mainly by removing `zeros` and using `similar` instead. Updated `runtests.jl` to test on `Float32` types and `CuArray` types. These changes also enable forming the dense formulation as `CuArray`s, so that `H`, `J`, `lcon`, `ucon`, `lvar`, and `uvar` are returned as `CuArray`s when the original arrays passed to `LQDynamicModel` are `CuArray`s.