Closed RalphAS closed 4 years ago
@andreasnoack, could you look at this?
CI tests failed because of missing network connectivity on Travis.
I think the NIST ftp server is either unreliable or limits the number of requests. I think all of the Matrix Market files are also in Tim Davis' depot, e.g. the failing file is in https://sparse.tamu.edu/HB/bp_200. @KlausC would you be able to take a look and see, if we can fetch the Matrix Market files from the SuiteSparse collection instead of the NIST ftp server. It would be good to get rid of all the CI failures.
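Fetching from the SuiteSparse collection could look like the sketch below. The `/MM/<group>/<name>.tar.gz` path pattern is an assumption inferred from the linked page, not confirmed by the repo:

```shell
# Build the SuiteSparse download URL for a Matrix Market file.
# Path pattern /MM/<group>/<name>.tar.gz is an assumption based on
# https://sparse.tamu.edu/HB/bp_200 (the failing file in CI).
GROUP=HB
NAME=bp_200
SS_URL="https://sparse.tamu.edu/MM/$GROUP/$NAME.tar.gz"
echo "$SS_URL"
# A download would then be e.g.: curl -LO "$SS_URL"
```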
An investigation of `mm` (Matrix Market - NIST) and `sp` (SuiteSparse collection at TAMU, formerly UF) gave the following crude picture:

- The files of `mm` are also found in `sp`, with one exception:
- "SPARSKIT/*/*", with 107 files of `mm`, is completely missing in `sp`.
- Some values are stored with different precision in the `.mm` files, in contrast to `sp` (e.g. "0.098" in `sp`). This leads to different floating point values (differences of rounding-error size), which may be acceptable or not.

A decision has to be made whether we drop access to `mm` completely.
I think this database is outdated and not maintained. The last change date I found on the web site was 2004/09.
The `sp` site has been updated 2019/10 and now provides metadata in a more readable database; there is no need to parse the HTML of their site. So it would make sense to make use of these (they also contain metadata which are currently not available, for example positive definiteness).
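The "more readable database" could be consumed without HTML parsing along these lines. The CSV sample, its column layout, and the posdef column position are illustrative assumptions, not the actual index format:

```shell
# Sketch: the SuiteSparse collection publishes a machine-readable index
# (a stats CSV, as used by their ssget tooling), so metadata like positive
# definiteness can be read directly. Sample rows and column layout below
# are ASSUMED for illustration: group,name,rows,cols,nnz,real,binary,posdef
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
HB,bp_200,822,822,3802,1,0,0
HB,bcsstk01,48,48,400,1,0,1
EOF
# Select the positive definite matrices (assumed column 8):
awk -F, '$8 == 1 { print $1 "/" $2 " is positive definite" }' "$tmp"
rm -f "$tmp"
```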
The options are:

1. Drop `mm` completely? Easy to do.
2. Keep `mm`, but remove the test cases?
3. Switch to the `sp` database? Major re-write of parts (20%?) of the code.

@andreasnoack, would it be possible to merge PRs #38, #39, #42 into master? The first one removes all network tests from CI.
Thanks for looking into the options. I don't think we should go with 2. since it will just result in the functionality breaking at some point. Eventually, I think 3. should be the solution but that, of course, requires real work. I think I'd prefer 1. over #38 unless you feel strongly that the latter is preferred.
> I think I'd prefer 1. over #38 unless you feel strongly that the latter is preferred.
I think we should do both. #38 has its own appeal because it avoids testing the network connectivity over and over. That happens especially for local tests you run while developing, which is bad for the developer, costing time and resources, and bad for the remote server, wasting bandwidth.
My item 1. is orthogonal to #38: #38 still performs tests with the same files as before, but uses local copies (`curl file://...` instead of `curl ftp://...`), meaning test coverage of the Julia code stays as before.
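The local-copy approach can be sketched as follows; the fixture path and file contents are made up for illustration, but the point is that curl treats `file://` URLs like any other scheme:

```shell
# Sketch: run the existing download path against a local fixture instead of
# the NIST ftp server. Paths and contents are illustrative, not from #38.
FIXTURE_DIR=$(mktemp -d)
printf '%%MatrixMarket matrix coordinate real general\n1 1 1\n1 1 1.0\n' \
  > "$FIXTURE_DIR/bp_200.mtx"
# In tests, substitute the remote URL with a file:// URL:
MM_URL="file://$FIXTURE_DIR/bp_200.mtx"   # instead of ftp://math.nist.gov/...
curl -s "$MM_URL"                          # same code path, no network needed
rm -rf "$FIXTURE_DIR"
```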
@andreasnoack, I would really appreciate it if you could also merge #38 and bugfix #42 into master and eventually make a new release.
I want to start with 3., but want a clean baseline.
Update Pkg infrastructure and replace GZip with CodecZlib. This should avoid problems with system libz.