mdouze opened this issue 6 years ago
Hi @mdouze,
Thanks for your interest in our research.
> I would like to reproduce the recall@1 vs. time plots in the LSQ++ paper. Thanks for open-sourcing the corresponding code! Installing Julia 0.7 and Rayuela.jl was relatively painless. However, it is not obvious what the entry point in the code should be.
Sorry, this is totally my fault, since I have not had the time to build a complete demo for end users. I'll work on this and post an update by the end of this week.
Cheers,
Hi again @mdouze,
I managed to fix the particular demo in your gist, and you should be able to run it with Julia 1.0 -- apparently the language changed waaay more than I anticipated. I ran my ECCV experiments on Julia 0.6...
I'll keep working on fixing other demos and keep you posted. Hopefully won't take too long :)
Thanks for your patience,
Update: Pushed a few changes that have fixed the PQ, OPQ, RVQ and ERVQ demos. It'll take me a few more days to fix the LSQ and LSQ++ ones.
Thanks a lot @una-dinosauria! Of course I am mostly interested in the LSQ and LSQ++ implementations. So far I have used the LSQ implementation in https://github.com/una-dinosauria/local-search-quantization, which is fine, but I did not manage to install Julia 0.6.4 with GPU support.
> Of course I am mostly interested in the LSQ and LSQ++ implementations.
I understand. Sorry about that! I pushed a few changes to fix Chain quantization (GPU support included) today. Hopefully by tomorrow night I'll have LSQ/LSQ++ working again.
> So far I have used the LSQ implementation in https://github.com/una-dinosauria/local-search-quantization, which is fine, but I did not manage to install Julia 0.6.4 with GPU support.
Julia 0.6.4 is quite a pain to set up with CUDA, because it requires compiling Julia from source. This is no longer the case in 1.0, so I strongly suggest moving to that -- or to 1.0.1, released today.
Cheers, and thanks again for your patience,
Julia does not seem to be backward compatible, so Julia 1.x is not an option for the local-search-quantization repo. This is why I appreciate the effort you are making on this repo!
Cheers
Managed to fix LSQ today, except that I'm running into OOM errors on the GPU, probably due to https://github.com/JuliaGPU/CuArrays.jl/issues/152. I'm pretty sure it's related to the memory pool in CuArrays.jl, because I distinctly remember having to downgrade my package version to avoid the issue. I kinda hoped someone else would run into it and that it would get fixed In The Future When All Is Well™ -- yet here we are!
Unfortunately, upstreaming my use case to CuArrays.jl is not trivial. So either I'll wait for an answer from the repo maintainer or move to a simpler CUDAdrv.jl-only approach, which has a simple memory management pipeline (see https://discourse.julialang.org/t/freeing-memory-in-the-gpu-with-cudadrv-cudanative-cuarrays/10946/3).
Thanks! Will try it out... Presumably the "demo and data" section of the README is incomplete, since it shows how to get the data, but not how to run the demo!
Yep! Clarified the README a little bit. Hopefully that will be enough to run LSQ. Going to bed. I'll keep cleaning up/debugging tomorrow.
Hi @mdouze,
Sorry for the delay and thanks again for your patience
I've updated the code, README.md, and respective demos to reproduce the main results of our ECCV'18 paper on SIFT1M (train/query/base protocol) and LabelMe (query/base protocol).
The code now includes LSQ and LSQ++ (SR-C and SR-D) examples, run on the GPU with fast codebook update -- the configuration that gives the best results in our paper.
The demo will also store the results in HDF5 files that you can inspect and plot later.
I've tested this on a machine with a Titan Xp GPU. I keep getting OOM errors on a GTX 1080, and I've raised the issue with the CUDA*.jl maintainers. Hopefully we'll have progress on this soon, but in the meantime please use a GPU with 12 GB of RAM to run the demos.
Please let me know if you run into any issues.
PS: I have some things left to do to achieve push-button reproduction of the paper; I've opened an issue https://github.com/una-dinosauria/Rayuela.jl/issues/29 to track this. I hope to close it in the coming weeks.
Hi!
I would like to reproduce the recall@1 vs. time plots in the LSQ++ paper. Thanks for open-sourcing the corresponding code! Installing Julia 0.7 and Rayuela.jl was relatively painless. However, it is not obvious what the entry point in the code should be.
For example, trying to use a function from demo.jl gives
https://gist.github.com/mdouze/05161b06a3c524cd0955e99a378507a0
Loading the data is fine and running the training is OK, but there seems to be a missing function `qerror`, which makes me think that this is probably not the right entry point.
Sorry, I am not familiar with Julia.
Any help is appreciated!
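For context on what the missing `qerror` presumably computes: in (additive/local-search) quantization, the standard metric is the mean squared error between each vector and its reconstruction as a sum of one codeword per codebook. Here is a minimal sketch of that computation, written in plain Python for illustration; the function names `reconstruct` and `qerror` and the toy data are my own assumptions, not Rayuela.jl's actual API.

```python
# Hypothetical illustration of the quantization error an additive
# quantizer reports; this is NOT Rayuela.jl's implementation.

def reconstruct(codebooks, codes):
    """Sum the codeword selected from each codebook (additive quantization)."""
    d = len(codebooks[0][0])
    recon = [0.0] * d
    for cb, idx in zip(codebooks, codes):
        for j in range(d):
            recon[j] += cb[idx][j]
    return recon

def qerror(X, codebooks, B):
    """Mean squared quantization error over all vectors in X."""
    total = 0.0
    for x, codes in zip(X, B):
        r = reconstruct(codebooks, codes)
        total += sum((xi - ri) ** 2 for xi, ri in zip(x, r))
    return total / len(X)

# Toy example: 2 codebooks with 2 codewords each, 2-D data.
codebooks = [
    [[1.0, 0.0], [0.0, 1.0]],  # codebook 1
    [[0.5, 0.5], [0.0, 0.0]],  # codebook 2
]
X = [[1.5, 0.5], [0.0, 1.0]]   # two database vectors
B = [(0, 0), (1, 1)]           # selected code per codebook, per vector
print(qerror(X, codebooks, B))  # → 0.0 (both vectors reconstruct exactly)
```

A lower value means the codebooks and codes represent the data more faithfully, which is why papers in this area (including LSQ++) track this quantity across training iterations.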