Open mhubii opened 2 months ago
Hi @mhubii, this issue is addressed in the new release https://github.com/ANP-Granular/ParticleTracking/releases/tag/v0.6.4. I could not implement the 'tick' solution yet, mainly because of the issues with automatic PyTorch/CUDA installation on different systems. Instead, depending on the installed PyTorch version (GPU or CPU), the program now downloads, or loads from cache, the corresponding (faster) GPU or (slower) CPU model. By default, the CUDA version is installed on Linux. On Windows, enabling the GPU version is explained here: https://particletracking.readthedocs.io/en/latest/installation/particledetection.html#behavior-on-windows Alternatively, an installer for the Windows GPU version of the program is attached to the release. It is considerably larger (because it includes GPU PyTorch) and requires CUDA 12.4 to be installed beforehand.
The models are now published and downloaded via PyTorch hub, as suggested.
I have not yet experimented with the possibility of choosing other existing models published on PyTorch hub; at the moment I don't know how that would work (considering that the models can have different architectures).
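The GPU-or-CPU model selection described above can be sketched as follows. This is an illustrative outline, not RodTracker's actual code: the model names `rod_detector_gpu` / `rod_detector_cpu` are hypothetical placeholders, and only the CUDA check itself is real PyTorch API.

```python
"""Sketch: pick the model variant matching the installed PyTorch build."""


def detect_cuda() -> bool:
    # Degrade gracefully when PyTorch is not installed at all.
    try:
        import torch
        return torch.cuda.is_available()
    except ImportError:
        return False


def select_model_variant(cuda_available: bool) -> str:
    # Prefer the (faster) GPU model when a CUDA build of PyTorch is
    # present, otherwise fall back to the CPU model. Names are
    # placeholders for whatever the published hub entrypoints are.
    return "rod_detector_gpu" if cuda_available else "rod_detector_cpu"


print(select_model_variant(detect_cuda()))
```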
Is there a possibility to run this on GPU?
I just followed https://particletracking.readthedocs.io/en/latest/RodTracker/RodTracker.html#automated-detection-of-particles
Would it be possible to add a GPU tick box? Inference takes a long time on CPU.
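A minimal sketch of what such a tick box could drive, assuming a checkbox whose state selects the inference device (the GUI wiring mentioned in the comments is illustrative, not RodTracker's actual Qt code):

```python
"""Sketch: derive the inference device from a hypothetical GPU checkbox."""


def cuda_available() -> bool:
    # Report False rather than crashing when PyTorch is absent.
    try:
        import torch
        return torch.cuda.is_available()
    except ImportError:
        return False


def inference_device(use_gpu_ticked: bool) -> str:
    # Fall back to CPU if the box is ticked but no CUDA build is present.
    return "cuda" if (use_gpu_ticked and cuda_available()) else "cpu"


# A real GUI would call checkbox.setEnabled(cuda_available()) and pass
# checkbox.isChecked() here; model.to(inference_device(...)) would then
# move the detector to the chosen device before running inference.
print(inference_device(use_gpu_ticked=True))
```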
Furthermore, I don't see how to select models through the GUI. Could this be achieved through the newly introduced https://github.com/ANP-Granular/ParticleTracking/issues/85?
Also, could you please modify
https://github.com/ANP-Granular/ParticleTracking/blob/f06fa4b2e4da78d1b8e20e8417db8fb7910fb34d/RodTracker/src/RodTracker/ui/detection.py#L255
to pull via torch hub? That might simplify and generalize things to any available model...
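Pulling a model via torch hub could look like the sketch below. The repo name "user/repo" and entrypoint "rod_detector" are placeholders; the real values would come from the hubconf.py published alongside the models, and whether arguments like `pretrained` are accepted depends on that entrypoint's signature.

```python
"""Sketch: load a detection model through torch.hub instead of a
hard-coded download path."""


def load_via_hub(repo: str = "user/repo", entrypoint: str = "rod_detector"):
    # Deferred import so this sketch can be read/imported without PyTorch.
    import torch

    # torch.hub.load clones (or reuses the cached copy of) the GitHub
    # repo, then calls the named entrypoint defined in its hubconf.py.
    return torch.hub.load(repo, entrypoint)
```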
Could you please just download to torch's default cache? I don't like that this currently downloads to .config/RodTracker, as this is really torch functionality.

Refers to https://github.com/openjournals/joss-reviews/issues/5986
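For reference, torch's default hub cache is `$TORCH_HOME/hub`, where `TORCH_HOME` falls back to `$XDG_CACHE_HOME/torch` and then `~/.cache/torch`. The sketch below reproduces that resolution without importing torch; when PyTorch is available, `torch.hub.get_dir()` / `torch.hub.set_dir()` should be used directly instead.

```python
"""Sketch: resolve torch's documented default hub cache directory."""
import os


def default_torch_hub_dir() -> str:
    # Mirrors torch.hub.get_dir(): $TORCH_HOME/hub, with TORCH_HOME
    # defaulting to $XDG_CACHE_HOME/torch or ~/.cache/torch.
    torch_home = os.environ.get(
        "TORCH_HOME",
        os.path.join(
            os.environ.get("XDG_CACHE_HOME", os.path.expanduser("~/.cache")),
            "torch",
        ),
    )
    return os.path.join(torch_home, "hub")


print(default_torch_hub_dir())
```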