Open mhtrinh opened 3 months ago
Hey @mhtrinh, have you observed the same slowdown with the newer versions of ClearML? The most recent one is 1.14.4
Yes, this also happens with the current version 1.14.4: training is about 2x slower.
Note: this may be specific to fastai. We have another network based on yolov5, and the slowdown does not happen there.
Hi @mhtrinh ! It looks like calculating the metrics that ClearML reports may take a long time. We will try to improve performance.
In the meantime, you could disable the fastai bindings by passing auto_connect_frameworks={"fastai": False} to Task.init.
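A minimal sketch of the suggested workaround, assuming a configured ClearML setup; the project and task names are hypothetical placeholders:

```python
# auto_connect_frameworks accepts a dict mapping framework names to booleans;
# {"fastai": False} turns off only the fastai hooks, leaving the other
# framework integrations at their defaults.
AUTO_CONNECT = {"fastai": False}  # only fastai auto-logging is disabled

def init_task():
    # Requires clearml to be installed and a reachable ClearML server;
    # shown for illustration only.
    from clearml import Task
    return Task.init(
        project_name="my-project",    # hypothetical project name
        task_name="fastai-training",  # hypothetical task name
        auto_connect_frameworks=AUTO_CONNECT,
    )
```

With this, fastai metrics are no longer captured automatically, so the per-batch overhead from the binding should disappear while other logging stays intact.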
Hi @mhtrinh ! We will release a fix for this issue in the next ClearML release, clearml==1.16.0.
Training our model, which is based on FastAI, takes 2x longer with ClearML 1.13.0 compared to 1.12.2.
There are no errors or warnings.
I cannot share our code. Here is the requirements.txt of the virtualenv:
To reproduce, simply run
pip install clearml==1.12.2
and then
pip install clearml==1.13.0
and re-run the same code.
OS: openSUSE Leap 15.4