Closed: JeongChanwoo closed this issue 3 years ago
@JeongChanwoo I checked test_model.py (here); it looks like the model name is being imported directly from the model file. Could you check that the correct model file is being imported?
Also, if you are using a container image, make sure you rebuild the image so that the changes you make are included in it.
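For reference, here is a minimal sketch of the kind of import swap being suggested. The module and class names (model_prophet, model_lstm, MetricPredictor) are hypothetical; use the actual names from your test_model.py:

```python
# Hypothetical example only: swap which model module test_model.py imports.
# Replace model_prophet / model_lstm / MetricPredictor with the real names
# used in your repository.

# from model_prophet import MetricPredictor   # old import: trains Prophet
from model_lstm import MetricPredictor         # new import: trains the LSTM model

predictor = MetricPredictor()
predictor.train()
```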
Issues go stale after 90d of inactivity.
Mark the issue as fresh with /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.
If this issue is safe to close now please do so with /close.
/lifecycle stale
Stale issues rot after 30d of inactivity.
Mark the issue as fresh with /remove-lifecycle rotten.
Rotten issues close after an additional 30d of inactivity.
If this issue is safe to close now please do so with /close.
/lifecycle rotten
Rotten issues close after 30d of inactivity.
Reopen the issue with /reopen.
Mark the issue as fresh with /remove-lifecycle rotten.
/close
@sesheta: Closing this issue.
After changing the model to lstm in the test_model.py file, I tried to proceed; however, when I check in MLflow, the model tag of the experiment is still fixed as prophet.
I have switched to the LSTM model, but I am worried that training and prediction are still being run with Prophet.
Even when I switch to the Fourier transform model, the prophet model tag is still displayed.
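One way to check which model actually reaches MLflow is to set the tag explicitly when the run is started. Below is a minimal sketch, not the project's actual code; the MODEL_NAME environment variable and the "model" tag key are assumptions about how the script is configured:

```python
import os

import mlflow

# Assumption: the model to train is selected via the MODEL_NAME env var.
model_name = os.getenv("MODEL_NAME", "prophet")

with mlflow.start_run():
    # Tag the run with the model that is actually being trained, so the
    # tag shown in the MLflow UI follows the selected model.
    mlflow.set_tag("model", model_name)
    # ... train the selected model and log metrics here ...
```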