I am running the training and everything seems to be working fine, except that I am suspicious of the training time. I have an NVIDIA GeForce GTX 1080, and it takes around 30 seconds to run 10 iterations. An epoch will have 60,000 iterations, right? That means 2 epochs will take about 100 hours in total. Is this a normal training time for this model and my hardware?
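To double-check my arithmetic, here is the calculation I am doing (assuming the 60,000 iterations per epoch figure is right):

```python
# Estimate total training time from the observed iteration speed.
secs_per_iter = 30 / 10        # observed: 30 seconds per 10 iterations
iters_per_epoch = 60_000       # assumed iterations in one epoch
epochs = 2

total_seconds = secs_per_iter * iters_per_epoch * epochs
total_hours = total_seconds / 3600
print(total_hours)  # → 100.0
```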
I was afraid that maybe I am training on the CPU instead of the GPU for some reason. style.py defines the device as gpu:0, but I am not sure whether it would raise an error if it couldn't actually access the GPU.
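One way I thought of to check is to list the devices TensorFlow can actually see and look for a GPU entry. This is only a sketch of that idea: `gpu_visible` and the example device names are my own, and the commented-out TF1-era `device_lib` call is how I understand the device list can be obtained with the same TensorFlow generation style.py targets.

```python
def gpu_visible(device_names):
    """Return True if any listed device name looks like a GPU device."""
    return any("GPU" in name.upper() for name in device_names)

# With TensorFlow installed, the real device list could be fetched like
# this (TF1-era API, matching style.py's use of a gpu:0 device string):
#
#   from tensorflow.python.client import device_lib
#   names = [d.name for d in device_lib.list_local_devices()]
#   print(gpu_visible(names))  # False would mean TF fell back to CPU
#
# Demo with hypothetical device-name lists:
print(gpu_visible(["/device:CPU:0"]))                   # → False
print(gpu_visible(["/device:CPU:0", "/device:GPU:0"]))  # → True
```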
(Not a programmer, so sorry if I am not specific enough! Thanks a lot for help!)