Closed Poddiachyi closed 5 years ago
No, unfortunately, due to the complex model dynamics these operations are not easy to parallelize on a GPU at the moment. The typical approach participants take is to run the simulations on multiple CPUs and the network training on a GPU.
Hello.
Is there a way to do the environment computation on a GPU? As far as I understand, there is no problem with training an agent, but the environment is a bottleneck.
Sorry if this question has been asked before; I couldn't find it.