WonderSeven opened this issue 5 years ago
Has anyone run into the same issue, or could someone explain it to me? Thanks a lot!
In the script common/wrappers.py, state.shape is changed to (3, 160, 210) when the function wrap_pytorch() is called. Hope this answers your question.
Thanks for your answer. This really helped me solve the problem.
This mainly causes a size error like "Calculated padded input size per channel: (160 x 3). Kernel size: (8 x 8). Kernel size can't be greater than actual input size...". So why not adjust the state's dimensions, e.g. state = np.transpose(state, (2, 0, 1)), or adjust the network instead? Either way, this is something we need to pay attention to.
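To make the suggested fix concrete, here is a minimal sketch of the HWC-to-CHW transpose in NumPy. The (210, 160, 3) shape is the standard raw Atari frame; the function name here is illustrative and is not the exact code in common/wrappers.py.

```python
import numpy as np

def to_pytorch_layout(state):
    """Move the channel axis first so a PyTorch Conv2d sees (C, H, W)
    instead of the raw (H, W, C) frame."""
    return np.transpose(state, (2, 0, 1))

# Raw Atari observation: height 210, width 160, 3 color channels.
state = np.zeros((210, 160, 3), dtype=np.uint8)
chw = to_pytorch_layout(state)
print(chw.shape)  # (3, 210, 160)
```

Without this transpose, Conv2d treats the last axis (the 3 channels) as the image width, which is smaller than the 8x8 kernel and triggers the error quoted above.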