Open RudolfReiter opened 1 year ago
By "computation time of the algorithm", are we referring to the "inference time", i.e. the time it takes the algorithm to give us the next action?
Do we want to add random walls? If so, I revise my comment on #31 and say that we should also add the wall parameters to the observation space.
Yes, inference time. I had problems getting rid of all the PyTorch-specific things and really just counting the time for the matrix multiplications. Maybe you have a good idea of how to get the best estimate for embedded computation, @Erfi?
First, I would suggest keeping the walls constant, but already setting them up accordingly in the environment.
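To make that concrete, here's a minimal sketch of an environment where the walls stay constant but their parameters are already part of the observation, so randomizing them later won't change the observation shape. `WallEnv`, `wall_x`, and `wall_gap` are hypothetical names, not anything from our codebase:

```python
import numpy as np


class WallEnv:
    """Toy env whose observation includes the (constant) wall parameters."""

    def __init__(self, wall_x=0.5, wall_gap=0.2):
        # Walls are constant for now, but already live in the observation,
        # so switching to random walls later is a no-op for the policy input.
        self.wall_params = np.array([wall_x, wall_gap], dtype=np.float32)
        self.agent_pos = np.zeros(2, dtype=np.float32)

    def _obs(self):
        # observation = [agent_x, agent_y, wall_x, wall_gap]
        return np.concatenate([self.agent_pos, self.wall_params])

    def reset(self):
        self.agent_pos = np.zeros(2, dtype=np.float32)
        return self._obs()

    def step(self, action):
        self.agent_pos = np.clip(
            self.agent_pos + 0.1 * np.asarray(action, dtype=np.float32),
            -1.0, 1.0,
        )
        return self._obs(), 0.0, False, {}
```

The point is just that the wall parameters are concatenated into every observation from day one, even while they never change.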
Oki, I can just add a timing for the `predict` function of the policies and perhaps get an average of it. Will make an issue for this.
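Something like this is what I have in mind — `policy.predict` stands in for whatever interface our policies expose, and `MatMulPolicy` is just a dummy for the example. I'd use `time.perf_counter` plus a few warm-up calls so one-off costs (lazy initialization, JIT, caches) don't skew the average:

```python
import time

import numpy as np


def time_predict(policy, obs, n_warmup=10, n_runs=100):
    """Average wall-clock time of a single predict() call, in seconds."""
    # Warm-up: the first calls can include one-off overhead we don't
    # want in the embedded-computation estimate.
    for _ in range(n_warmup):
        policy.predict(obs)
    start = time.perf_counter()
    for _ in range(n_runs):
        policy.predict(obs)
    return (time.perf_counter() - start) / n_runs


# Dummy policy standing in for the real ones: one matrix multiplication.
class MatMulPolicy:
    def __init__(self, dim=64):
        rng = np.random.default_rng(0)
        self.w = rng.standard_normal((dim, dim))

    def predict(self, obs):
        return self.w @ obs


policy = MatMulPolicy()
obs = np.zeros(64)
avg = time_predict(policy, obs)
print(f"avg predict time: {avg * 1e6:.1f} us")
```

For the real estimate we'd still want to call `predict` under `torch.no_grad()` (or on an exported/scripted model) so autograd bookkeeping doesn't count toward the measured time.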
We need a script or a workflow to take one of the configurations with options and collect the following data