Closed mikezhang95 closed 2 years ago
Hey @mikezhang95.
(1) When starting Carla server, there is a setting -fps. Is it also used to set the time-step of simulations?
Yes, the fps argument and fixed_delta_seconds are basically the same: the first is more common in everyday speech, and it is the inverse of the second (fps = 1 / fixed_delta_seconds).
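The inverse relation can be written down as a pair of tiny helpers (a sketch; the function names are mine, not part of the CARLA API):

```python
def fps_to_delta(fps: float) -> float:
    """Convert a server -fps value to the equivalent fixed_delta_seconds."""
    return 1.0 / fps

def delta_to_fps(fixed_delta_seconds: float) -> float:
    """Convert a fixed_delta_seconds value back to frames per second."""
    return 1.0 / fixed_delta_seconds

# e.g. launching the server with -fps=20 corresponds to
# settings.fixed_delta_seconds = 0.05 on the client side
```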
Will it be overridden by fixed_delta_seconds?
Both fps and fixed_delta_seconds modify the same variable in the world settings, so the last one sent to the server takes priority.
Is there a way to control the real time cost of this running or is it only decided by the setting of the resources (GPU, CPU) and runs as fast as possible?
Synchronous mode makes the server and client work in series, so the real-time cost is entirely dependent on how fast you issue the world ticks, which, as you said, comes down to your code's and hardware's efficiency. Conversely, you can use external tools to slow down the simulation, such as a pygame clock (used in the manual control example here).
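The throttling idea above can be sketched with the standard library instead of pygame (a minimal sketch; `run_throttled` is my name, and in a real client `tick_fn` would be `world.tick`):

```python
import time

def run_throttled(tick_fn, target_fps: float, num_ticks: int) -> None:
    """Call tick_fn() at most target_fps times per real second.

    A minimal stand-in for pygame's Clock.tick(): after each tick,
    sleep for whatever is left of the frame budget so the loop does
    not run faster than real time.
    """
    frame_budget = 1.0 / target_fps
    for _ in range(num_ticks):
        start = time.monotonic()
        tick_fn()  # e.g. world.tick() in a CARLA client in synchronous mode
        elapsed = time.monotonic() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
```

If the tick itself already takes longer than the frame budget, the loop simply runs as fast as the hardware allows, which matches the behaviour described above.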
Closing the issue, but feel free to reopen it if you have any other questions
Thanks for answering! And a followup question would be what is the time cost of world.tick()? Is that recorded in some documents?
The world.tick() call tells the server to start calculating everything, so its cost is heavily dependent on your simulation. Some of the elements that affect it the most are the number of sensors (along with their specs) and the number of vehicles present in the scene.
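Since the cost is simulation-dependent, the practical way to find out is to time the call yourself (a sketch; `measure_tick` is my name, and in a real client `tick_fn` would be `world.tick`):

```python
import time

def measure_tick(tick_fn, warmup: int = 5, samples: int = 100) -> float:
    """Return the mean wall-clock duration of tick_fn() in seconds."""
    for _ in range(warmup):       # let asset streaming / caches settle
        tick_fn()
    start = time.perf_counter()
    for _ in range(samples):
        tick_fn()
    return (time.perf_counter() - start) / samples
```

Running this before and after spawning sensors or vehicles shows how much each addition costs per tick on your particular machine.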
Thanks for answering. And it also depends on the fixed_delta_seconds, right? If I use fixed_delta_seconds=0.01, and minimum sensors and vehicles, it costs 1ms for one tick(), is that as expected?
And it also depends on the fixed_delta_seconds, right?
Not sure how much the value of fixed_delta_seconds affects the tick duration, but the larger the number, the fewer ticks are needed per second of simulated time, so the simulation goes faster.
If I use fixed_delta_seconds=0.01, and minimum sensors and vehicles, it costs 1ms for one tick(), is that as expected?
Again, it depends on a lot of things
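The "larger step, fewer ticks" point is easy to check with arithmetic: covering a fixed span of simulated time takes `span / fixed_delta_seconds` ticks, so doubling the step halves the tick count (a back-of-the-envelope sketch):

```python
def ticks_needed(simulated_seconds: float, fixed_delta_seconds: float) -> int:
    """Number of world.tick() calls needed to advance the simulation
    by simulated_seconds of in-world time."""
    return round(simulated_seconds / fixed_delta_seconds)

# Simulating 10 s of in-world time:
#   fixed_delta_seconds = 0.01 -> 1000 ticks
#   fixed_delta_seconds = 0.05 ->  200 ticks
```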
Hello, everyone! How can I calculate the time a vehicle takes to travel from one location to another? For example, at time T1 the vehicle starts its journey, and at time T2 it reaches the destination. How do I get the values of T1 and T2, and the time elapsed between them?
I want to measure the journey time of multiple vehicles.
Any help?
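One way to do this in CARLA is to read the simulation timestamp from a world snapshot (`world.get_snapshot().timestamp.elapsed_seconds` in the Python API) when each vehicle departs and when it arrives. A minimal bookkeeping sketch, keeping the CARLA calls out so it stands alone (the `JourneyTimer` class and its method names are mine):

```python
class JourneyTimer:
    """Track per-vehicle journey times from simulation timestamps.

    Feed it elapsed-seconds values taken from world snapshots, e.g.
    world.get_snapshot().timestamp.elapsed_seconds in a CARLA client.
    """

    def __init__(self):
        self._start = {}     # vehicle id -> T1 (departure timestamp)
        self.durations = {}  # vehicle id -> T2 - T1 (journey time)

    def depart(self, vehicle_id: int, t1: float) -> None:
        """Record T1 when the vehicle starts its journey."""
        self._start[vehicle_id] = t1

    def arrive(self, vehicle_id: int, t2: float) -> None:
        """Record T2 on arrival and store the elapsed journey time."""
        self.durations[vehicle_id] = t2 - self._start.pop(vehicle_id)
```

In the client loop you would check each vehicle's location against its destination every tick (e.g. with `Location.distance`) and call `arrive` once it is within some threshold; the same timer instance handles any number of vehicles.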
Hi. I am using Carla 0.9.13 in synchronous mode. I have read the document about setting the time-step at "https://carla.readthedocs.io/en/0.9.13/adv_synchrony_timestep/". I have a few questions regarding this time-step parameter:
(1) When starting the Carla server, there is a setting -fps. Is it also used to set the time-step of simulations? Will it be overridden by fixed_delta_seconds?
(2) When running simulations in synchronous mode, I will call world.tick() to simulate. Is there a way to control the real-time cost of this run, or is it only decided by the available resources (GPU, CPU) and it runs as fast as possible?
Thanks!