Open · xiafawu opened 3 years ago
Just a quick update: I think some of the issues are related to planning failing to generate safe trajectories when the routing response updates. I am not sure whether this is caused by a bug.
It would be really helpful if someone could take a look so we can narrow down the cause of the violations. The violations are interesting and could be dangerous in reality, so I hope they help identify potential issues and improve Apollo's safety.
Any update on this?
I have studied your test. In my opinion, you have added some obstacles that do not conform to the laws of physics.
This will cause the car to brake. Since sim_control is a perfect model that runs exactly along the planned trajectory, I recommend analyzing the relation between planning and the obstacles case by case.
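For that kind of case-by-case analysis, something like the following may help. This is only a minimal sketch, assuming Apollo's Python record reader and the usual planning/perception channel names; the import paths, message fields, and record path are assumptions that may differ between Apollo versions.

```python
# Sketch: dump planning and perception messages from a record file side by side
# so the planning/obstacle relation can be inspected case by case.
# Import paths and channel names are assumptions; adjust to your Apollo checkout.
from cyber.python.cyber_py3.record import RecordReader
from modules.planning.proto import planning_pb2
from modules.perception.proto import perception_obstacle_pb2

PLANNING_CHANNEL = "/apollo/planning"
PERCEPTION_CHANNEL = "/apollo/perception/obstacles"

def dump_record(record_file):
    reader = RecordReader(record_file)
    for msg in reader.read_messages():
        if msg.topic == PLANNING_CHANNEL:
            traj = planning_pb2.ADCTrajectory()
            traj.ParseFromString(msg.message)
            # The number of trajectory points is a quick proxy for whether
            # planning produced a usable trajectory at this timestamp.
            print(msg.timestamp, "planning points:", len(traj.trajectory_point))
        elif msg.topic == PERCEPTION_CHANNEL:
            obstacles = perception_obstacle_pb2.PerceptionObstacles()
            obstacles.ParseFromString(msg.message)
            print(msg.timestamp, "obstacle ids:",
                  [o.id for o in obstacles.perception_obstacle])

if __name__ == "__main__":
    dump_record("/apollo/data/bag/example.record")  # hypothetical path
```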
@daohu527 Can you please elaborate on what you mean by "not conforming to the laws of physics"? These obstacles are generated by the "replay_perception" script under tools/perception in Apollo, and they are traveling at reasonable speeds and with reasonable dimensions.
At least the predicted trajectories and current velocities of the obstacles in the video do not follow the road, and their frame-by-frame movement looks strange.
@daohu527 Doesn't that imply there is an issue with prediction, though? Or is the prediction wrong because the velocity direction is wrong?
It depends on the data you are playing; you have to match it to the corresponding map.
@daohu527 I made sure the correct maps correspond to the start and end locations for both the obstacles and the ADC.
As I mentioned above: 1) I used replay_perception.py, which is provided by the Apollo developers, to produce perception messages for the obstacles (see the sketch after this list for the kind of message it publishes). So if the velocity direction is not correct, that indicates a bug in the script, right?
2) The generated obstacles drive along the roads most of the time. I would also argue that in the few cases where a pedestrian was crossing the street, this is still a valid scenario that can happen in real life. For example, an Uber SUV killed a woman in 2018 because it did not recognize jaywalking. Shouldn't Apollo account for this kind of unexpected behavior?
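For reference, here is a rough sketch (not the actual replay_perception.py code) of the kind of PerceptionObstacles message such a replay script publishes. The field names come from perception_obstacle.proto, while the helper function and the numeric values are purely illustrative; the point is that theta and the velocity vector have to stay consistent with the obstacle's actual motion.

```python
# Sketch: build a PerceptionObstacles message whose velocity direction agrees
# with the obstacle heading. Values and the helper are illustrative only.
import math
from modules.perception.proto import perception_obstacle_pb2

def make_obstacle(obstacle_id, x, y, heading, speed):
    """Build a vehicle-sized obstacle whose velocity vector matches its heading."""
    obs = perception_obstacle_pb2.PerceptionObstacle()
    obs.id = obstacle_id
    obs.position.x = x
    obs.position.y = y
    obs.theta = heading                        # heading in radians, map frame
    obs.velocity.x = speed * math.cos(heading) # keep velocity aligned with theta
    obs.velocity.y = speed * math.sin(heading)
    obs.length, obs.width, obs.height = 4.5, 2.0, 1.6  # reasonable car dimensions
    obs.type = perception_obstacle_pb2.PerceptionObstacle.VEHICLE
    return obs

msg = perception_obstacle_pb2.PerceptionObstacles()
msg.perception_obstacle.extend([make_obstacle(1, 587000.0, 4141000.0, 0.3, 8.0)])
```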
Thank you for your response!
I am not sure what you said is consistent with what I meant. When describing a problem that may occur, it is best to attach a specific log to explain it.
Below are some examples from the video:
@daohu527 You mentioned that the obstacles look "weird" and do not conform to the laws of physics, so I asked you to elaborate on what you mean by that, and you answered that they are not driving along the road. The obstacles are driving along the roads in the screenshots you used above, and in the few cases where an obstacle did not drive along the road, shouldn't Apollo still account for that unexpected behavior? That is, is jaywalking something Apollo avoids?
I see that the velocity direction is off in the screenshots you used. These obstacles were generated with /apollo/modules/tools/replay_perception.py, so the script needs to be updated if the velocity direction it publishes is not correct. The predicted trajectory is not generated by the script; I think it is off as a consequence of the wrong velocity direction.
You can try running the replay_perception script to reproduce similar obstacles.
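One way to check the suspected inconsistency, sketched below under the assumption that obstacle positions are (x, y) map coordinates taken from consecutive perception messages, is to compare the heading implied by the frame-to-frame motion with the direction of the published velocity vector. The function is illustrative and not part of Apollo.

```python
# Sketch: measure the angle between an obstacle's frame-to-frame motion
# direction and the direction of its reported velocity vector.
import math

def velocity_heading_error(prev_pos, curr_pos, velocity):
    """Angle (radians) between the motion direction implied by two consecutive
    positions and the published velocity vector."""
    motion_heading = math.atan2(curr_pos[1] - prev_pos[1], curr_pos[0] - prev_pos[0])
    velocity_heading = math.atan2(velocity[1], velocity[0])
    diff = velocity_heading - motion_heading
    # Wrap to [-pi, pi] so a small error near +/-pi is not reported as huge.
    return math.atan2(math.sin(diff), math.cos(diff))

# Example: an obstacle moving roughly east but reporting a velocity pointing
# north gives an error close to pi/2, which would likely throw off prediction.
err = velocity_heading_error((0.0, 0.0), (2.0, 0.1), (0.0, 5.0))
print(abs(err))
```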
It seems this issue with replay_perception was caught sometime in March: https://github.com/ApolloAuto/apollo/pull/13617
Hi,
While experimenting with sim control, I encountered some potential safety violations, and I wonder what their root causes could be. Below are the links to the recordings. Thanks!
video1 video2 video3 video4