ApolloAuto / apollo

An open autonomous driving platform
Apache License 2.0

What is the difference between Dreamland and Dreamview? #6569

cdemirsoy closed this issue 5 years ago

cdemirsoy commented 5 years ago

Hi there again,

I have been asking a lot of questions recently, but I think they will be helpful for other learners like me.

So, we already know that in the offline case Dreamview only visualizes recorded data, while in the real-time case it visualizes the perceived data and the output produced by the Apollo modules.

On the Azure platform, we can actually modify the code and see whether it passes some tests. For this reason, I wanted to ask how the testing is actually done in Dreamland. Does it also test the control output? If so, is it actually a complete simulation platform that is used to develop the Apollo autonomous driving platform?

Any additional information will be very useful!

Best Regards, Canberk Demirsoy

gaigaidevelop commented 5 years ago

One method is to delete all data other than sensor data from the recorded bag and replay the sensor data alone (a sketch of such a filter follows the topic listing below).

--------------- delete ---------------

    topics:  /apollo/canbus/chassis              3490 msgs    : pb_msgs/Chassis
             /apollo/control                     3489 msgs    : pb_msgs/ControlCommand
             /apollo/localization/msf_gnss         35 msgs    : pb_msgs/LocalizationEstimate
             /apollo/localization/msf_lidar       175 msgs    : pb_msgs/LocalizationEstimate
             /apollo/localization/msf_status     6535 msgs    : pb_msgs/LocalizationStatus
             /apollo/localization/pose           6277 msgs    : pb_msgs/LocalizationEstimate
             /apollo/perception/obstacles         348 msgs    : pb_msgs/PerceptionObstacles
             /apollo/perception/traffic_light     105 msgs    : pb_msgs/TrafficLightDetection
             /apollo/planning                     349 msgs    : pb_msgs/ADCTrajectory
             /apollo/prediction                   348 msgs    : pb_msgs/PredictionObstacles

--------------- delete ---------------

         /apollo/sensor/conti_radar            465 msgs    : pb_msgs/ContiRadar           
         /apollo/sensor/gnss/best_pose          35 msgs    : pb_msgs/GnssBestPose         
         /apollo/sensor/gnss/corrected_imu    3490 msgs    : pb_msgs/Imu                  
         /apollo/sensor/gnss/gnss_status        35 msgs    : pb_msgs/GnssStatus           
         /apollo/sensor/gnss/imu              6934 msgs    : pb_msgs/Imu                  
         /apollo/sensor/gnss/ins_stat           35 msgs    : pb_msgs/InsStat              
         /apollo/sensor/gnss/odometry         3490 msgs    : pb_msgs/Gps                  
         /apollo/sensor/gnss/rtk_eph            66 msgs    : pb_msgs/GnssEphemeris        
         /apollo/sensor/gnss/rtk_obs            69 msgs    : pb_msgs/EpochObservation     
         /tf                                 10398 msgs    : tf2_msgs/TFMessage
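
To make that concrete, here is a minimal sketch of such a filter, assuming a ROS-based Apollo recording (the file names demo.bag and sensors_only.bag are placeholders). It copies only the /apollo/sensor topics and /tf into a new bag with the standard rosbag Python API; the same effect can be had with the rosbag filter command-line tool.

    # Minimal sketch: keep only the raw sensor topics (and /tf) so that the
    # onboard modules must regenerate localization, perception, prediction,
    # planning and control output themselves on replay.
    import rosbag

    def is_sensor_topic(topic):
        # Everything under /apollo/sensor, plus the transform tree.
        return topic.startswith("/apollo/sensor/") or topic == "/tf"

    with rosbag.Bag("demo.bag") as inbag, \
         rosbag.Bag("sensors_only.bag", "w") as outbag:
        for topic, msg, t in inbag.read_messages():
            if is_sensor_topic(topic):
                outbag.write(topic, msg, t)
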
cdemirsoy commented 5 years ago

Hi there,

Thanks for the answer! So we input only the sensor values to the simulation, and then the autonomous driving modules generate their outputs from them.

Thank you, Canberk Demirsoy

unacao commented 5 years ago

This could be one method. If the raw sensor data is provided as the only input, then the perception, prediction, and planning modules can be hooked up for testing. In the current Azure environment, the perception and routing outputs are provided as input to test the prediction and planning modules. In future releases, different dynamic models can be plugged in.
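
As a rough illustration of that hookup (not the actual Azure harness; the routing topic name /apollo/routing_response and the bag name demo.bag are assumptions), the recorded perception and routing messages could be republished from a bag so that the prediction and planning modules consume them as live input:

    # Conceptual sketch: republish recorded perception/routing output so
    # that prediction and planning run against it as if it were live data.
    import rosbag
    import rospy

    REPLAY_TOPICS = ["/apollo/perception/obstacles",
                     "/apollo/perception/traffic_light",
                     "/apollo/routing_response"]  # assumed routing topic name

    rospy.init_node("module_input_replay")
    publishers = {}
    with rosbag.Bag("demo.bag") as bag:
        for topic, msg, t in bag.read_messages(topics=REPLAY_TOPICS):
            if topic not in publishers:
                publishers[topic] = rospy.Publisher(topic, type(msg),
                                                    queue_size=10)
                rospy.sleep(0.5)  # let subscribers connect before publishing
            publishers[topic].publish(msg)

A real harness would also preserve the original message timing between publishes; this sketch pushes messages out as fast as they are read.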

natashadsouza commented 5 years ago

Closing this issue as it appears to be resolved.