ubc-subbots / sound-localization-simulator


MUSIC Experiment with Depth Sensor #22

Open dvirhilu opened 2 years ago

dvirhilu commented 2 years ago

Compare the effectiveness of sweeping over the (r, phi) parameter space vs. sweeping over the (phi, theta) parameter space. These are controlled by toggling the use_depth_sensor flag in the MUSIC stage initialization. If you set the use_depth_sensor flag to true, the stage will grab the z values from the depth sensor and then iterate over r and phi to compute the time delays. If the depth sensor is not used, the stage will instead iterate over phi and theta.
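
To make the two sweeps concrete, here is a rough sketch of the time delays each one computes. The coordinate convention, function names, and reference hydrophone are illustrative assumptions, not the simulator's actual code:

```python
import numpy as np

SPEED_OF_SOUND = 1480.0  # m/s, nominal value for water

def delays_r_phi(r, phi, z, hydrophones):
    """Delays for the (r, phi) sweep, with z taken from the depth sensor."""
    # Horizontal offset implied by the 3D range r and the known depth z
    rho = np.sqrt(max(r**2 - z**2, 0.0))
    source = np.array([rho * np.cos(phi), rho * np.sin(phi), z])
    dists = np.linalg.norm(hydrophones - source, axis=1)
    return (dists - dists[0]) / SPEED_OF_SOUND  # relative to hydrophone 0

def delays_phi_theta(phi, theta, hydrophones):
    """Delays for the (phi, theta) sweep (far-field plane-wave model)."""
    direction = np.array([np.sin(theta) * np.cos(phi),
                          np.sin(theta) * np.sin(phi),
                          np.cos(theta)])
    proj = hydrophones @ direction  # projection onto the arrival direction
    return (proj[0] - proj) / SPEED_OF_SOUND  # relative to hydrophone 0

# Example: square array of 4 hydrophones, 10 cm apart, in the x-y plane
hydrophones = 0.1 * np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]])
print(delays_r_phi(10.0, np.radians(30), -3.0, hydrophones))
print(delays_phi_theta(np.radians(30), np.radians(70), hydrophones))
```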

Note that if you are using the (r,phi) parameter space, you also need to specify the constraints on r at MUSIC initialization.
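
For example, the two configurations could look something like this (only use_depth_sensor is the real flag name; the r bounds are placeholders for however the MUSIC stage actually takes its constraints):

```python
# Sweep over (r, phi): z is read from the depth sensor, so r must be bounded.
music_config_with_depth = dict(
    use_depth_sensor=True,
    r_min=1.0,   # placeholder: closest range to sweep [m]
    r_max=50.0,  # placeholder: farthest range to sweep [m]
)

# Sweep over (phi, theta): depth sensor unused, no r constraints needed.
music_config_angles_only = dict(
    use_depth_sensor=False,
)
```

Whichever one you pick gets unpacked into the MUSIC stage constructor (adjust the keyword names to the real signature).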

You can create a new experiment that builds on music/default_music_exp.py.

Some things to consider when running the experiment:

  1. Make sure to play around with the depth sensor uncertainty in global_vars. Our depth sensor datasheet can be found at this link: https://bluerobotics.com/store/sensors-sonars-cameras/sensors/bar30-sensor-r1/
  2. If you are sweeping across phi, try it out for various values of theta, and vice versa
  3. Try to play around with the MUSIC parameter sweep resolution
  4. Try to play around with different hydrophone geometries and see which ones perform better
  5. Make sure to run it for all four frequencies: 25, 30, 35, and 40 kHz (run the experiment four separate times, once per frequency); see the driver sketch after this list
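
For point 5, a small driver along these lines can run all four frequencies back to back; run_music_experiment is a stand-in for however the new experiment built on music/default_music_exp.py actually gets launched:

```python
# Placeholder driver: loops over the four pinger frequencies and both
# parameter spaces. Replace the stub with a call into the real experiment.
PINGER_FREQUENCIES_HZ = (25_000, 30_000, 35_000, 40_000)

def run_music_experiment(frequency_hz, use_depth_sensor):
    # Stub: wire this to the experiment script built on music/default_music_exp.py
    print(f"f = {frequency_hz / 1000:.0f} kHz, use_depth_sensor = {use_depth_sensor}")

if __name__ == "__main__":
    for freq in PINGER_FREQUENCIES_HZ:
        for use_depth_sensor in (True, False):
            run_music_experiment(freq, use_depth_sensor)
```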

If you run into any issues, message Dvir.