facebookresearch / sound-spaces

A first-of-its-kind acoustic simulation platform for audio-visual embodied AI research. It supports training and evaluating multiple tasks and applications.
https://soundspaces.org
Creative Commons Attribution 4.0 International

How to plot a top down map of the room scene #115

Closed dosssman closed 9 months ago

dosssman commented 1 year ago

Greetings.

Just wanted to ask if there is any tool in the habitat-lab / habitat-sim or sound-spaces repositories that allows plotting a top-down or bird's-eye view of a map with the actual 3D scene information, as shown in the papers?

For example, the two top-down figures from the SS v1 and SAVi papers, respectively.

I was also wondering how to recover the information about the "Path with sound" and "Path when silent" for the plots, as in the excerpt below:

[excerpt figure: trajectory showing "Path with sound" and "Path when silent"]

Thank you for your time.

ChanganVR commented 1 year ago

Hi @dosssman, the first top-down map was created using MeshLab by overlaying the acoustic pressure field on top of the mesh file.
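For anyone reading later, here is a rough sketch (not the exact workflow used for the paper) of how one could bake a scalar acoustic field onto a mesh for viewing in MeshLab, assuming you already have one pressure value per vertex and use `trimesh`; the mesh path and the `pressure` array below are placeholders:

```python
import numpy as np
import trimesh
from matplotlib import cm

# Load the scene mesh; path is an example, not a SoundSpaces output.
mesh = trimesh.load("scene.glb", force="mesh")
pressure = np.random.rand(len(mesh.vertices))  # replace with real per-vertex pressure values

# Map the field to vertex colors and export a PLY that MeshLab can display.
normalized = (pressure - pressure.min()) / (pressure.max() - pressure.min() + 1e-8)
mesh.visual.vertex_colors = (cm.viridis(normalized) * 255).astype(np.uint8)
mesh.export("scene_pressure.ply")
```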

And for the second visualization in SAVi, we used another 3D mesh viewer, which I believe is this one: https://gltf-viewer.donmccurdy.com/.

By "how to recover the information" do you mean how did I plot the trajectory with blue and red according to if these steps are sounding? I think I basically color-coded the trajectory according to whether there is sound in this function: https://github.com/facebookresearch/sound-spaces/blob/3768a5073fb305723676b07cb2a78b62f0d7f6c6/ss_baselines/common/utils.py#L231.

dosssman commented 1 year ago

Hello again.

Thank you for taking the time to look into this issue. Duly noted regarding the top-down map plotting; I will look a bit more into the tools you have suggested.

Regarding

[...] how to recover the information about the "Path with sound" and "Path when silent" for the plots, as in the excerpt below:

If my understanding is correct, in SAVi the sound is only emitted for part of an episode, usually at the beginning, so the agent has to remember it and associate it with the category of the target object to be able to find it later and solve the task.

I was referring to plotting, on the top_down_map, the portion of the path during which the target object's sound is playing in blue, and the silent remainder of the path in red.

I have actually used plot_top_down_map, but it does not seem to contain anything related to sound (silent or not). I will keep looking into it in the meantime, but I would definitely appreciate any further insights in case you recall how it was done for the paper.

Thanks a lot for the help.

ChanganVR commented 11 months ago

@dosssman I think I hard-coded that coloring part, and that code might not be in the sound-spaces repo. Basically, I passed the sound-duration variable from the simulator class to the plotting function and plotted the path with sound in one color and the path without sound in another. This should be pretty straightforward.
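A rough sketch of that, under assumed names (`sounding_steps` stands in for whatever variable the simulator exposes for how long the source plays): turn the duration into a per-step mask and feed it to the drawing routine.

```python
def sound_mask_from_duration(num_steps, sounding_steps):
    """True for the steps during which the source is still emitting sound."""
    return [step < sounding_steps for step in range(num_steps)]

# e.g. heard_sound = sound_mask_from_duration(len(trajectory), sounding_steps)
# then pass `heard_sound` to whatever routine draws the colored path.
```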

dosssman commented 11 months ago

@ChanganVR Thanks a lot for the answer.

I was wondering where one should check whether the current time step is silent or not. Thanks again for your time.

ChanganVR commented 9 months ago

@dosssman I simply checked whether the current audio spectrogram/waveforms are all zeros. This is a bit of an artificial scenario because, in the real world, there is never complete silence. But given the assumption of the task, this worked well.
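A minimal sketch of that check, assuming the audio observation is available under an observation key such as "spectrogram" (the key name may differ depending on your sensor config):

```python
import numpy as np

def is_silent(observations, key="spectrogram"):
    """Return True if the audio observation at the current step is all zeros."""
    audio = np.asarray(observations[key])
    return not np.any(audio)
```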

dosssman commented 9 months ago

Thanks a lot for the answer.