microsoft / ProjectAcoustics

Microsoft Project Acoustics
https://aka.ms/acoustics
Creative Commons Attribution 4.0 International
138 stars 21 forks

One sound source hides another sound source #69

Closed dvirsamuel closed 2 years ago

dvirsamuel commented 3 years ago

Hi,

TLDR: We are trying to create a scene where one sound source (Acoustic Component actor) hides another sound source (Acoustic Component actor). We would like to simulate the sound of the hidden object as in real life.

We have a movable cube sound source with sound X and another movable small triangle sound source with sound Y. The player is static and looks at both objects moving. The cube is bigger than the triangle, so from the player's point of view, the triangle hides behind the cube. We would expect that the cube would behave similarly to a wall and the sound of the triangle would be lower. But this is not the case. Instead, we hear the triangle the same as there is no cube at all.

Does Project Acoustics handle the case of two moving objects, or does it only calculate the sound between a sound source and static actors (walls, meshes, etc.)?

Thanks.

Environment: UE 4.26, Windows 7 Pro

MikeChemi commented 3 years ago

While Project Acoustics supports moving sound sources, it currently only supports static geometry, including any geometry attached to your sound sources. That is, any actors that are moveable do not have their geometry included in the simulation.

dvirsamuel commented 3 years ago

Thank you very much. Just to make it clearer for us: is there any difference between one object occluding another and one object containing another? Is there any workaround for scenes in which occlusion and containment scenarios occur?

MikeChemi commented 3 years ago

I'm not sure I understand the question, sorry. Let me try to explain differently.

Project Acoustics requires a bake of the scene's geometry to simulate audio propagation in that scene. The bake is performed against a snapshot of the scene's geometry. If any of the geometry in the scene moves for any reason, you'll need to take a new snapshot of the scene's geometry, bake against that new snapshot, and then you'll be able to hear the results of the geometry change.

AvivSham commented 3 years ago

@MikeChemi I have another question; I will try to make it as clear as possible. Let's say we have a room with two balls, both of which have sound sources; in addition, we have a microphone in the middle of the room. The movement of the balls is deterministic and predefined, and the "viewer"/"player" can't affect the scene in any way.

If I understand you correctly, such a mode is not supported by Project Acoustics?

nikunjragh commented 3 years ago

@AvivSham: Dynamic sources are supported. Dynamic geometry is not (except doors). Every source's sound will interact with all static geometry in the scene, and the sum total over all sources will be heard by the listener. In the case you described, if you want the sound of the hidden ball to be occluded by the geometry of the ball in front of it, we do not support that. Dynamic sources are OK, dynamic listeners are OK, dynamic portals are OK. Dynamic occluders are not supported currently. The issue has nothing to do with two sound sources or their motion: take one static sound source and place a box in front of it, and the box will not occlude the source until you rebake the scene with the box in its new position.

AvivSham commented 3 years ago

Got it. Is there any workaround for this? I'm trying to understand how computer games work, as most scenes (at least in modern games) are cluttered with moving objects, each emitting a different sound, and some occlude each other (e.g. GTA). Can you please help us understand how we can achieve what we are looking for?

MikeChemi commented 3 years ago

Is there any workaround for this?

Not with Project Acoustics alone, no. This is a new feature for which we are still in the research phase, looking for a scalable solution.

... how computer games work... occlude each other?

Lots of games actually don't have moving objects occluding sound sources behind them, unless the occluding objects are quite large. For example, humanoid objects typically don't do any occlusion, but large tanks might. These solutions often involve geometric tricks, such as ray casts, to determine whether or not to engage a low pass filter. As we don't currently support this feature, we don't have a recommended solution to point you towards at this time.
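The geometric trick described above can be sketched in plain C++. This is a minimal, hypothetical illustration (not Project Acoustics or Unreal API): moving occluders are approximated as bounding spheres, and a segment cast from the listener to the source decides whether to engage a low pass filter.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// A moving occluder approximated as a bounding sphere (hypothetical stand-in
// for real scene geometry).
struct Sphere { Vec3 center; double radius; };

// True if the segment from listener to source passes through the sphere
// (standard ray-sphere intersection, clamped to the segment).
static bool segmentHitsSphere(Vec3 listener, Vec3 source, const Sphere& s) {
    Vec3 d = sub(source, listener);   // segment direction
    Vec3 m = sub(listener, s.center); // listener position relative to sphere
    double a = dot(d, d);
    double b = 2.0 * dot(m, d);
    double c = dot(m, m) - s.radius * s.radius;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return false;     // the line misses the sphere entirely
    double sq = std::sqrt(disc);
    double t0 = (-b - sq) / (2.0 * a);
    double t1 = (-b + sq) / (2.0 * a);
    // A hit only counts if it lies between listener (t=0) and source (t=1).
    return (t0 >= 0.0 && t0 <= 1.0) || (t1 >= 0.0 && t1 <= 1.0);
}

// Decide whether to engage a low pass filter on this source's voice.
static bool shouldLowPass(Vec3 listener, Vec3 source,
                          const std::vector<Sphere>& occluders) {
    for (const Sphere& s : occluders)
        if (segmentHitsSphere(listener, source, s)) return true;
    return false;
}
```

In an actual game this check would typically run on the engine's physics ray cast (against real collision shapes) and the result would drive a smoothed filter cutoff rather than a hard on/off toggle.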

AvivSham commented 3 years ago

What if we rebake the scene each frame? First rebake, then simulate the audio signal?

MikeChemi commented 3 years ago

That's certainly one possibility, although I'm not sure how feasible it is. Brainstorming through the specifics...

Assuming a frame rate of 60 FPS, that gives you 16 milliseconds (worst case) to switch from one bake to the next, do the new query, and apply the changes. In general, audio engines operate at 20 ms cadences (not always true, but a good average), so you could probably get away with updating less often than every frame. Call it every other frame, and that gives you 32 milliseconds. Loading an ACE file and doing a query in that time is a tight squeeze, so maybe you'd need multiple instances of the acoustics manager and would swap between them to let each one load separately.
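The swap-between-instances idea resembles a standard double-buffering pattern. A sketch under heavy assumptions: `AceBake`, `query`, and the class below are illustrative stand-ins, not the real Project Acoustics API, and a real version would load on a background thread.

```cpp
#include <memory>
#include <string>
#include <utility>

// Hypothetical stand-in for a loaded ACE bake; NOT the real Project
// Acoustics API. A real load would parse the file asynchronously.
struct AceBake {
    std::string file;
    explicit AceBake(std::string f) : file(std::move(f)) {}
    // In a real integration this would run an acoustic query against the bake.
    std::string query() const { return "params from " + file; }
};

// Two slots: one serves this frame's queries while the other loads the
// next bake, then they swap between audio updates.
class DoubleBufferedAcoustics {
    std::unique_ptr<AceBake> active;   // currently serving queries
    std::unique_ptr<AceBake> pending;  // next bake being loaded

public:
    // Kick off loading the next bake (synchronous here for simplicity).
    void beginLoad(const std::string& file) {
        pending = std::make_unique<AceBake>(file);
    }

    // Once the pending bake is ready, promote it to active.
    void swap() {
        if (pending) active = std::move(pending);
    }

    const AceBake* current() const { return active.get(); }
};
```

The point of the pattern is that queries never stall on file I/O: the active bake keeps answering while the pending one loads, at the cost of holding two bakes in memory.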

Then there's the issue of bake management. How long a duration are you looking for? Assuming we use the 30 bakes per second from above, that's a lot of bakes to both generate and manage: 1800 per minute! Our engine is not set up to share data between bakes, so you're looking at a lot of disk space and file I/O.
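The arithmetic behind those numbers, spelled out (assuming 60 FPS with an update every other frame, as above):

```cpp
// Budget arithmetic for the per-frame rebake idea discussed above.
constexpr int kFps = 60;
constexpr int kFramesPerUpdate = 2;                       // every other frame
constexpr int kBakesPerSecond = kFps / kFramesPerUpdate;  // 30 bakes/s
constexpr int kBakesPerMinute = kBakesPerSecond * 60;     // 1800 bakes/min
constexpr int kBudgetMs = kFramesPerUpdate * 1000 / kFps; // ~33 ms per update
```

Even before disk space, generating 1800 bakes per minute of gameplay is the dominant cost, since each bake is an offline wave simulation.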

Assuming none of this is prohibitive, the last bit is coding it up. This will be very specific to your use case and will need lots of careful tuning, given the tight timing budgets.

I honestly would not recommend this approach. I have no idea if it would actually work or how it would sound. I would urge you to experiment with run-time solutions for this type of effect.