WebAudio / web-audio-api

The Web Audio API v1.0, developed by the W3C Audio WG
https://webaudio.github.io/web-audio-api/

MediaStreamTrack of MediaStreamAudioDestinationNode.stream MUST render/output silence (0) #2468

Closed guest271314 closed 2 years ago

guest271314 commented 2 years ago

Describe the issue


The Web Audio API specification needs stronger (even redundant) language about what is required of a MediaStreamTrack in this specification itself, rather than only linking to the Media Capture and Streams specification, on which the Web Audio API relies for its MediaStream and MediaStreamTrack definitions and capabilities. A reader of this specification might not click through to Media Capture and Streams, and should not need to.

https://www.w3.org/TR/mediacapture-streams/#life-cycle-and-media-flow:

A muted or disabled MediaStreamTrack renders either silence (audio) ...

A muted track will however, regardless of the enabled state, render silence ...

The result for the consumer is the same in the sense that whenever MediaStreamTrack is muted or disabled (or both) the consumer gets zero-information-content, which means silence for audio ...

I suggest adding the following bold, italic text to this specification so there is no ambiguity as to what is expected, reusing language regarding silence already present in 1.7. The AudioScheduledSourceNode Interface:

... This interface is an audio destination representing a MediaStream with a single MediaStreamTrack whose kind is "audio". The MediaStreamTrack MUST render/output silence (0) - before any AudioNodes are connected to and after any AudioNodes are disconnected from the MediaStreamAudioDestinationNode.
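The proposed requirement is observable: sampling the destination node's track before any source is connected should yield frames containing only zeros. The sketch below (illustrative, not normative; the helper name `isSilentFrame` is mine) checks that via an AnalyserNode, assuming a conforming implementation:

```javascript
// Pure helper: true when every sample in the frame is exactly zero,
// i.e. the frame is digital silence as the proposed text requires.
// (Illustrative name; not part of any specification.)
function isSilentFrame(samples) {
  return samples.every((sample) => sample === 0);
}

// Browser-only sketch: read the destination node's stream back through an
// AnalyserNode before connecting any source. Under the proposed language,
// every frame observed here should be silent.
if (typeof AudioContext !== 'undefined') {
  const ac = new AudioContext();
  const msd = new MediaStreamAudioDestinationNode(ac);
  const source = new MediaStreamAudioSourceNode(ac, { mediaStream: msd.stream });
  const analyser = new AnalyserNode(ac);
  source.connect(analyser);
  const frame = new Float32Array(analyser.fftSize);
  setInterval(() => {
    analyser.getFloatTimeDomainData(frame);
    console.assert(isSilentFrame(frame), 'expected silence (all zeros)');
  }, 1000);
}
```

On a non-conforming implementation the track produces no data at all, so the analyser would never see frames in the first place, which is the distinction this issue is about.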

If it's really an implementation bug, consider filing an issue for your browser at

Already filed. It was marked WontFix; Chrome then banned me from bugs.chromium.org, so I could not post any further replies there. The initial respondents evidently did not grasp what the bug report is about, even though it is spelled out in the title and description: Issue 1262796: MediaStreamTrack does not render silence https://bugs.chromium.org/p/chromium/issues/detail?id=1262796.

Where Is It

1.23. The MediaStreamAudioDestinationNode Interface

This interface is an audio destination representing a MediaStream with a single MediaStreamTrack whose kind is "audio". This MediaStream is created when the node is created and is accessible via the stream attribute. This stream can be used in a similar way as a MediaStream obtained via getUserMedia(), and can, for example, be sent to a remote peer using the RTCPeerConnection (described in [webrtc]) addStream() method.

Additional Information

The following code demonstrates that Chrome/Chromium does not conform to the Media Capture and Streams specification with respect to a MediaStreamTrack of kind "audio" rendering silence. Live demonstration at plnkr https://plnkr.co/edit/XNwNwANBuMzaBKxj?preview. On Firefox 96 both <audio> elements play; on Chromium 100 neither <audio> element plays.

<!DOCTYPE html>

<html>
  <head>
    <title>MediaStreamTrack does not render silence on Chromium</title>
    <!-- https://bugs.chromium.org/p/chromium/issues/detail?id=1262796 -->
    <!-- https://www.w3.org/TR/mediacapture-streams/#life-cycle-and-media-flow -->
  </head>

  <body>
    <script>
      var webrtc = new RTCPeerConnection();
      var transceiver = webrtc.addTransceiver('audio');
      var { track: webrtc_track } = transceiver.receiver;
      var webrtc_audio_element = new Audio();
      webrtc_audio_element.controls = webrtc_audio_element.autoplay = true;
      document.body.appendChild(webrtc_audio_element);
      webrtc_audio_element.srcObject = new MediaStream([webrtc_track]);
      webrtc_audio_element.ontimeupdate = webrtc_audio_element.onplaying = (
        e
      ) =>
        console.assert(e.target.currentTime > 0, [
          e.target.currentTime,
          e.type,
        ]);

      var ac = new AudioContext();
      var msd = new MediaStreamAudioDestinationNode(ac);
      var { stream } = msd;
      var [webaudio_track] = stream.getAudioTracks();
      var webaudio_element = new Audio();
      webaudio_element.controls = webaudio_element.autoplay = true;
      document.body.appendChild(webaudio_element);
      webaudio_element.srcObject = new MediaStream([webaudio_track]);
      webaudio_element.ontimeupdate = webaudio_element.onplaying = (e) =>
        console.assert(e.target.currentTime > 0, [
          e.target.currentTime,
          e.type,
        ]);
    </script>
  </body>
</html>

I don't think Chrome has ever acknowledged that an audio MediaStreamTrack does not render silence. The non-conformance with the controlling specification, and the resulting implementation bug, will never be fixed without that acknowledgement first. That can only happen by making the non-conformance clear and by not accepting the bug going unfixed. WPT tests for MediaStreamAudioDestinationNode rendering/outputting silence should also be added, so that the facts cannot be denied or ignored by the dependent specification (that is, Web Audio API).

The failure to render/output silence for the MediaStreamTrack in MediaStreamAudioDestinationNode.stream adversely impacts any downstream graph that expects silence to be rendered perpetually, even when no source AudioNode is connected.
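One concrete downstream consumer affected by this (a hedged sketch; the helper name `zeroSampleRatio` is mine, and the behavior described for a non-conforming browser is my reading of the bug above): a MediaRecorder attached to the stream expects continuous media, so a track that renders silence keeps the recorder fed, while a track that renders nothing can leave it with no data before any source is connected.

```javascript
// Pure helper (illustrative): fraction of zero-valued samples in a frame,
// useful for distinguishing "rendering silence" from "rendering nothing".
function zeroSampleRatio(samples) {
  let zeros = 0;
  for (const s of samples) if (s === 0) zeros += 1;
  return samples.length ? zeros / samples.length : 1;
}

// Browser-only sketch: record the destination stream before connecting any
// source. Per Media Capture and Streams, the track renders silence, so the
// recorder should still emit (silence-encoded) chunks; on a non-conforming
// implementation there is no data to encode.
if (typeof MediaRecorder !== 'undefined') {
  const ac = new AudioContext();
  const msd = new MediaStreamAudioDestinationNode(ac);
  const recorder = new MediaRecorder(msd.stream);
  recorder.ondataavailable = ({ data }) =>
    console.log('chunk bytes:', data.size);
  recorder.start(1000); // request a chunk roughly every second
}
```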

hoch commented 2 years ago

It looks like a bug report for the Chrome implementation. I'll take this and open (or reopen) the relevant issues on crbug.com.

hoch commented 2 years ago

I reopened crbug.com/1262796 (which was closed due to inactivity) and will engage with the project owner there.

Based on the WG's discussion on 2022-05-05, we believe a spec change to the Web Audio API is not needed. That said, please feel free to reopen the issue if needed.