w3c / mediacapture-fromelement

API to create a MediaStream from Media Element
https://w3c.github.io/mediacapture-fromelement

Should tracks captured from a media element fire "ended" when ending? #77

Closed Pehrsons closed 5 years ago

Pehrsons commented 5 years ago

Spec reads:

A captured MediaStreamTrack ends when playback ends (and the ended event fires) or when the track that it captures is no longer selected or enabled for playback. A track is no longer selected or enabled if the source is changed by setting the src or srcObject attributes of the media element. The steps in MediaStreamTrack.stop() are performed on the MediaStreamTrack when it ends.

The steps for MediaStreamTrack.stop() do not fire "ended". IIRC this was motivated by there not being a need since the application knows it called stop anyway.

I would have expected us to fire "ended" here. It does seem like the original intention per https://github.com/w3c/mediacapture-fromelement/issues/23#issuecomment-277104443.

martinthomson commented 5 years ago

Yes, I think that citing stop() was a matter of convenience, but then it stopped doing the one thing it was being referenced for.

guest271314 commented 5 years ago

@Pehrsons Interestingly, at Chromium 75, stop() does not dispatch the ended event, and setting enabled to false does not dispatch the mute event.

Pehrsons commented 5 years ago

@guest271314 Good, then they're spec compliant.

guest271314 commented 5 years ago

@Pehrsons The ended event should be dispatched when stop() is called on a MediaStreamTrack, meaning Chromium implementation is not specification compliant (https://github.com/w3c/mediacapture-fromelement/issues/78; https://github.com/w3c/mediacapture-fromelement/issues/78#issuecomment-492297460; https://bugs.chromium.org/p/chromium/issues/detail?id=963018), correct?

https://www.w3.org/TR/mediacapture-streams/

A MediaStreamTrack object is said to end when the source of the track is disconnected or exhausted.

When a MediaStreamTrack track ends for any reason other than the stop() method being invoked, the User Agent MUST queue a task that runs the following steps:

  1. If the track's readyState attribute has the value ended already, then abort these steps.

  2. Set track's readyState attribute to ended.

  3. Notify track's source that track is ended so that the source may be stopped, unless other MediaStreamTrack objects depend on it.

  4. Fire a simple event named ended at the object.

From this perspective (front-end), if ended is not dispatched when stop() is called and mute is not dispatched when enabled is set to false, these events are unreliable and essentially useless.
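The spec steps quoted above can be sketched as a runnable simulation. This is a hypothetical mock, not a real MediaStreamTrack: Node's global EventTarget and Event stand in for the platform interfaces, and the only point is to show that ending from the source runs the four steps and fires "ended", while stop() changes readyState without firing anything.

```javascript
// Hypothetical simulation of the spec's ending steps; MockTrack is a
// stand-in for MediaStreamTrack, using Node's global EventTarget/Event.
class MockTrack extends EventTarget {
  constructor() {
    super();
    this.readyState = "live";
  }
  // Steps the UA runs when the track ends for any reason OTHER than stop():
  endFromSource() {
    if (this.readyState === "ended") return; // step 1: already ended, abort
    this.readyState = "ended";               // step 2
    // step 3 (notify the source) is omitted in this sketch
    this.dispatchEvent(new Event("ended"));  // step 4
  }
  // stop() sets readyState but, per the current draft, fires no "ended".
  stop() {
    this.readyState = "ended";
  }
}

const a = new MockTrack();
let fired = 0;
a.addEventListener("ended", () => fired++);
a.endFromSource(); // source exhausted -> "ended" fires once
a.endFromSource(); // aborted by step 1 -> no second event

const b = new MockTrack();
b.addEventListener("ended", () => fired += 100);
b.stop();          // no "ended" dispatched

console.log(fired, a.readyState, b.readyState); // 1 ended ended
```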

Consider an excerpt of code from https://github.com/guest271314/MediaFragmentRecorder/blob/imagecapture-audiocontext-readablestream-writablestream/MediaFragmentRecorder.html (similar code using addtrack at this branch https://github.com/guest271314/MediaFragmentRecorder/blob/webrtc-replacetrack-htmlmediaelement-capturestream-addtrack/MediaFragmentRecorder.html)

            // captureStream() called before playback begins
            // we will try to rely on the MediaStream addtrack event and the
            // mute and ended events as signals to perform tasks
            const playlistStream = captureStream(playlist);
            const recorder = new MediaRecorder(mediaStreamDestination.stream, {
              // https://www.mpegla.com/wp-content/uploads/n-10-08-26.pdf
              mimeType: "video/x-matroska;codecs=avc1"
            });
            // ...
            playlistStream.addEventListener("addtrack", async e => {
              console.log(e.type, e.track);
              if (e.track.kind === "video") {
                try {
                  await e.track.applyConstraints(videoConstraints);
                  const {id} = e.track;
                  const imageCapture = new ImageCapture(e.track);
                  new ReadableStream({
                      async start(controller) {
                        console.log("reader start");
                        return readableStreamControllers.set(id, controller);
                      },
                      async pull(controller) {
                        if (e.track.enabled && e.track.readyState === "live") {
                          // Firefox ImageCapture does not implement grabFrame
                          // createImageBitmap with <video> as parameter throws error if video is not playing first
                          controller.enqueue(await imageCapture.grabFrame());
                        }
                      }
                    })
                    // Firefox has not implemented WritableStream
                    .pipeTo(new WritableStream({
                      write(imageBitmap) {
                          ctx.transferFromImageBitmap(imageBitmap);
                          if ("requestFrame" in canvasStream) {
                            canvasStream.requestFrame()
                          } else {
                            canvasStream.getVideoTracks()[0].requestFrame()
                          }
                          imageBitmap.close();
                         },
                         close() {
                           console.log("writer close");
                         }
                    }))
                    .catch(e => {
                      throw e;
                    })
                } catch (e) {
                  console.error(e, e.name === "OverconstrainedError");
                }
              } else {
                // Chromium has not implemented createMediaStreamTrackSource
                const audioTrack = audioContext.createMediaStreamSource(new MediaStream([e.track]));
                audioTrack.connect(mediaStreamDestination);
              }
              if (recorder.state === "inactive") {
                recorder.start();
              } else if (recorder.state === "paused") {
                recorder.resume();
              }
              // these events are not dispatched - are unreliable to use within the code
              e.track.addEventListener("mute", e => {
                console.log(e.type, e.track);
              });
              e.track.addEventListener("unmute", e => {
                console.log(e.type, e.track);
              });
              e.track.addEventListener("ended", e => {
                console.log(e.type, e.track);
              });
            });

At the pause event of the video, stop() is called on the current video track and enabled is set to false.

The code below is essentially useless and unreliable, as ended and mute are never dispatched on the MediaStreamTrack, either when no media is being streamed (internally, without user action; the HTMLVideoElement pause event is presumably the signal that no media is being streamed) or when the end-user calls stop() or sets enabled to false.

playlist.addEventListener("pause", e => {
                  console.log(e);
                  recorder.pause();
                  const currentVideoTrack = playlistStream.getVideoTracks().find(({enabled}) => enabled);
                  const {id} = currentVideoTrack;
                  currentVideoTrack.enabled = false;
                  currentVideoTrack.stop();
                  // calling close within ReadableStream only occasionally translates to close of WritableStream being called https://github.com/web-platform-tests/wpt/issues/17011
                  readableStreamControllers.get(id).close();
                  console.log(currentVideoTrack);
                  resolve();
                }, {
                  once: true
                });

If ended and mute are never dispatched on the MediaStreamTrack, then MediaRecorder should not stop and should continue recording.
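Since the current draft guarantees no "ended" event on stop(), one workaround is to centralize the stop call in a helper that notifies listeners itself, instead of waiting on an event that never fires. This is a hypothetical sketch: stopTrack and the "apptrackstopped" event name are made up for illustration, and MockTrack stands in for a real MediaStreamTrack.

```javascript
// Hypothetical workaround: because stop() fires no "ended" event,
// route all stops through a helper that dispatches its own signal.
// MockTrack is a stand-in for MediaStreamTrack in this sketch.
class MockTrack extends EventTarget {
  readyState = "live";
  stop() { this.readyState = "ended"; } // per spec: no "ended" dispatched
}

function stopTrack(track) {
  track.stop();
  // Dispatch our own event so downstream code (recorder teardown,
  // stream cleanup) has a single reliable signal.
  track.dispatchEvent(new Event("apptrackstopped"));
}

const track = new MockTrack();
let notified = false;
track.addEventListener("apptrackstopped", () => { notified = true; });
stopTrack(track);
console.log(track.readyState, notified); // ended true
```

In real front-end code the same pattern applies: the listener would pause or stop the MediaRecorder at the point stop() is issued, rather than from an "ended" handler.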

Pehrsons commented 5 years ago

@Pehrsons The ended event should be dispatched when stop() is called on a MediaStreamTrack, meaning Chromium implementation is not specification compliant (#78; #78 (comment); https://bugs.chromium.org/p/chromium/issues/detail?id=963018), correct?

Not correct per the latest draft: https://w3c.github.io/mediacapture-main/#dom-mediastreamtrack-stop

It used to be, however, but it was removed, motivated by "the application already knows, since it called stop()" or an argument along those lines. You can probably git blame your way to it if you have some spare time on your hands.

https://www.w3.org/TR/mediacapture-streams/

Yeah, that's a version from Oct 3, 2017.

A MediaStreamTrack object is said to end when the source of the track is disconnected or exhausted. When a MediaStreamTrack track ends for any reason other than the stop() method being invoked, the User Agent MUST queue a task that runs the following steps:

  1. If the track's readyState attribute has the value ended already, then abort these steps.
  2. Set track's readyState attribute to ended.
  3. Notify track's source that track is ended so that the source may be stopped, unless other MediaStreamTrack objects depend on it.
  4. Fire a simple event named ended at the object.

From this perspective (front-end) if ended is not dispatched when stop() is called and muted is not dispatched when enabled set to false they are unreliable and essentially useless.

They are certainly not useless. "ended" is dispatched for all reasons of ending the track other than stop(). Such reasons could be:

  • a captured media element's track ended

Similarly, "muted" is dispatched when the UA wants to signal that the source of the track is not producing any data. The most prominent use of this that I know is in peer connections, where a track is muted after it has been signaled but before its connection is up (and producing data).
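That peer-connection mute behavior can be illustrated with a small mock (hypothetical names throughout; on a real remote track, muted is read-only and the mute/unmute transitions come from the UA, not from script): the track starts muted after being signaled and fires "unmute" once the connection produces data.

```javascript
// Hypothetical sketch of the mute/unmute contract for a remote
// (peer-connection) track. MockRemoteTrack is a stand-in; a real
// MediaStreamTrack's muted attribute is read-only and UA-controlled.
class MockRemoteTrack extends EventTarget {
  constructor() {
    super();
    this.muted = true; // signaled, but no data flowing yet
  }
  // UA-side transition once the connection is up and frames arrive.
  startProducingData() {
    if (!this.muted) return;
    this.muted = false;
    this.dispatchEvent(new Event("unmute"));
  }
}

const remote = new MockRemoteTrack();
const log = [];
remote.addEventListener("unmute", () => log.push("unmute"));
console.log(remote.muted); // true: the track exists before media flows
remote.startProducingData();
// muted is now false and log contains a single "unmute" entry
```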

For the rest of your comment I didn't see a question.

guest271314 commented 5 years ago

@Pehrsons

They are certainly not useless. "ended" is dispatched for all reasons of ending the track other than stop(). Such reasons could be:

  • a captured media element's track ended

Those events are useless at Chromium as neither event is fired.

Pehrsons commented 5 years ago

Please file implementation issues at the relevant implementation's issue tracker.

guest271314 commented 5 years ago

@Pehrsons Already did: https://bugs.chromium.org/p/chromium/issues/detail?id=957340. Have not had a chance to find the blame for the change. Again, in front-end code, such a configuration makes the ended event useless (at least at Chromium). From the perspective here, intuitively, the ended event should be fired when the MediaStreamTrack is stopped or ended.