Closed guest271314 closed 4 years ago
No, they're orthogonal.
@Pehrsons I do not get that from the specification:

> A muted track will however, regardless of the enabled state, render silence and blackness. A disabled track is logically equivalent to a muted track, from a consumer point of view.

> Enabled/disabled on the other hand is available to the application to control (and observe) via the enabled attribute. The result for the consumer is the same in the sense that whenever MediaStreamTrack is muted or disabled (or both) the consumer gets zero-information-content, which means silence for audio and black frames for video. In other words, media from the source only flows when a MediaStreamTrack object is both unmuted and enabled. For example, a video element sourced by a muted or disabled MediaStreamTrack (contained in a MediaStream), is playing but rendering blackness.
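The consumer-side rule quoted above can be sketched as a tiny model (plain JS, not a real `MediaStreamTrack`; the `consumerOutput` helper is illustrative): a consumer renders actual media only when the track is both unmuted and enabled.

```javascript
// Toy model of the quoted consumer-side rule: a muted OR disabled track
// yields zero-information-content (silence / black frames).
function consumerOutput(track) {
  // track: { kind: 'audio' | 'video', muted: boolean, enabled: boolean }
  if (track.muted || !track.enabled) {
    return track.kind === 'audio' ? 'silence' : 'black frames';
  }
  return 'media';
}

console.log(consumerOutput({ kind: 'video', muted: true,  enabled: true  })); // black frames
console.log(consumerOutput({ kind: 'audio', muted: false, enabled: false })); // silence
console.log(consumerOutput({ kind: 'video', muted: false, enabled: true  })); // media
```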
Is your interpretation of the specification that setting `enabled` to `false` should not directly initiate any event being dispatched; specifically the `mute` event?

If that is the case, then what is the purpose of the `enabled` attribute?

And further, if the `MediaStreamTrack` is neither `muted` nor `ended` when the `src` of a `<video>` element changes, or when `enabled` is set to `false`, then `MediaRecorder` should not stop recording: https://bugs.chromium.org/p/chromium/issues/detail?id=957340.
> A muted track will however, regardless of the enabled state, render silence and blackness. A disabled track is logically equivalent to a muted track, from a consumer point of view.

This states, quite explicitly, that muted and enabled are orthogonal.
> Enabled/disabled on the other hand is available to the application to control (and observe) via the enabled attribute.

There you have it. Enabled is a control surface for the application.
> The result for the consumer is the same in the sense that whenever MediaStreamTrack is muted or disabled (or both) the consumer gets zero-information-content, which means silence for audio and black frames for video. In other words, media from the source only flows when a MediaStreamTrack object is both unmuted and enabled. For example, a video element sourced by a muted or disabled MediaStreamTrack (contained in a MediaStream), is playing but rendering blackness.
> Is your interpretation of the specification that setting `enabled` to `false` should not directly initiate any event being dispatched; specifically the `mute` event?

It should not.
> If that is the case then what is the purpose of the `enabled` attribute?

Per above, a control for the application. "muted" is for the UA to signal to the application whether the track's source is producing something or not. Setting or unsetting the "muted" state is usually explicitly outlined in specs, like received tracks from peer connections.
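The separation described here can be sketched as a plain-JS model (not a real `MediaStreamTrack`; `uaSetMuted` is a hypothetical UA-internal hook): the application writes `enabled` with no event fired, while only the UA flips the muted state, and only that flip dispatches `mute`/`unmute`.

```javascript
// Toy model of the enabled/muted split: enabled is app-controlled and silent,
// muted is UA-controlled and observable via mute/unmute events.
class ModelTrack extends EventTarget {
  #muted = false;
  enabled = true;            // application-controlled; changing it fires nothing
  get muted() { return this.#muted; }
  uaSetMuted(value) {        // hypothetical UA-internal hook, not a real API
    if (this.#muted === value) return;
    this.#muted = value;
    this.dispatchEvent(new Event(value ? 'mute' : 'unmute'));
  }
}

const track = new ModelTrack();
const fired = [];
track.addEventListener('mute', () => fired.push('mute'));
track.addEventListener('unmute', () => fired.push('unmute'));

track.enabled = false;   // app control: no event fires
track.uaSetMuted(true);  // UA signal: fires "mute"
track.uaSetMuted(false); // UA signal: fires "unmute"
console.log(fired);      // [ 'mute', 'unmute' ]
```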
> And further, if the `MediaStreamTrack` is neither `muted` nor `ended` when the `src` of a `<video>` element changes, or when `enabled` is set to `false`, then `MediaRecorder` should not stop recording: https://bugs.chromium.org/p/chromium/issues/detail?id=957340.

If you're capturing a video element with `captureStream()` and changing the element's `src`, tracks will end, so a MediaRecorder recording those tracks will stop.

Fiddling with the tracks' `enabled` state should not affect a MediaRecorder recording those tracks. As you quoted above, they should be recorded as black or silence, depending on what kind of track it is.
@Pehrsons One reason for filing this issue is trying to use the `unmute`, `mute` and `ended` events for control flow. From what I have found, those events are not dispatched reliably, specifically at Chromium (when trying to compose code that outputs the same result at Firefox and Chromium, accounting for the obvious differences between the two browsers; e.g., `mozCaptureStream()`). While Firefox does not currently support `WritableStream` or `grabFrame()`, consider this issue https://bugs.chromium.org/p/chromium/issues/detail?id=967459, where in recent tests `unmute` and `mute` are dispatched for only 1 `MediaStreamTrack` among 7 `MediaStreamTrack`s and the `ended` event is never dispatched.
In such a case should the developer create a recursive function which polls `readyState` to determine if the track is "ended"?
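The polling fallback being asked about could look roughly like this (a hedged sketch: `waitForEnded` is a hypothetical helper, and the `fake` track is a stand-in object, not a real `MediaStreamTrack`):

```javascript
// Sketch of a polling fallback for an unreliable "ended" event:
// repeatedly check readyState until it reads "ended".
function waitForEnded(track, intervalMs = 100) {
  return new Promise((resolve) => {
    const timer = setInterval(() => {
      if (track.readyState === 'ended') {
        clearInterval(timer);
        resolve();
      }
    }, intervalMs);
  });
}

// Usage with a fake track that "ends" after a short delay:
const fake = { readyState: 'live' };
setTimeout(() => { fake.readyState = 'ended'; }, 50);
waitForEnded(fake, 10).then(() => console.log('ended observed'));
```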
For a bug in Chromium, ask the Chromium folks what the best workaround is. If "ended" is buggy I can obviously not vouch here that `readyState` won't be.
@Pehrsons Is setting and getting of the `enabled` attribute asynchronous?
@guest271314 is what asynchronous to setting `enabled`?

Media flow turning video black/non-black or audio silent/audible? Yes, I'd say that's async. From the spec I interpret that setting the enabled attribute sets the enabled state synchronously. Setting the enabled state will in turn enable or disable the media flow from the source. Media flow from the source depends on the source, but typically it doesn't flow on the main thread, so for those cases it cannot be sync with setting the enabled state on the main thread (well, not without blocking the main thread anyway). Thus, even though the spec doesn't explicitly say so, no guarantee about it being sync can be given.
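The timing point here can be sketched with a toy model (plain JS, not a real track; the microtask merely stands in for off-main-thread media flow): the `enabled` *state* changes synchronously in the setter, while the observable flow change lands later.

```javascript
// Toy model: state changes synchronously, the observable "flow" catches up
// asynchronously (a queued microtask stands in for the media pipeline).
class FlowModel {
  #enabled = true;
  flowing = true;                // what the sink currently observes
  set enabled(v) {
    this.#enabled = v;           // synchronous state change
    queueMicrotask(() => {       // flow catches up asynchronously
      this.flowing = v;
    });
  }
  get enabled() { return this.#enabled; }
}

const m = new FlowModel();
m.enabled = false;
console.log(m.enabled, m.flowing);   // false true  — state is sync, flow is not
queueMicrotask(() => {
  console.log(m.enabled, m.flowing); // false false — flow has caught up
});
```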
In Firefox it's definitely async.
@Pehrsons

> is what asynchronous to setting enabled?

At the `pause` event of the `<video>` element, where the `ReadableStream` `pull()` method has an `if` condition which checks if the current `MediaStreamTrack` of kind `"video"` is `enabled`.

> Media flow turning video black/non-black or audio silent/audible?

In this case a video track captured from an `HTMLVideoElement`.
The initial motivation was to use the `mute` event to pause `MediaRecorder` and set the next `MediaStreamTrack`s (specifically derived from `captureStream()`) to be recorded, which, given your feedback, is not appropriate; neither is `ended`. The `pause` event of `<video>` is the one event from which the other events related to processing (recording) multiple `MediaStreamTrack`s at the same `<video>` element, where the `src` is changed (without using WebRTC), can be coordinated. Which is ok to be aware of.
> For a bug in Chromium, ask the Chromium folks what is the best workaround.

(Resolved the issue by setting the `ReadableStreamDefaultController` as a property of a `Map` outside of `pull()`, then executing `close()` outside of `pull()`. Am still interested in becoming aware of the exact expected control flow between async functions and DOM events (`addtrack` of the `MediaStream` returned from `captureStream()`; the `pause` event of the `HTMLVideoElement` from which the captured `MediaStreamTrack` media source is derived, where the `src` of the `HTMLVideoElement` is subsequently changed, which fires `addtrack` again), and as to why two versions of the code output different (and variable; perhaps related to garbage collection, the "event loop" (DOM events), task queues, or microtasks (`async`/`await`; `Promise`?)) results, which have asked Chromium developers about.)
@Pehrsons Interestingly, found a case where Chromium does dispatch the `mute` event though Firefox does not; in fact the canvas `MediaStreamTrack` `enabled` is `true` and `muted` is `false` even when images are no longer drawn onto the `<canvas>` element and captured using `requestFrame()`. In addition, the `ImageBitmap`s set at the `<canvas>` are not captured at all; instead of black frames being rendered, the `<video>` rendering the output is transparent, and the resulting `Blob` is not playable.
Given a `MediaStream` returned from `captureStream()` where an `addtrack` event handler is set, and within that handler `onmute` is set at the `MediaStreamTrack`, is the `mute` event expected to be dispatched when the `enabled` attribute is set to `false`?