React Structural Metadata Editor component that allows a user to interact with structural metadata by adding, editing, and deleting headers and timespans. It also presents a visual waveform representation of the work's audio file to help with navigating and identifying sections of the waveform.
There are two pieces of setup in SME involved in creating the Peaks instance:
1. SME calls the Peaks initialization in `/WaveformContainer.js`'s `useEffect` hook, which listens to the `streamMediaLoading` state variable (sketched below).
2. That state variable is updated in the HLS setup in `/forms.js` under `retrieveStreamMedia()`, which dispatches `streamMediaSuccess()` to set the stream status in SME's central state.
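A minimal sketch of the first piece, assuming illustrative names for the refs, props, and Peaks.js options (option shapes vary between Peaks.js versions, and the real component wires `streamMediaLoading` through Redux rather than plain props):

```js
// WaveformContainer.js (sketch): initialize Peaks only once the stream media
// has finished loading. `audioStreamURL`, the refs, and the option values below
// are illustrative placeholders, not the component's actual wiring.
import React, { useEffect, useRef } from 'react';
import Peaks from 'peaks.js';

function WaveformContainer({ streamMediaLoading, audioStreamURL }) {
  const zoomViewRef = useRef(null);
  const overviewRef = useRef(null);
  const mediaElementRef = useRef(null);
  const peaksRef = useRef(null);

  useEffect(() => {
    // Wait until the HLS stream has been attached and is ready.
    if (streamMediaLoading) return;

    const options = {
      containers: {
        zoomview: zoomViewRef.current,
        overview: overviewRef.current,
      },
      mediaElement: mediaElementRef.current, // must already contain an audio track
      dataUri: { json: '/waveform.json' },   // placeholder waveform data source
    };

    Peaks.init(options, (err, peaks) => {
      if (err) {
        console.error('Peaks initialization failed:', err);
        return;
      }
      peaksRef.current = peaks;
    });

    return () => {
      if (peaksRef.current) peaksRef.current.destroy();
    };
  }, [streamMediaLoading]);

  return (
    <div>
      <div ref={zoomViewRef} />
      <div ref={overviewRef} />
      <audio ref={mediaElementRef} src={audioStreamURL} />
    </div>
  );
}

export default WaveformContainer;
```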
The issue was that Peaks was not able to create its instance with the way we were handling HLS on the SME side. We pass the HTMLMediaElement to Peaks.js via its options to create a player, and Peaks needs that HTMLMediaElement to contain an audio track. SME was starting the Peaks initialization before the HLS buffer had been properly set up within the media element, so Peaks did not see an audio track in the media element; it needs that track to initialize its player API, which handles user interactions with the waveform on the page.
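To illustrate the race (this helper is hypothetical, not part of SME): before HLS has appended any media data, the media element reports no buffered ranges and a low `readyState`, which is the state Peaks was seeing at initialization time.

```js
// Hypothetical diagnostic: does the media element already hold decodable audio?
// Before the first buffer append it has no buffered ranges and readyState is low.
function mediaElementHasAudioData(mediaElement) {
  return (
    mediaElement.readyState >= HTMLMediaElement.HAVE_CURRENT_DATA &&
    mediaElement.buffered.length > 0
  );
}

// Hypothetical usage at the point where Peaks was being initialized too early:
const mediaElement = document.getElementById('waveform-media');
if (!mediaElementHasAudioData(mediaElement)) {
  // An HLS source is attached, but no audio buffer has been appended yet,
  // so a player built on this element cannot find an audio track.
  console.warn('Media element has no buffered audio yet; Peaks cannot build its player.');
}
```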
Waiting for the `BUFFER_APPENDED` event from HLS before signaling success fixes this, since by that point audio data has actually been appended to the media element.
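A sketch of the assumed shape of that fix in `retrieveStreamMedia()` (the import path and thunk wiring are illustrative): `streamMediaSuccess()` is only dispatched after hls.js fires `BUFFER_APPENDED`.

```js
// forms.js (sketch): dispatch streamMediaSuccess() only after hls.js has appended
// media data, so the useEffect above initializes Peaks against a media element
// that actually contains an audio track.
import Hls from 'hls.js';
import { streamMediaSuccess } from '../actions/forms'; // hypothetical import path

export function retrieveStreamMedia(audioURL, mediaElement) {
  return (dispatch) => {
    const hls = new Hls();
    let notified = false;

    hls.attachMedia(mediaElement);
    hls.on(Hls.Events.MEDIA_ATTACHED, () => {
      hls.loadSource(audioURL);
    });

    // Flip streamMediaLoading off only once, after the first chunk of media
    // has been appended to the source buffer.
    hls.on(Hls.Events.BUFFER_APPENDED, () => {
      if (!notified) {
        notified = true;
        dispatch(streamMediaSuccess());
      }
    });
  };
}
```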
I don't know why this came up only now, because this is working fine with the most current Avalon Manifests. It only happened when an old Avalon Manifest is used, where we were using `seeAlso` to present the waveform information.