nickdesaulniers / netfix

Let's build a Netflix
http://nickdesaulniers.github.io/netfix

Trying to understand "bufferWhenNeeded" example #8

falk-stefan opened this issue 3 years ago

falk-stefan commented 3 years ago

Hi!

I am trying to implement an audio player in my Angular application, based on this example from https://github.com/nickdesaulniers/netfix.

However, I am struggling quite a bit to get things working. I have one question in particular: fetchRange() requests a particular chunk, but what does that chunk have to look like for the MediaSource to digest it correctly?

So, if we send start and end, are we really talking about byte offsets into the particular file I am currently playing? Because in that case my server endpoint would just return a byte array containing those selected bytes, but without a file header.

Does it work like this?

Doesn't that mean we have to call fetchRange(url, 0, segmentLength, callback) first, before we start streaming, so that MediaSource can read in information such as the sample rate?


// Fetch the byte range [start, end] of `url` and pass the resulting
// ArrayBuffer to `cb`.
function fetchRange (url, start, end, cb) {
  var xhr = new XMLHttpRequest();
  xhr.open('get', url);
  xhr.responseType = 'arraybuffer';
  // Ask the server for just these bytes; it should reply 206 Partial Content.
  xhr.setRequestHeader('Range', 'bytes=' + start + '-' + end);
  xhr.onload = function () {
    console.log('fetched bytes: ', start, end);
    bytesFetched += end - start + 1; // running total kept in an outer-scope variable
    cb(xhr.response);
  };
  xhr.send();
}
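As to the header question: yes, the very first append must carry the data the decoder needs (sample rate, codec configuration, etc.), which is why the stream starts at byte 0. A sketch of the overall client-side flow, assuming the fetchRange() above plus a `mime` codec string, `segmentLength`, and `totalLength` that you would have to supply yourself (e.g. from a HEAD request), could look like this:

```javascript
// Compute the next [start, end] byte range to request, clamped to the
// file size. Returns null once the whole file has been fetched.
function nextRange(offset, segmentLength, totalLength) {
  if (offset >= totalLength) return null;
  return { start: offset, end: Math.min(offset + segmentLength, totalLength) - 1 };
}

// Browser-only sketch: stream `url` into an <audio>/<video> element via
// Media Source Extensions, one byte range at a time.
function stream(mediaElement, url, mime, segmentLength, totalLength) {
  var mediaSource = new MediaSource();
  mediaElement.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer(mime);
    var offset = 0;
    function appendNext() {
      var range = nextRange(offset, segmentLength, totalLength);
      if (!range) { mediaSource.endOfStream(); return; }
      // The first chunk (offset 0) contains the file header / init data;
      // later chunks are just the bytes that follow, in order.
      fetchRange(url, range.start, range.end, function (buf) {
        offset = range.end + 1;
        sourceBuffer.appendBuffer(buf);
      });
    }
    // Only append the next chunk once the previous append has finished.
    sourceBuffer.addEventListener('updateend', appendNext);
    appendNext();
  });
}
```

Note this only works if the container format can be decoded from sequential byte slices (e.g. fragmented MP4 or WebM); a plain MP4 with its moov atom at the end cannot simply be chopped up this way.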

Thank you for any help on this.