socketio / socket.io-p2p

MIT License

Nothing happens when sending over 10kb #12

Open PixelsCommander opened 8 years ago

PixelsCommander commented 8 years ago

I found there is a limit on the maximum data size you can emit, and it is about 10kb. Any hints on how to fix it? I can contribute if it is an easy enough issue for a WebRTC newbie.

dalesbit commented 8 years ago

Could you give me/us some code?

tomcartwrightuk commented 8 years ago

Is this when you are sending a blob of data nested in a packet e.g. {stuff: {things: <blob of data>}}?

PixelsCommander commented 8 years ago

No, I mean just a string. Sending HTML inside a blob just crashes the client.


PixelsCommander commented 8 years ago

Maximum string length workaround:

function sendData(data, name) {
  var chunkLength = 16000;
  var chunksNumber = Math.ceil(data.length / chunkLength);
  for (var i = 0; i < chunksNumber; i++) {
    var chunkStart = i * chunkLength;
    var chunkEnd = Math.min((i + 1) * chunkLength, data.length);
    var chunk = data.substring(chunkStart, chunkEnd);
    p2p.emit(name, chunk);
  }
  p2p.emit(name, 'end');
}
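
The workaround above only covers the sending side; the receiver has to buffer chunks until the sentinel arrives. A minimal sketch of the receiving side, assuming the same 'end' sentinel and that no real chunk is ever the literal string 'end' (makeReceiver is a hypothetical helper name, not part of socket.io-p2p):

```javascript
// Receiver-side reassembly sketch for the chunked sender above.
// Assumes the 'end' sentinel used by sendData, and that no data chunk
// ever equals the string 'end'.
function makeReceiver(onComplete) {
  var parts = [];
  return function onChunk(chunk) {
    if (chunk === 'end') {
      // Sentinel received: hand the reassembled string to the caller
      // and reset for the next message.
      onComplete(parts.join(''));
      parts = [];
    } else {
      parts.push(chunk);
    }
  };
}

// Hypothetical usage with socket.io-p2p:
// p2p.on(name, makeReceiver(function (data) { /* handle full message */ }));
```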
tomcartwrightuk commented 8 years ago

Looks like the sort of thing we should do. I will have a go at pushing it down into the parsing lib and do a release of that.

monteslu commented 8 years ago

This is a major problem.
I hit it when trying to send a large JSON-encoded string. The decodeString method tries to parse individual chunks instead of the entire message, which ends up killing Chrome in my case (I'm sending an object with a base64 data URL for an image I want to serialize).

monteslu commented 8 years ago

BTW, sending the exact same message through a socket.io server doesn't have this problem. There seems to be no cap on the size of a WebSocket frame.

PixelsCommander commented 8 years ago

This is a WebRTC data channel limitation, not a WebSocket one.

jimmywarting commented 8 years ago

Wouldn't socket.io-stream be a great solution to this? You can stream, pipe, pause, resume, and use a high-water mark.

aladagemre commented 8 years ago

When sending files over 10kb, it gives the following error in the console:

Uncaught TypeError: this._appendBuffer is not a function

The code in question, from https://github.com/tomcartwrightuk/socket.io-p2p-parser/blob/master/index.js#L374:

BinaryReconstructor.prototype.takeBinaryData = function(binData) {
  this.buffers.push(binData);
  if (this.buffers.length == this.reconPack.attachments) { // done with buffer list
    this.reconPack.data['data'] = this.buffers.reduce(function(prev, curr, idx, arr) {
      return this._appendBuffer(prev, curr);
    });
    binary.reconstructPacket(this.reconPack, [this.reconPack.data['data']]);
    var packet = this.reconPack;
    this.finishedReconstruction();
    return packet;
  }
  return null;
};

The problem is that "this" in the line "return this._appendBuffer(prev, curr);" refers to the Window object. Since Window has no _appendBuffer, it throws the error above. The callback passed to reduce is an ordinary function, so "this" inside it defaults to the global object (reduce, unlike map or forEach, accepts no thisArg parameter). Why is "this" bound incorrectly here?

"this" is bound correctly in the first line of the if statement and in the first line of the BinaryReconstructor constructor.

How can we fix this?
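
One possible fix (a sketch, not the parser's actual patch): capture the reconstructor in a local variable before calling reduce, or bind the callback. The _appendBuffer below is a simplified stand-in for the parser's real buffer-joining helper, which concatenates ArrayBuffers:

```javascript
// Sketch of the `this` fix. BinaryReconstructor and _appendBuffer here are
// simplified stand-ins for the socket.io-p2p-parser internals.
function BinaryReconstructor() {
  this.buffers = [];
}

// Placeholder helper: the real one joins two ArrayBuffers; strings are used
// here only to keep the sketch self-contained.
BinaryReconstructor.prototype._appendBuffer = function(a, b) {
  return a + b;
};

BinaryReconstructor.prototype.joinBuffers = function() {
  var self = this; // keep a reference: the reduce callback's own `this`
                   // would otherwise be the global object
  return this.buffers.reduce(function(prev, curr) {
    return self._appendBuffer(prev, curr);
  });
};
```

An equivalent one-line alternative is to bind the callback: `this.buffers.reduce(function(prev, curr) { return this._appendBuffer(prev, curr); }.bind(this))`.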