Flotype / now

NowJS makes it easy to build real-time web apps using JavaScript
http://www.nowjs.com
MIT License

Bug: high latency when sending uploaded chunks back to now.js client #51

Closed tommedema closed 13 years ago

tommedema commented 13 years ago

Test case: https://gist.github.com/918288

Description: Run `sudo node uploadNowTest.js`, then go to http://localhost:3000 in the browser. Select a semi-large file (e.g. 3 MB) and upload it. When the now.js broadcast is removed, this is instant (as it should be, since this is localhost with zero latency). When the message is broadcast with now.js, it takes about 5 seconds.

Expected behaviour:

  1. Since this runs on the local host, the upload and broadcast to the now.js client should be almost instant.

This is the behaviour seen when the now.js broadcast is removed.

However, with the now.js broadcast in place it takes about 5 seconds. Please run the test case.

sridatta commented 13 years ago

I ran node-inspector and used the V8 profiler to identify the bottleneck. As I suspected, all the time was spent JSON.stringify-ing the chunks (~40 kB each).

What's the fix? We have considered ditching JSON and sending pure binary over the socket. Your issue may get us to seriously implement that.
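A minimal sketch of the bottleneck described above (the chunk size matches the ~40 kB figure from the profile; the event name and byte-array encoding are illustrative assumptions, not taken from the gist):

```javascript
// Binary data has no native JSON representation, so a chunk has to be
// wrapped in a JSON-safe form (here, a plain array of byte values).
// Stringifying and re-parsing that wrapper is where the time goes.
var chunk = [];
for (var i = 0; i < 40 * 1024; i++) chunk.push(i % 256);

var start = Date.now();
var wire = JSON.stringify({ event: 'chunk', data: chunk });
var parsed = JSON.parse(wire);
var elapsed = Date.now() - start;

// The text encoding is also several times larger than the raw 40 kB.
console.log('serialized bytes:', wire.length, 'round trip ms:', elapsed);
```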

tommedema commented 13 years ago

@sridatta, thanks for the debugging. Sending pure binary over the socket would result in a huge improvement in general and would allow me to use now.js for this project as well. I hope this gets implemented.

tommedema commented 13 years ago

It looks like something like BSON would be great. I am not sure whether it has been used before with socket.io, but it sounds very promising.

dvv commented 13 years ago

msgpack?

tommedema commented 13 years ago

@dvv, looks interesting. pgriess made what seems to be a fairly complete library for it; see https://github.com/pgriess/node-msgpack

dvv commented 13 years ago

Yep, for I fear the workaround techniques used in socket.io won't allow for pure binary data traffic.

sridatta commented 13 years ago

Wow, msgpack is impressive. We'll do some tests and use it for serialization if it's as good as they claim.

dvv commented 13 years ago

Hmm. On my node.js setup JSON.parse is faster than msgpack.unpack.
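A minimal benchmark harness in the spirit of this comparison. The JSON.parse timing runs unconditionally; the msgpack timing runs only if node-msgpack (the library linked above) happens to be installed, since it is a native add-on. The payload shape and iteration count are arbitrary assumptions:

```javascript
// Sample payload: a small object with a ~1 kB string field.
var obj = { event: 'chunk', data: new Array(1000).join('x') };
var json = JSON.stringify(obj);

var t0 = Date.now();
for (var i = 0; i < 10000; i++) JSON.parse(json);
console.log('JSON.parse x10000:', Date.now() - t0, 'ms');

try {
  // Optional comparison; skipped when node-msgpack is not available.
  var msgpack = require('msgpack');
  var packed = msgpack.pack(obj);
  var t1 = Date.now();
  for (var j = 0; j < 10000; j++) msgpack.unpack(packed);
  console.log('msgpack.unpack x10000:', Date.now() - t1, 'ms');
} catch (e) {
  console.log('node-msgpack not installed; skipping msgpack timing');
}
```

As always with microbenchmarks, results vary by Node version and payload shape, which may explain why different setups rank the two differently.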

tommedema commented 13 years ago

@dvv, really? That's quite disappointing.

It's a true shame that Websockets do not allow binary transfer.

In that case I will limit my now.js usage to actions and short message broadcasts, while for binary transfers (e.g. files) I will respond to normal GET requests.

dvv commented 13 years ago

I suppose you should first benchmark your setup. Plus, I'd guess that for your static assets you'd still use a dedicated server anyway.

ericz commented 13 years ago

We've given this issue a lot of thought, and unfortunately there doesn't seem to be a simple solution.

The chunks take a long time because they have to be read into memory and then deserialized. Since the client-server communication is not a true TCP socket, we can't avoid that. Additionally, msgpack isn't going to make anything faster, because the binary has to be base64-encoded for transport anyway.
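The base64 overhead is easy to quantify (shown here with the modern `Buffer.alloc` API; the 30 kB size is an arbitrary example): 4 output characters are emitted per 3 input bytes, so whatever the serializer, the payload grows by about a third before it ever hits the text-only transport.

```javascript
var binary = Buffer.alloc(30 * 1024, 0xff); // 30 kB of raw bytes
var encoded = binary.toString('base64');
console.log('raw:', binary.length, 'base64:', encoded.length);
// base64 emits 4 chars per 3 input bytes -> ~33% larger on the wire
```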

No good solution to this issue, guys; closing. If someone comes up with any ideas, feel free to reopen.