wandenberg / nginx-push-stream-module

A pure stream http push technology for your Nginx setup. Comet made easy and really scalable.

long-polling and jsonp problem when messages are published quickly #24

Closed aquam8 closed 12 years ago

aquam8 commented 12 years ago

Hi Wandenberg,

I have been trying to use long-polling with JSONP (again ;-) ) and it works OK when messages are sent at a slow rate (1 msg every 3 seconds), but when messages are sent every second (or less) I noticed that the long-polling response contains multiple JSONP callbacks (see below).

"" long-polling response """ (as captured) PushStreamManager[0].wrapper.onmessage( '{"id":13,"channel":"test3","text":"test message","eventid":""}' );

PushStreamManager[0].wrapper.onmessage( '{"id":14,"channel":"test3","text":"test message","eventid":""}' );

This then causes 2 connections to be opened, which causes even more trouble (you end up with dozens of long-polling connections).

It would seem that the client should expect only a single JSONP callback per long-polling connection, possibly containing an array of messages like:

PushStreamManager[0].wrapper.onmessage( [ '{"id":13,"channel":"test3","text":"test message","eventid":""}' , '{"id":14,"channel":"test3","text":"test message","eventid":""}' ] );

That way the client can process (or skip) messages as needed, and it guarantees that no extra connection will be created (my recommendation).

Alternatively, the second message could be written to the next long-polling connection, but that could lead to delays.
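To sketch what I mean in code (handleLongPollingResponse is just an illustrative name, not part of pushstream.js):

```javascript
// Sketch: one JSONP callback per long-polling response, carrying an
// array of message strings instead of one callback per message.
function handleLongPollingResponse(messages) {
  // Accept either a single message string (old behaviour) or an array,
  // so older servers keep working with the same client code.
  var list = Array.isArray(messages) ? messages : [messages];
  var parsed = [];
  for (var i = 0; i < list.length; i++) {
    // Each element is a JSON string like
    // '{"id":13,"channel":"test3","text":"test message","eventid":""}'
    parsed.push(JSON.parse(list[i]));
  }
  // The caller then re-opens exactly one new long-polling connection.
  return parsed;
}
```

Accepting both shapes in the handler is one way to keep old and new servers interoperable.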

Thank you for trying to reproduce this behaviour, and for letting me know what you think of it.

Cheers / Matt

wandenberg commented 12 years ago

Hi Matt,

In fact I'm already working on that (another user reported the problem too). I'm trying to do it in a way that keeps old versions working, but I don't know if that will be possible. Maybe I will need to make a breaking release.

I'll keep you informed about this. Give me a couple of days.

Regards, Wandenberg


wandenberg commented 12 years ago

Hi Matt,

Could you help me test the branch longpolling_with_array_support? It contains the code to send old messages as an array, like the example in your last email.
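In case it helps reproduce the test, a minimal subscriber/publisher setup could look roughly like this (a sketch: the location paths and channel capture are placeholders, and the message template mirrors the one in the captured responses above; adapt to your own config):

```nginx
# Publisher endpoint: POST message bodies to /pub?id=<channel>
location /pub {
    push_stream_publisher;
    push_stream_channels_path $arg_id;
}

# Long-polling subscriber endpoint: GET /lp/<channel>
location ~ /lp/(.*) {
    push_stream_subscriber long-polling;
    push_stream_channels_path $1;
    push_stream_message_template "{\"id\":~id~,\"channel\":\"~channel~\",\"text\":\"~text~\",\"eventid\":\"~event-id~\"}";
}
```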

Regards, Wandenberg


aquam8 commented 12 years ago

Hi Wandenberg,

Sure, thank you for asking; we will test the branch and report back tomorrow.

Thanks!

Matt Lohier


sebcante commented 12 years ago

Hi Wandenberg,

I have been testing the long-polling branch.

  • It seems that if 'text' is a JSON string, pushstream.js breaks when trying to parse the element of the array. Example: PushStreamManager[0].wrapper.onmessage( ['{"id":13,"channel":"test3","text":"{"key":"value"}","eventid":""}' ] ); This breaks on parseJSON(messageText), as the string is not valid JSON.

Do you reckon we should change the nginx.conf template to something like

push_stream_message_template "[~id~,'~channel~','~text~',~tag~,'~time~']";

and do the association in the JavaScript by array index?

Thanks

-seb

wandenberg commented 12 years ago

Hi Seb,

In fact, if the text is JSON it should be escaped. Receiving messages with pushstream.js assumes that you use it to send them too; check its sendMessage function.

I will review the docs to check that they don't have any mistakes.

Using a positional array isn't a bad idea, but it seems fragile to me, since we already have a better option in JSON.

Anyway, the text should always be escaped to avoid XSS attacks, even when using an array.

Regards, Wandenberg
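To make the escaping point concrete: if the published 'text' is itself JSON, its quotes must be escaped before substitution into the ~text~ placeholder, or the rendered message is not valid JSON. A minimal sketch (escapeForTemplate is just an illustrative helper; sendMessage in pushstream.js is the supported way to publish):

```javascript
// Sketch: escape a message whose text is itself JSON before publishing,
// so embedding it in the ~text~ placeholder yields valid JSON.
function escapeForTemplate(text) {
  // JSON.stringify adds surrounding quotes and escapes inner quotes;
  // slice(1, -1) drops the outer quotes, leaving an embeddable string.
  return JSON.stringify(text).slice(1, -1);
}

var inner = '{"key":"value"}';
var escaped = escapeForTemplate(inner);

// A template like '{"id":~id~,...,"text":"~text~",...}' then renders valid JSON:
var rendered = '{"id":13,"channel":"test3","text":"' + escaped + '","eventid":""}';
var parsed = JSON.parse(rendered);        // no longer breaks in parseJSON
var roundTrip = JSON.parse(parsed.text);  // recovers the inner object
```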

wandenberg commented 12 years ago

Hi Matt,

Have you any feedback about this? Do you think the code is mature enough to be merged into master?

Regards, Wandenberg

aquam8 commented 12 years ago

Hi Wandenberg,

Well, Seb and I are working together on this, and he spent more time than I did last week trying to make it work, but as you know he's having trouble. We'll work on it together again on Monday and will be in touch tomorrow. Thanks / Matt


wandenberg commented 12 years ago

Hi Seb and Matt,

any news about your tests?

aquam8 commented 12 years ago

Hi, yes we have tested everything. We'll reply to you together tomorrow.

Thank you, Wandenberg.

Matt


sebcante commented 12 years ago

All the tests we have done are looking good.

Thanks for that, Wandenberg.

-seb