ssfrr opened this issue 7 years ago
More (possibly) useful debugging:
echo 'SuperColliderJS.interpret(1234, "SCDoc.documents", nil, false)' | sclang > sclang_output
This dumps everything SuperCollider generates to a file. That file is about 250 KB, so if any intermediate step throws out data when it hits something smaller than that, we're in trouble.
OK. I think I know what's going on. This gist (https://gist.github.com/ssfrr/8115445c9fb40f36b2f9027a3cb7e640) shows the output from the above node script, with each stdout callback displayed as:
------STDOUT-------
content
-------------------
At line 181 of the gist (https://gist.github.com/ssfrr/8115445c9fb40f36b2f9027a3cb7e640#file-gistfile1-txt-L181) you can see that the text the callback is called with doesn't start with SUPERCOLLIDERJS, because it's actually the end of the previous chunk. The callback handler for the ready state, called here (https://github.com/crucialfelix/supercolliderjs/blob/77a26c02cf1cdfe5b86699810ab46c6c315f975c/src/lang/internals/sclang-io.js#L197), assumes that there's no leftover text from the previous chunk and that the SUPERCOLLIDERJS block starts at the beginning.
One solution might be to have the regex capture anything before the first SUPERCOLLIDERJS and append it to the previous chunk.
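A rough sketch of that buffering idea (the newline-delimited framing and the handleResponseLine helper are assumptions for illustration; the actual state machine in sclang-io.js is more involved):

    // Carry incomplete output over to the next stdout callback instead of
    // assuming each chunk starts at a SUPERCOLLIDERJS block boundary.
    let leftover = '';

    function onStdout(chunk) {
      const text = leftover + chunk;   // prepend the tail of the previous chunk
      const lines = text.split('\n');
      leftover = lines.pop();          // the last line may still be incomplete
      for (const line of lines) {
        if (line.startsWith('SUPERCOLLIDERJS')) {
          handleResponseLine(line);    // hypothetical handler for a complete block
        } else if (line.length > 0) {
          console.log(line);           // ordinary sclang post output
        }
      }
    }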
Thank you for all that research. I'm on vacation right now. Maybe I will have some time to dig in.
I was looking for some kind of .flush method to call. There is one for IOStream but none for the main thread. Internally I know there is a buffer that can be flushed. sclang is a bit messy, as you can see; this is why reading the compile/startup output is tricky: it comes out in blurps.
It looks like it had more to post, so it just took a break to handle some event and then went back to posting.
A dedicated socket would certainly solve that.
Compressing the data before posting would help (but only if you had proper unicode support). You can of course fetch in smaller batches or dump to a file.
btw. I had assumed that it would be best to dump the API to a JSON file rather than query it live over the bridge. If you cannot boot sc (compile errors) then you can still get access to the autocomplete/API dump if the file is still around. That was my thinking. I don't have the slightest bit of time to work on it anyway (and supercollider is a silly language, eh?)
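A minimal sketch of that dump-to-a-file idea, assuming the sc.lang.boot() / interpret() / quit() calls from the supercolliderjs README and an arbitrary output path:

    const fs = require('fs');
    const sc = require('supercolliderjs');

    // Query the doc index once and write it to disk, so an editor can load the
    // JSON file later even when sclang refuses to boot due to compile errors.
    sc.lang.boot().then(lang => {
      return lang.interpret('SCDoc.documents')
        .then(docs => {
          fs.writeFileSync('scdoc-dump.json', JSON.stringify(docs, null, 2));
          return lang.quit();
        });
    }).catch(err => console.error('API dump failed:', err));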
I'm working on implementing better autocomplete integration into the Atom editor package, which requires getting data from SuperCollider as JSON (rather than just as strings to display to the user). I've been having some issues that I traced down to supercolliderjs seemingly having trouble with commands that generate a lot of output data.
Here's an example that reproduces the problem; it can be run from node.
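A minimal sketch of such a script, assuming the documented sc.lang.boot() / interpret() API (the original example may have differed):

    const sc = require('supercolliderjs');

    // Interpret an expression whose result is large (the SCDoc index), which is
    // where the truncated/garbled responses show up.
    sc.lang.boot().then(lang => {
      return lang.interpret('SCDoc.documents')
        .then(result => {
          console.log('received', JSON.stringify(result).length, 'characters of JSON');
          return lang.quit();
        });
    }).catch(err => {
      console.error('interpret failed:', err);
    });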
Sometimes this succeeds, but more often than not I get a parse error.
I dumped the actual stdout JS variable right before the attempted parsing, and the output is here: https://gist.github.com/ssfrr/92c9f725185b68db76d960a5110caaa6. Pasting it into a JSON validator like jslint.com shows where the parsing is breaking down, and it looks like some random chunk got lost in the middle.
Is there a fixed-size buffer somewhere that is overflowing?
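A minimal sketch of how such a dump-before-parse can be wired up (the wrapper name and output file are made up; this is not supercolliderjs's own code):

    const fs = require('fs');

    // Wrap the parse step so the raw stdout text is saved whenever JSON.parse
    // fails, instead of being lost along with the error.
    function parseInterpretResponse(rawStdout) {
      try {
        return JSON.parse(rawStdout);
      } catch (err) {
        // Dump the exact bytes that failed, so they can be pasted into a JSON
        // validator as described above.
        fs.writeFileSync('failed-parse-dump.json', rawStdout);
        throw err;
      }
    }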