Thanks for reporting this @asztalosdani! @nasefbasdf Any idea what could be going on?
The extra newline is not picked up by the parser on the server side as part of the message, so it gets printed like anything else.
https://github.com/pyblish/pyblish-qml/blob/39cd755da6c10206059aa91aae38012868d39c42/pyblish_qml/ipc/server.py#L229
An easy fix would be to just not transfer lines that contain a single newline to the host's stdout.
If that's not proper enough, then the read loop should buffer every line that is just a single newline. On the arrival of the next line, it can be determined whether the buffered newline was the preamble of a message or a legitimate newline that should be printed to the host's stdout.
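A minimal sketch of that buffering approach, assuming the server reads the child's stdout one line at a time; `forward_to_host` and `handle_message` are hypothetical stand-ins for whatever prints to the host and whatever processes a parsed message:

```python
import json

def read_loop(lines, forward_to_host, handle_message):
    """Buffer a lone newline until the next line tells us whether it was
    the preamble of a message or genuine output from the GUI."""
    pending_blank = False
    for line in lines:                       # assumed to yield one line at a time
        stripped = line.rstrip("\n")
        if stripped == "":
            if pending_blank:
                forward_to_host("\n")        # two blanks in a row: the first was real output
            pending_blank = True
            continue
        try:
            message = json.loads(stripped)
        except ValueError:
            if pending_blank:
                forward_to_host("\n")        # the buffered blank was real output after all
            forward_to_host(line)            # plain text, pass it through
        else:
            handle_message(message)          # the buffered blank was just the preamble; drop it
        pending_blank = False
```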
> An easy fix would be to just not transfer lines that contain a single newline to the host's stdout.
What if a plugin wants to print some newlines? Would they get swallowed by this, or do they go through a separate process? If it affects this, then I would go for the second solution.
Having a closer look at this; I'm not happy with the newlines around results, I think they hide a more fundamental problem that'll bite us again. I was considering making `_dispatch` asynchronous, something whereby a callback is called on a result. But as `_dispatch()` is used all over the place, that would warrant a significant refactoring. It'd be proper, as what we've got is effectively that; a function being called from multiple threads.
Instead of that, I wanted to narrow down what those "multiple threads" were, and found that there is only one thread aside from main involved in calling `_dispatch`, and that's `Controller.iterator`. That function makes IPC calls in parallel, and I think that's where the problem is.
However, I'm having trouble fixing this without actually doing that refactoring. The application is structured to make synchronous calls to `sys.stdout`; the more recent asynchronous functionality (primarily `emit()`) was tacked on later. Will need to think about this one... I think simply excluding `\n` from results won't do, and the current `\n message \n` doesn't actually solve the problem, but rather turns a deadlock into a mangled response (i.e. an unreliable publish, where a validator can fail but one extractor still makes it through).
> the current `\n message \n` doesn't actually solve the problem, but rather turns a deadlock into a mangled response
What do you mean by that? The message was definitely mangled when its body got appended to another line printed by the application. Can you give an example of a mangled message when it is surrounded by newline characters?
As I understand it, this bug is cosmetic only (i.e. printing superfluous newlines). I was going to look at fixing it on Monday.
Sure, here's what I think is happening.
- `iterator()` kicks off, sending messages over `sys.stdout` whilst the GUI remains interactive
- `iterator()` is waiting for a reply.
- Whenever a request is sent, the `_dispatch` function is locked, waiting for a reply to be put into the `response` queue.
- If, while it's waiting, another quicker request were to get responded to, the response could go to the main thread, or it could go to the `iterator` thread, and I don't think there's any way of telling? They both listen on the same queue, and the queue is being filled up asynchronously.
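A stripped-down illustration of that race, assuming a single shared `queue.Queue`; this is a sketch of the problem, not the actual pyblish-qml code:

```python
import queue

responses = queue.Queue()   # one queue, shared by the main thread and the iterator thread

def _dispatch(request, send):
    send(request)           # write the request to the GUI over stdout
    return responses.get()  # block until *a* response arrives -- not necessarily ours

def reader_thread(stdin_lines):
    # A single background reader pushes every incoming response onto the queue.
    # Nothing ties a response to the request that triggered it, so whichever
    # caller happens to be blocked in get() at that moment takes it.
    for line in stdin_lines:
        responses.put(line)
```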
I think what should happen is for each message to be sent across asynchronously. For example, each request could have an ID, e.g. `1`, to which a response with ID `1` would get matched to a callback and run, e.g. `callbacks[id](result)`.
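Something along those lines could look like the sketch below; the `_callbacks` registry, the ID counter and `send` are hypothetical names for illustration, not existing pyblish-qml API:

```python
import itertools
import json
import threading

_ids = itertools.count(1)
_callbacks = {}              # request id -> callback
_lock = threading.Lock()

def send_async(payload, callback, send):
    """Tag the outgoing request with an id and remember which callback wants the result."""
    request_id = next(_ids)
    with _lock:
        _callbacks[request_id] = callback
    send(json.dumps({"id": request_id, "payload": payload}))

def on_response(line):
    """Run from the reader thread: route the response back to its callback."""
    response = json.loads(line)
    with _lock:
        callback = _callbacks.pop(response["id"])
    callback(response["result"])
```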
Preferably, there would be an easier way and you are still very much welcome to have a look at it on Monday. At the very least we've definitely identified a race condition that I'm surprised hasn't been more of a problem.
Hi,
the fix in #325 introduced a funky side-effect. A ton of empty lines get printed to stdout (the output window of Maya). It has to be something with the two `\n` characters not being parsed, so they get printed to stdout instead. You can check it by typing something into the comment field of the GUI. For every key pressed, a newline (or maybe several?) is printed. @nasefbasdf, @mottosso can you take a look at it? :)