Closed: marcus-oscarsson closed 3 months ago
The imports in the function definitions are fine. I think the issue Olof had here is that, as it's supposed to work standalone, he had only this file and `queue_model_enumerables`, so the `rauml` import fails. We can leave the other imports as they are for the moment, but I don't mind following up with another PR if you think it's worth doing.
Maybe I could fix this if I understood the problem a bit better. AFAIAC there are certain actions that have to happen at certain points, when the functions that contain the imports are called. One could probably move the functionality to the calling functions in GphlWorkflow instead, but the same actions would still have to happen at the same time. Or, if that was impossible, it would be a major problem to reorganise. The yaml import, specifically, is only for a specific mock test scenario, so that would be dealt with more easily - if I was more sure exactly what the constraints are.
The original idea was that `queue_model_objects` is supposed to be importable as it is and usable by any software that wants to add queue models to the queue. In the first version of XMLRPCServer the file queue_model_objects.py was serialized and sent as part of the initialization process. The code still exists; see `queue_get_model_code`.
The BES Workflows are still using this idea, but we have been discussing moving away from this approach in favor of a different kind of API. So this file is used by the workflow without the rest of MXCuBE or any specific dependency, hence the issue with, for instance, `rauml`. Anything that is specific to your workflows and is within a `def` does not pose a problem.
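To illustrate the point about imports inside a `def`: a function-local import keeps the module importable with only the standard library, and the optional dependency is only required when the function is actually called. This is a minimal sketch of the pattern, not actual mxcubecore code; the function name and the use of `json` (standing in for a third-party parser such as a YAML library) are invented for the example.

```python
def parse_mock_settings(text):
    """Parse settings for a mock test scenario.

    The import is deferred to call time, so merely importing this
    module never triggers it. Here ``json`` (stdlib) stands in for a
    third-party dependency that may be absent in a standalone setup.
    """
    import json  # deferred import: only needed when this runs

    return json.loads(text)
```

A caller that never invokes the function pays no import cost, which is why such imports "do not pose a problem" for standalone use.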
It was decided long ago that `queue_model_objects` should be importable as it is, contain as little logic as possible, and contain classes only carrying data. We have deviated from the latter, but the module still needs to be importable without the rest of mxcubecore or non-standard-library dependencies.
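The "classes only carrying data" guideline can be sketched with a standard-library `dataclass`. The class name and fields below are hypothetical, chosen only to show the shape of such a data-only node; they are not actual `queue_model_objects` classes.

```python
from dataclasses import dataclass, field


@dataclass
class QueueEntryData:
    """Hypothetical queue-model node: data only, no behavior.

    Uses only the standard library, so the module defining it stays
    importable without the rest of mxcubecore.
    """

    name: str
    enabled: bool = True
    children: list = field(default_factory=list)
```

Keeping logic out of such classes means any external software can construct and inspect them without pulling in beamline-specific dependencies.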