Open · MattGson opened this issue 1 week ago
Hi @MattGson thanks for raising the question.
In what way do you find the API to be limiting due to async support?
It exists primarily because we want to provide high-performance bindings for e.g. Node.js, which can asynchronously load data from a DB/S3 or similar (through async, non-blocking functions). Also, it would be harder to add async support later if we found a use case for it beyond the one mentioned above.
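To illustrate the rationale, a loader-driven design can be sketched as below. Note this is a hypothetical illustration, not the library's actual API: the `DecisionLoader` trait, `MemoryLoader`, and the polling harness are all made up for the example; the point is that fetching a decision document may require real async I/O (a DB query, an S3 GET, or a JS callback in Node.js), so the loading interface is async end to end.

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// A boxed future, the usual shape for async trait methods on stable Rust.
type BoxFuture<'a, T> = Pin<Box<dyn Future<Output = T> + Send + 'a>>;

// Hypothetical loader interface: fetching a decision document may
// require real async I/O (database, S3, or a JS callback in Node.js).
trait DecisionLoader {
    fn load(&self, key: &str) -> BoxFuture<'_, Result<String, String>>;
}

// An in-memory loader satisfies the same interface without doing any
// I/O, so its future resolves on the very first poll.
struct MemoryLoader;

impl DecisionLoader for MemoryLoader {
    fn load(&self, key: &str) -> BoxFuture<'_, Result<String, String>> {
        let doc = format!("{{\"decision\": \"{key}\"}}");
        Box::pin(async move { Ok(doc) })
    }
}

// A no-op waker: enough to poll a future that is already ready.
fn noop_waker() -> Waker {
    fn clone(_: *const ()) -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) }
}

fn main() {
    let loader = MemoryLoader;
    let mut fut = loader.load("pricing-rules");
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);

    // Because MemoryLoader never suspends, a single poll completes it.
    match fut.as_mut().poll(&mut cx) {
        Poll::Ready(Ok(doc)) => assert_eq!(doc, "{\"decision\": \"pricing-rules\"}"),
        other => panic!("unexpected poll result: {other:?}"),
    }
}
```

The takeaway is that a synchronous loader fits the async interface trivially, while the reverse (a sync interface wrapping async I/O) would force blocking inside the engine.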
You should be able to synchronously evaluate a DecisionGraph using `block_on` / `block_in_place`.
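For reference, that pattern looks roughly like the sketch below. The minimal `block_on` here is std-only and written out purely for illustration; a real application would use `tokio::runtime::Runtime::block_on` (or `tokio::task::block_in_place` when already inside a runtime), and the `evaluate` function is a hypothetical stand-in for the engine's async evaluate call:

```rust
use std::future::Future;
use std::pin::pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};
use std::thread::{self, Thread};

// Waker that unparks the blocked thread when the future can progress.
struct ThreadWaker(Thread);

impl Wake for ThreadWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark();
    }
}

// Minimal block_on: drives a future to completion on the current
// thread, parking between polls instead of spinning.
fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = pin!(fut);
    let waker = Waker::from(Arc::new(ThreadWaker(thread::current())));
    let mut cx = Context::from_waker(&waker);
    loop {
        match fut.as_mut().poll(&mut cx) {
            Poll::Ready(out) => return out,
            Poll::Pending => thread::park(),
        }
    }
}

// Hypothetical stand-in for an async rule-evaluation call.
async fn evaluate(input: i64) -> i64 {
    input * 2
}

fn main() {
    // Synchronous caller: no async runtime needed at the call site.
    let result = block_on(evaluate(21));
    assert_eq!(result, 42);
    println!("{result}");
}
```

The cost of this workaround is that the caller still pays for future machinery (and, with tokio, a runtime dependency) even when every loader involved is synchronous.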
> In what way do you find the API to be limiting due to async support?
Mainly because it requires an async runtime to evaluate any rules.
This is understandable for a library that is async by nature, such as a database client, but evaluating a business rule is ideally a pure operation.
To add more context, here are two examples of async touchpoints:
> Is there a specific reason that's limiting you from using async?
I've read the docs and the code, and I cannot figure out why the evaluate API must be async. This means any code that wants to use it must also be async, which is extremely limiting.
As far as I can see, async is only used to lazily load rules via async I/O, yet the two built-in loaders are synchronous anyway. Could synchronous evaluation be supported when lazy loading of rules is not required?