microsoft / ELL

Embedded Learning Library
https://microsoft.github.io/ELL

Dynamic Memory Allocation and Exceptions #143

Closed. sdudeck closed this issue 6 years ago.

sdudeck commented 6 years ago

Hello, we are currently looking at different machine learning frameworks suitable for embedded systems. One of the topics that came up in our discussions was memory allocation (static or dynamic?) and exception handling, since some of my colleagues have had bad experiences with dynamic memory allocation and exceptions on these limited devices. As far as we can tell from the code and the generated model binaries, only static memory allocation is used in the model, and no exceptions are thrown within the model binary. Is this correct? Is it a deliberate design decision not to use dynamic memory allocation throughout the inference code, so the chances are good it will stay that way? The same question applies to exception handling.

Thank you very much, Sven

lovettchris commented 6 years ago

Correct, currently compile generates code using statically allocated buffers, so there are no calls to malloc/free when you call the predict function. You can actually run our "compile" tool with "--output ir" or "assembler" and look at the code we generate. We don't throw exceptions either, since the generated code is also callable from "C" programs. I don't know if we promise never to change that, but you can see that our CompilableNode API defines a Compile method that takes an IRFunctionEmitter, and that function emitter is designed to steer Node implementors toward simple constructs: for loops, if statements, vector math, etc. However, there is a "call" function, so a node could call out to client code, and that client code could do whatever it wants (which is true in the "callback" model case).
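For illustration, here is a minimal sketch of what a plain-C caller of a compiled model could look like. It assumes a hypothetical generated header "model.h" declaring a model_Predict function that takes input and output float arrays, plus assumed input/output sizes; the actual generated names, header, and signature depend on the model and the ELL version. The point is only that the caller, like the generated code, can run the whole inference path with statically allocated buffers and no exception handling:

```c
/* Hypothetical plain-C caller for an ELL-compiled model.
 * Assumes a generated header "model.h" declaring:
 *     void model_Predict(const float* input, float* output);
 * The real names, header, and sizes depend on the model and ELL version. */
#include <stdio.h>
#include "model.h"

#define INPUT_SIZE  (64 * 64 * 3)   /* assumed model input size  */
#define OUTPUT_SIZE 10              /* assumed model output size */

/* Buffers are statically allocated on the caller side too, so the whole
 * inference path runs without malloc/free and without C++ exceptions. */
static float input[INPUT_SIZE];
static float output[OUTPUT_SIZE];

int main(void)
{
    /* Fill 'input' with sensor or image data here. */
    model_Predict(input, output);

    for (int i = 0; i < OUTPUT_SIZE; ++i)
    {
        printf("class %d: %f\n", i, output[i]);
    }
    return 0;
}
```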

sdudeck commented 6 years ago

Thanks a lot - that helps our discussions.