Current situation:
When creating a model object, the model file's ByteBuffer must not be collected by the GC. However, users are unlikely to guarantee this themselves (even the provided Java example doesn't do it...), which leads to a segmentation fault: the model's ByteBuffer is referred to during inference but has already been collected by the GC.
Expected behaviour:
It should be possible to allocate a model object safely without having to manage the lifetime of the model's ByteBuffer. At the very least, the documentation should carry an explicit warning about this.
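As a workaround until the API manages the buffer's lifetime itself, the model object can hold a strong reference to the ByteBuffer it was created from. The sketch below is a minimal illustration of that pattern; `SafeModel` and its methods are hypothetical names, not part of the real API, and the native calls are represented by placeholder comments.

```java
import java.nio.ByteBuffer;

// Hypothetical wrapper illustrating the pattern: keep a strong
// reference to the model's ByteBuffer for as long as the model
// object created from it is alive.
final class SafeModel implements AutoCloseable {
    // Holding this field keeps the buffer reachable, so the GC
    // cannot reclaim it while native code may still read from
    // the underlying memory during inference.
    private ByteBuffer modelBuffer;

    SafeModel(ByteBuffer buffer) {
        this.modelBuffer = buffer;
        // In the real API, the native model would be created here
        // from the buffer's address, e.g. nativeCreate(buffer).
    }

    float run(float input) {
        if (modelBuffer == null) {
            throw new IllegalStateException("model is closed");
        }
        // In the real API, inference would dereference the
        // buffer's memory here, e.g. nativeRun(...).
        return input; // placeholder for the inference result
    }

    @Override
    public void close() {
        // Only once the model is closed is it safe to let the
        // buffer become unreachable and be collected.
        modelBuffer = null;
    }
}
```

With this pattern, a caller can drop their own reference to the ByteBuffer immediately after constructing the model, because the model itself keeps the buffer alive until `close()` is called.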