Open gasabr opened 2 years ago
Please add the attachment, @gasabr. Could you try to repeat the experiment with the latest version of the onnx dependency, here:
implementation("org.jetbrains.kotlinx:kotlin-deeplearning-api:0.3.0")
implementation("org.jetbrains.kotlinx:kotlin-deeplearning-onnx:0.3.0")
Thanks for the response! I tried 0.3.0 and got the same exception. I also tried the Java onnxruntime directly and was able to run inference on the model with the following code:
import ai.onnxruntime.OnnxTensor
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession
import java.nio.FloatBuffer
import kotlin.random.Random

val env = OrtEnvironment.getEnvironment()
val session = env.createSession("/tmp/twoTierModel.onnx", OrtSession.SessionOptions())
// The model expects a single [1, 27] float tensor named "features"
val features = (1..27).map { Random.nextFloat() }.toFloatArray()
val buf = FloatBuffer.wrap(features)
val t1 = OnnxTensor.createTensor(env, buf, longArrayOf(1, 27))
val inputs = mapOf<String, OnnxTensor>("features" to t1)
// The "probabilities" output is a sequence of maps, not a tensor
val result = session.run(inputs, setOf("probabilities"))[0].value as ArrayList<HashMap<Long, Float>>
println(result)
Thanks for the example, @gasabr. I hope to fix this in the 0.4 release to cover more cases, but at the moment I agree that the Java onnxruntime is the best choice for you.
I want to discuss a couple of things.
ONNX supports multiple output types, such as tensors, sequences of numbers (or strings), sequences of maps, and maps. At first glance, it seems possible to decode every output type into an appropriate Kotlin data structure using the OnnxModel metadata. But I have some doubts.
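To make the idea concrete, here is a minimal sketch of such metadata-driven decoding. None of these names are KotlinDL or onnxruntime API; the sealed class and the `decode` function are hypothetical, and the dispatch is done on the runtime type of the already-extracted output value rather than on real ONNX metadata:

```kotlin
// Hypothetical Kotlin-side representation of the ONNX output kinds
sealed class DecodedOutput {
    data class Tensor(val data: FloatArray) : DecodedOutput()
    data class Sequence(val items: List<Any?>) : DecodedOutput()
    data class Mapping(val entries: Map<Any?, Any?>) : DecodedOutput()
}

// Dispatch a raw output value (as onnxruntime's value extraction might
// hand it back) to an appropriate Kotlin data structure
fun decode(raw: Any?): DecodedOutput = when (raw) {
    is FloatArray -> DecodedOutput.Tensor(raw)
    is List<*> -> DecodedOutput.Sequence(raw)
    is Map<*, *> -> DecodedOutput.Mapping(raw)
    else -> error("Unsupported ONNX output type")
}

fun main() {
    // A plain tensor output
    println(decode(floatArrayOf(0.1f, 0.9f)))
    // A sequence of maps, like the LightGBM "probabilities" output above
    println(decode(listOf(mapOf(0L to 0.1f, 1L to 0.9f))))
}
```

A real implementation would inspect the session's `OutputInfo` instead of the value's runtime class, but the shape of the dispatch would be the same.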
Another thing I want to discuss: it seems reasonable to refactor OnnxInferenceModel's predict
and predictSoftly
methods out into a more specific implementation class (like ClassificationOnnxInferenceModel).
It may be handy if OnnxInferenceModel worked with arbitrary tensors, while classes targeted at specific DL tasks (such as detection or segmentation) used OnnxInferenceModel internally and formatted the output for the specific task.
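A hypothetical sketch of that split (these interfaces and classes are a proposal, not existing KotlinDL API): a generic tensor-level model, and a task-specific classifier that wraps it and formats the raw output.

```kotlin
// Generic tensor-level model: knows nothing about the task,
// just returns the raw ONNX output value
interface GenericOnnxModel {
    fun run(input: FloatArray): Any
}

// Task-specific wrapper: interprets the raw output as class probabilities
class ClassificationOnnxInferenceModel(private val inner: GenericOnnxModel) {
    @Suppress("UNCHECKED_CAST")
    fun predictSoftly(input: FloatArray): Map<Long, Float> =
        // e.g. a LightGBM classifier returns a sequence of maps; take the first batch element
        (inner.run(input) as List<Map<Long, Float>>).first()
}

fun main() {
    // Stub standing in for a real ONNX session
    val stub = object : GenericOnnxModel {
        override fun run(input: FloatArray): Any =
            listOf(mapOf(0L to 0.25f, 1L to 0.75f))
    }
    val classifier = ClassificationOnnxInferenceModel(stub)
    println(classifier.predictSoftly(FloatArray(27)))  // {0=0.25, 1=0.75}
}
```

Detection or segmentation wrappers would follow the same pattern, reading tensors instead of sequences of maps.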
I'm trying to run inference on an ONNX model created from a LightGBM model via Kotlin DL, and with every method (I tried the Raw ones too) I'm getting
class [J cannot be cast to class [[F ([J and [[F are in module java.base of loader 'bootstrap')
or, in the Raw methods, SequenceInfo cannot be cast to class ai.onnxruntime.TensorInfo
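For reference, the first message is the JVM's way of saying a `long[]` (`[J` in JVM array notation) was cast to a `float[][]` (`[[F`). A minimal self-contained reproduction of that exception:

```kotlin
fun main() {
    // The runtime hands the output back as Any; here it is actually a long[] ("[J")
    val rawOutput: Any = longArrayOf(1L, 2L, 3L)
    try {
        // The cast the library effectively performs, expecting float[][] ("[[F")
        val asFloats = rawOutput as Array<FloatArray>
        println(asFloats.size)
    } catch (e: ClassCastException) {
        println(e.message)  // "class [J cannot be cast to class [[F ..."
    }
}
```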
Env:
Code: twoTierModel.txt
The error is in line 124 of the file OnnxInferenceModel.kt and it's caused by the attempt to cast a List to an Array. I'm not sure if the model should always return a 3D tensor or if the lib should check the types.
Rename the attachment to twoTierModel.onnx to try the test on your machine.