huggingface / tflite-android-transformers

DistilBERT / GPT-2 for on-device inference thanks to TensorFlow Lite with Android demo apps
Apache License 2.0

GPT2 error : ByteBuffer is not a valid flatbuffer model #15

Open jason9693 opened 3 years ago

jason9693 commented 3 years ago

When I cloned and ran your GPT2 example code, a runtime error occurred while trying to load the model.

How can I run this code?

==Error MSG==

```
Process: co.huggingface.android_transformers.gpt2, PID: 16252
java.lang.IllegalArgumentException: ByteBuffer is not a valid flatbuffer model
    at org.tensorflow.lite.NativeInterpreterWrapper.createModelWithBuffer(Native Method)
    at org.tensorflow.lite.NativeInterpreterWrapper.<init>(NativeInterpreterWrapper.java:60)
    at org.tensorflow.lite.Interpreter.<init>(Interpreter.java:224)
    at co.huggingface.android_transformers.gpt2.ml.GPT2Client$loadModel$2.invokeSuspend(GPT2Client.kt:138)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlinx.coroutines.DispatchedTask.run(Dispatched.kt:241)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:594)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.access$runSafely(CoroutineScheduler.kt:60)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:740)
```
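For context, the exception is thrown by the `Interpreter` constructor when the `ByteBuffer` it receives does not contain a valid TFLite FlatBuffer. Below is a minimal sketch of the usual assets-based loading path on Android, not the actual `GPT2Client.kt` code; the asset name `model.tflite` is a hypothetical placeholder. If the buffer ends up holding something other than the converted model (for example an incomplete download or an asset that was compressed inside the APK), this is the error that results.

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Sketch only: maps a TFLite model stored in the app's assets into memory.
// "model.tflite" is an assumed file name, not necessarily what the demo uses.
fun loadModelFile(context: Context, assetName: String = "model.tflite"): MappedByteBuffer {
    context.assets.openFd(assetName).use { fd ->
        FileInputStream(fd.fileDescriptor).channel.use { channel ->
            // Map only the region of the APK that holds the (uncompressed) model.
            // If this region does not contain a real TFLite FlatBuffer, the
            // Interpreter constructor throws IllegalArgumentException like above.
            return channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
        }
    }
}

fun createInterpreter(context: Context): Interpreter =
    Interpreter(loadModelFile(context))
```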