Just change `compile` to `implementation`, per that GitHub issue in the TFLite repo, to solve the issue you posted above. For me it worked like a charm. But when I tried to run inference on my model with some images, this error came up and the whole application crashed:

`Caused by: java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32).`

So if your model is image-based, check this answer; the error is related to how the model was quantized when you exported it. The exported model expects UINT8 input, while the Java code is feeding it float arrays. Exporting with float16 quantization keeps the input and output tensors as FLOAT32 (only the weights are stored as fp16). Just do this:
```python
from tflite_model_maker.config import QuantizationConfig

# Export with float16 quantization: weights shrink to fp16,
# but the input/output tensors stay FLOAT32.
config = QuantizationConfig.for_float16()
model.export(export_dir='.',
             tflite_filename='model_fp16.tflite',
             quantization_config=config)
```
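If you want to confirm the fix before putting the model back into the app, here is a minimal sketch (assuming plain TensorFlow is installed, and `model_fp16.tflite` is the file exported above) that prints the input tensor's dtype, which should now be float32 instead of uint8:

```python
import tensorflow as tf

# Load the exported model and inspect its input tensor.
interpreter = tf.lite.Interpreter(model_path='model_fp16.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
print(input_details['dtype'])  # expect <class 'numpy.float32'>, not uint8
print(input_details['shape'])  # e.g. [1, 224, 224, 3]
```

If it still prints uint8, the UINT8-vs-`[[F` crash on the Java side will come back, so re-check which quantization config you passed to `export`.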