Hello,
Is it possible to port a trained GNN (graph neural network) model to TensorFlow Lite (TFLite) for use in Java/Android applications?
If so, how should the input and output buffers be declared? With a regular TFLite model, we typically use a FloatBuffer or IntBuffer in Java/Android.
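For reference, this is the kind of standard pattern I mean for a regular (non-GNN) model; it is only a minimal sketch, and the model path and tensor shapes are placeholders:

```java
import org.tensorflow.lite.Interpreter;

import java.io.File;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class TfliteBufferExample {
    public static void main(String[] args) {
        // Load the .tflite model (path is a placeholder)
        try (Interpreter interpreter = new Interpreter(new File("model.tflite"))) {
            // Assumed shapes for illustration: one [1, 4] float32 input, one [1, 2] float32 output
            ByteBuffer input = ByteBuffer.allocateDirect(4 * 4).order(ByteOrder.nativeOrder());
            FloatBuffer inputFloats = input.asFloatBuffer();
            inputFloats.put(new float[] {0.1f, 0.2f, 0.3f, 0.4f});

            ByteBuffer output = ByteBuffer.allocateDirect(2 * 4).order(ByteOrder.nativeOrder());

            // Run inference with plain direct buffers as input and output
            interpreter.run(input, output);

            // Read the result back as floats
            FloatBuffer outputFloats = output.asFloatBuffer();
            System.out.println("output[0] = " + outputFloats.get(0));
        }
    }
}
```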
Could you provide an example of how this would look for a GNN model?
Thank you.