tensorflow / flutter-tflite

Apache License 2.0

Using GPU for tflite #205

Open NQHuy1905 opened 5 months ago

NQHuy1905 commented 5 months ago

Hi, I want to use the mobile GPU to run inference on my model.

I followed the documentation and used this code to load the model:

final options = InterpreterOptions();
if (Platform.isAndroid) {
  options.addDelegate(GpuDelegateV2());
}
final interpreter =
    await tfl.Interpreter.fromAsset('assets/MaskPose.tflite', options: options);

but it logs errors like this:

E/tflite (27005): PADV2: Operation is not supported.
E/tflite (27005): 85 operations will run on the GPU, and the remaining 5 operations will run on the CPU.
I/tflite (27005): Replacing 85 node(s) with delegate (TfLiteGpuDelegateV2) node, yielding 2 partitions for the whole graph.
E/tflite (27005): Can not open OpenCL library on this device - undefined symbol: clGetCommandBufferInfoKHR
E/tflite (27005): Falling back to OpenGL
E/tflite (27005): TfLiteGpuDelegate Init: No shader implementation for transpose
I/tflite (27005): Created 0 GPU delegate kernels.
E/tflite (27005): TfLiteGpuDelegate Prepare: delegate is not initialized
E/tflite (27005): Node number 90 (TfLiteGpuDelegateV2) failed to prepare.
E/tflite (27005): Restored original execution plan after delegate application failure.
I/flutter (27005): Invalid argument(s): Unable to create interpreter

So I changed GpuDelegateV2() to XNNPackDelegate() and it works.
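For reference, the working variant differs from the snippet above only in the delegate used. A sketch (based on the tflite_flutter API shown earlier; note that XNNPACK is a CPU-optimized inference library, so this delegate accelerates inference on the CPU rather than the GPU):

```dart
// Sketch: same loading code as above, but with the XNNPACK delegate.
// XNNPackDelegate speeds up floating-point ops on the CPU via the
// XNNPACK library; it does not offload the model to the GPU.
final options = InterpreterOptions()
  ..addDelegate(XNNPackDelegate());
final interpreter = await tfl.Interpreter.fromAsset(
  'assets/MaskPose.tflite',
  options: options,
);
```

Because it runs on the CPU, XNNPACK supports the full set of CPU kernels and does not hit the per-op shader limitations that made the GPU delegate fail above.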

What is the difference between these two delegates, and does XNNPackDelegate() use the GPU to run the model?

einsitang commented 3 weeks ago

Same issue here.