abdelaziz-mahdy / pytorch_lite

Flutter package to help run PyTorch Lite models for classification, YOLOv5, and YOLOv8.
MIT License

Build failed (Version 3.0.4) #28

Closed MichaelRinger closed 1 year ago

MichaelRinger commented 1 year ago

Hello,

I am getting this error after adding the package and building:

[screenshot of the build error]

Thanks for the help in advance ;)

abdelaziz-mahdy commented 1 year ago

Are you using another package alongside this one? They may be conflicting during the build process.

Also, try flutter clean and build again.

MichaelRinger commented 1 year ago

@zezo357 No, there were no package conflicts during the build. I also tried installing the package in a new Flutter project, and the same error occurred.

abdelaziz-mahdy commented 1 year ago

Can you show me your pubspec?

Also, the GitHub Actions build works correctly, so I really need a repo to check why it fails in your case.
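For reference, a minimal pubspec.yaml that pulls in the package might look like this (the version and asset paths shown here are assumptions — pin whichever release you are actually testing against):

```yaml
name: pytorch_lite_demo
description: Minimal project to reproduce the build issue.

environment:
  sdk: ">=2.17.0 <4.0.0"

dependencies:
  flutter:
    sdk: flutter
  pytorch_lite: ^4.0.0   # assumed version; match the release you use

flutter:
  assets:
    - assets/models/     # assumed location of the .torchscript model
    - assets/labels.txt  # assumed label file
```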

abdelaziz-mahdy commented 1 year ago

Also did you do flutter clean?

MichaelRinger commented 1 year ago

Flutter clean didn't help. It might be a problem with my Android SDK, but reinstalling the SDK didn't help either. I am using CMake 3.18.1, and I got this error:

[screenshot of the CMake error]

abdelaziz-mahdy commented 1 year ago

Check this issue https://github.com/zezo357/pytorch_lite/issues/24

MichaelRinger commented 1 year ago

@zezo357 My fault, I hadn't seen that solved issue. It seems to work now, although with an inference time of 1.0 s it is quite slow for a YOLOv8s model.

abdelaziz-mahdy commented 1 year ago

> @zezo357 My fault, I hadn't seen that solved issue. It seems to work now, although with an inference time of 1.0 s it is quite slow for a YOLOv8s model.

Share your inference logs, I want to know the limitations in performance in your case
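If it helps to isolate where the time goes, here is a rough sketch of timing the detection call from the Dart side (the load/predict calls follow the package README; the model path, input size, class count, and thresholds are assumptions):

```dart
import 'dart:io';

import 'package:pytorch_lite/pytorch_lite.dart';

Future<void> timeInference(String imagePath) async {
  // Assumed asset paths and YOLOv8 input shape (640x640, 80 COCO classes).
  final model = await PytorchLite.loadObjectDetectionModel(
    "assets/models/yolov8s.torchscript", 80, 640, 640,
    labelPath: "assets/labels.txt",
    objectDetectionModelType: ObjectDetectionModelType.yolov8,
  );

  final bytes = await File(imagePath).readAsBytes();

  final sw = Stopwatch()..start();
  final results = await model.getImagePrediction(
    bytes,
    minimumScore: 0.3,
    iOUThreshold: 0.5,
  );
  sw.stop();

  // The package logs its own inference/decode times; this just
  // cross-checks the end-to-end call as seen by the app.
  print("inference+decode: ${sw.elapsedMilliseconds} ms, "
      "${results.length} detections");
}
```

The package-side logs split inference from decode, so comparing them against this end-to-end number shows how much overhead is outside the model itself.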

MichaelRinger commented 1 year ago

@zezo357 Are these the correct logs?

[screenshot of the inference logs]

abdelaziz-mahdy commented 1 year ago

Yes, it looks like the model inference is slow on your device, while the decode is not slow.

Which is weird, since it was the other way around in another issue posted here.

abdelaziz-mahdy commented 1 year ago

Version 4.0.0 is released (which decreases decoding time).

But in your case, I think it is limited by the mobile CPU or the model size.