Closed HashedViking closed 3 years ago
Found a workaround: set `batch_size` to 1 in

```
predictions = animate(source_image, frames, generator, kp_detector, process_kp_driving, 1, parser.relative, parser.adapt_movement_scale)
```

and comment out

```
kp_driving = process_kp_driving(kp_driving, kp_source, relative, adapt_movement_scale)
```

as it's still failing even with `batch_size == 1`. That seems like a TFLite-related issue with `tflite.resize_tensor_input`.
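For reference, the dynamic-batch behaviour of `resize_tensor_input` can be exercised on a toy model; the `double` function below is a hypothetical stand-in for the converted models, not code from this repo:

```python
import numpy as np
import tensorflow as tf

# Trivial model with a dynamic batch dimension, standing in for kp_detector.
@tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
def double(x):
    return 2.0 * x

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [double.get_concrete_function()])
tflite_model = converter.convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
inp = interpreter.get_input_details()[0]

# Resize the input tensor for each batch size before allocating tensors.
for batch_size in (1, 4):
    interpreter.resize_tensor_input(inp['index'], [batch_size, 4])
    interpreter.allocate_tensors()
    interpreter.set_tensor(inp['index'], np.ones((batch_size, 4), np.float32))
    interpreter.invoke()
    out = interpreter.get_tensor(interpreter.get_output_details()[0]['index'])
    assert out.shape == (batch_size, 4)
```

If the resize itself crashes on one platform but not another, that points at the interpreter build rather than the model.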
Interesting, how did your code work without any issues on your machine? You probably have Windows/Linux.
This looks like an issue where input indices for tf lite interpreter (process_kp_driving_kp_driving_index, etc.) don't match the correct inputs. I changed it so it would find the correct indices using input names. Can you pull and try running again?
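A minimal sketch of that name-based lookup; the payload below is fabricated to match the shape of `tf.lite.Interpreter.get_input_details()`, and the tensor names are illustrative, not the repo's actual ones:

```python
def input_index_by_name(input_details, name_fragment):
    """Return the interpreter input index whose tensor name contains
    name_fragment; input ordering is not guaranteed to be stable."""
    for detail in input_details:
        if name_fragment in detail['name']:
            return detail['index']
    raise KeyError(f'no input tensor matching {name_fragment!r}')

# Illustrative payload shaped like interpreter.get_input_details()
details = [
    {'name': 'serving_default_kp_source:0', 'index': 2},
    {'name': 'serving_default_kp_driving:0', 'index': 0},
]
kp_driving_index = input_index_by_name(details, 'kp_driving')
```

Looking indices up by name makes the code robust to the interpreter enumerating inputs in a different order across platforms.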
Ok, now it works without commenting out step 3 (tested both with `batch_size == 1` and `batch_size == 4`):

```
# Step 3: process kp_driving
kp_driving = process_kp_driving(kp_driving, kp_source, relative, adapt_movement_scale)
```
But I've used tflite models built with the previous version of `build.py`, because the new one fails at `kp_detector_tflite = kp_detector_converter.convert()`.

Edit: successfully used the new models after fixing `build.py` by adding `tf.lite.OpsSet.SELECT_TF_OPS` to `kp_detector_converter`.
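A hedged sketch of that fix. `kp_detector_stub` is a hypothetical stand-in for the real model; only the `supported_ops` assignment reflects the actual change:

```python
import tensorflow as tf

# Hypothetical stand-in for the real kp_detector function being converted.
@tf.function(input_signature=[tf.TensorSpec([1, 3], tf.float32)])
def kp_detector_stub(x):
    return tf.nn.relu(x)

kp_detector_converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [kp_detector_stub.get_concrete_function()])
# The fix: allow falling back to full TensorFlow ops for anything the
# builtin TFLite op set cannot express, instead of failing the conversion.
kp_detector_converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
kp_detector_tflite = kp_detector_converter.convert()
```

Note that models converted with `SELECT_TF_OPS` need the Select TF ops runtime linked into the app, which matters for the iOS deployment mentioned below.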
Since testlite.py itself works and the remaining issue is with build.py, I'm closing this issue.
Sure, soon I'll try to run it on iOS, will report back.
@lshug I've successfully built tflite models, but running `testlite.py` on MacOS fails with this:

`test.py` and `run.py` work ok, though.

Also, I've added this at line 35 of `build.py` to bypass the error from #3:

Commenting out this line in `animate.py` produces another error:

My env:

requirements.txt: