Open leoluopy opened 4 years ago
For some additional information: I have tried using ncnn::set_cpu_powersave(2), but the function returns -1, so it is not working.
Are you setting the net option to perform the inference on the Vulkan GPU? option.use_vulkan_compute = true; From my experience the GPU inference is relatively slow at first, but after a few warmups the inference time decreases to a stable value.
Yes, I have set the option: option.use_vulkan_compute = true;
I am running CenterFace on my desktop and using the ncnn GPU backend (Vulkan) for inference. I am hitting the following problem: when I first launch the program, inference costs about 6-8 ms, but after about 20 seconds it rises to about 20-40 ms and never goes back down. Has anyone met this problem? Any suggestions?