Running the notebook with Python 3 on a TX2 with JetPack 3.3. I followed the instructions and I am measuring the inference time as follows:

from time import time

start = time()
output = tf_sess.run(tf_output, feed_dict={
    tf_input: image[None, ...]
})
end = time()
print("Inference time: {}s".format(end - start))
scores = output[0]

Using the same examples as the notebook (inception_v1 etc.), I got an inference time of 0.8 seconds, pretty far from the 7 ms described. I also used:

sudo nvpmodel -m 0
sudo ~/jetson_clocks.sh
I don't know if this is still relevant, but the first inference is painfully slow (warm-up). Usually you discard the first inference and average the next ten runs.
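A minimal sketch of that timing approach, assuming the same tf_sess, tf_output, tf_input, and image variables from the snippet above; the first run is discarded as warm-up and the reported figure is the mean over the next ten runs:

from time import time

# Warm-up: the first run pays one-time costs (graph/engine setup, CUDA init), so discard it
tf_sess.run(tf_output, feed_dict={tf_input: image[None, ...]})

# Average the next ten runs for a more stable latency figure
num_runs = 10
start = time()
for _ in range(num_runs):
    output = tf_sess.run(tf_output, feed_dict={tf_input: image[None, ...]})
end = time()
print("Average inference time: {:.4f}s".format((end - start) / num_runs))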