Zhen-Dong / HAWQ

Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
MIT License

I have an Issue in test_resnet_inference.py #38

Open AlwaysHoon opened 2 years ago

AlwaysHoon commented 2 years ago

I tried to run test_resnet_inference.py, but I get a `TypeError: 'IntImm'`. How can I solve it?

```
(qt) dmsl3@dmsl3:~/jh/HAWQ/tvm_benchmark$ python test_resnet_inference.py --model-dir ./fix_y/
File synset.txt exists, skip.
Traceback (most recent call last):
  File "/home/dmsl3/jh/HAWQ/tvm_benchmark/test_resnet_inference.py", line 127, in <module>
    graph, lib, params = relay.build(func, target=TARGET_NAME, params=params)
  File "/home/dmsl3/tvm/python/tvm/relay/build_module.py", line 251, in build
    graph_json, mod, params = bld_mod.build(mod, target, target_host, params)
  File "/home/dmsl3/tvm/python/tvm/relay/build_module.py", line 114, in build
    target = _update_target(target)
  File "/home/dmsl3/tvm/python/tvm/relay/build_module.py", line 47, in _update_target
    dev_type = tvm_expr.IntImm("int32", _nd.context(dev).device_type)
  File "/home/dmsl3/tvm/python/tvm/runtime/ndarray.py", line 240, in context
    return TVMContext(dev_type, dev_id)
  File "/home/dmsl3/tvm/python/tvm/_ffi/runtime_ctypes.py", line 175, in __init__
    self.device_type = device_type
TypeError: 'IntImm' object cannot be interpreted as an integer
```
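For context on the error class itself (not the HAWQ/TVM fix): `TVMContext` is a ctypes structure, and its `device_type` field requires a real Python integer. The traceback shows it being handed a `tvm.tir.IntImm` wrapper object instead, which Python "cannot interpret as an integer". The sketch below reproduces that mechanism without TVM installed; `IntImmLike` and `TVMContextLike` are hypothetical stand-ins, not TVM classes.

```python
import ctypes

class IntImmLike:
    """Hypothetical stand-in for tvm.tir.IntImm: wraps an int but is not one,
    and defines no __index__, so ctypes cannot coerce it."""
    def __init__(self, value):
        self.value = value

class TVMContextLike(ctypes.Structure):
    """Hypothetical stand-in for TVMContext in tvm/_ffi/runtime_ctypes.py."""
    _fields_ = [("device_type", ctypes.c_int), ("device_id", ctypes.c_int)]

dev_type = IntImmLike(1)

# Passing the wrapper object raises TypeError, analogous to the traceback above.
try:
    TVMContextLike(dev_type, 0)
except TypeError as e:
    print("TypeError:", e)

# Unwrapping to a plain int before constructing the structure works.
ctx = TVMContextLike(int(dev_type.value), 0)
print(ctx.device_type)  # 1
```

In the real traceback the mismatch happens inside TVM itself (`_update_target` passes an `IntImm` where an `int` is expected), which typically points to a TVM version that does not match the one HAWQ's scripts were written against, rather than something fixable in user code.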