Open rnwang04 opened 2 years ago
Talked about this offline; we will enhance the `get_forward_args` function.
Talked about with @TheaperDeng. Basic solution: https://github.com/intel-analytics/BigDL/blob/25ad426630e913d615188ca4fee670aa9521fd79/python/nano/src/bigdl/nano/utils/inference/pytorch/model_utils.py#L25 ->

```python
import inspect

forward_args = inspect.getfullargspec(model.forward).args
if forward_args[0] == 'self':
    forward_args = forward_args[1:]
```
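For context, a minimal self-contained sketch of what this inspection does. `MyModel` and `get_forward_args` here are hypothetical stand-ins (a plain class replaces `torch.nn.Module` so the sketch runs without torch):

```python
import inspect


class MyModel:
    """Hypothetical stand-in for a torch.nn.Module, used only to
    illustrate the argument inspection."""

    def forward(self, x, y=None):
        return x


def get_forward_args(model):
    # inspect.getfullargspec on a bound method still reports the
    # implicit 'self', so it is stripped before returning
    forward_args = inspect.getfullargspec(model.forward).args
    if forward_args[0] == 'self':
        forward_args = forward_args[1:]
    return forward_args


print(get_forward_args(MyModel()))  # -> ['x', 'y']
```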
My calib dataloader has bsz = 1. After I quantize my model, I get an error when I run model_int8 on a dataloader with bsz = 16; the error message is `There are only one input x in the batch`. I found this error is caused by this line: https://github.com/intel-analytics/BigDL/blob/f682c06c54f7bf4afffd4ce10789bee5564bc9c9/python/nano/src/bigdl/nano/utils/inference/pytorch/model_utils.py#L25 .