nan1104 opened 3 years ago
Hi,
You can use the mmf.utils.build.build_processors
function to build the processors from the M4C config and register them in the registry in your inference script. See https://github.com/facebookresearch/mmf/blob/273a56b903d4bcd73467ddaa3605cf147311471d/mmf/models/interfaces/mmbt.py#L41
I already added this line, but inference on m4c.py still can't find self.answer_processor.
@nan1104 You will need to register your answer processor as textvqa_answer_processor
so that the M4C model can pick it up from the registry. Check this code to understand what the M4C model looks for: https://github.com/facebookresearch/mmf/blob/eef7028c82ee27d5ada9c04e05449ac6682554bd/mmf/models/m4c.py#L161
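The lookup described above can be sketched with a minimal, self-contained stand-in for mmf's registry (the real one lives in mmf.common.registry; the class and key names below mirror it but this is a simplified illustration, not mmf's actual implementation):

```python
class Registry:
    """Simplified stand-in for mmf.common.registry.registry."""

    mapping = {}

    @classmethod
    def register(cls, name, obj):
        cls.mapping[name] = obj

    @classmethod
    def get(cls, name, default=None):
        return cls.mapping.get(name, default)


class AnswerProcessor:
    """Stand-in for the TextVQA answer processor; BOS_IDX is the
    attribute that _forward_mmt_and_output reads at decode time."""

    BOS_IDX = 1


registry = Registry()

# Normally the dataset builder performs this registration. In a
# standalone inference script you must do it yourself, and the key
# has to be "<dataset>_answer_processor" to match what M4C looks up.
registry.register("textvqa_answer_processor", AnswerProcessor())

# What m4c.py effectively does when building its output layer:
answer_processor = registry.get("textvqa_answer_processor")
assert answer_processor is not None  # without registration this is None
print(answer_processor.BOS_IDX)
```

If the registration step is skipped, the get() call returns None, which is exactly why the traceback below ends with `'NoneType' object has no attribute 'BOS_IDX'`.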
🐛 Bug
I want to write an inference script for my pretrained M4C model based on mmf/utils/inference.py. However, I ran into a problem: when execution reaches the M4C model's _forward_mmt_and_output function to perform generation, it relies on self.answer_processor, but this answer_processor is registered by the corresponding dataset builder, so at inference time it is None. How can I perform generation at inference time without importing the dataset builder?
```
Traceback (most recent call last):
  File "inference.py", line 145, in <module>
    print(inf.forward(sample_info))
  File "inference.py", line 130, in forward
    output = self.model(sample_list)
  File "/home/hadoop-aipnlp/cephfs/data/wangruonan/mmf/mmf/models/base_model.py", line 236, in __call__
    model_output = super().__call__(sample_list, *args, **kwargs)
  File "/home/hadoop-aipnlp/.local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/hadoop-aipnlp/cephfs/data/wangruonan/mmf/mmf/models/m4c_noobj.py", line 157, in forward
    self._forward_mmt_and_output(sample_list, fwd_results)
  File "/home/hadoop-aipnlp/cephfs/data/wangruonan/mmf/mmf/models/m4c_noobj.py", line 253, in _forward_mmt_and_output
    fwd_results["prev_inds"][:, 0] = self.answer_processor.BOS_IDX
AttributeError: 'NoneType' object has no attribute 'BOS_IDX'
```