WeiHao97 opened this issue 2 years ago
This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.
Environment info
adapter-transformers version: 3.0.1

Information
Model I am using (Bert, XLNet ...): Roberta
Language I am using the model on (English, Chinese ...): N/A
Adapter setup I am using (if any): RobertaAdapterModel
The problem arises when using:
The task I am working on is:
To reproduce
Steps to reproduce the behavior:
```python
import inspect

import torch
from transformers.adapters import AutoAdapterModel
from transformers.utils import fx

model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("qa/squad2@ukp", config="houlsby")
model.set_active_adapters(adapter_name)

input_names = ["input_ids", "attention_mask"]
sig = inspect.signature(model.forward)
concrete_args = {p.name: None for p in sig.parameters.values() if p.name not in input_names}
tracer = fx.HFTracer()
traced_graph = tracer.trace(model, concrete_args=concrete_args)
traced = torch.fx.GraphModule(model, traced_graph)
```
```
---------------------------------------------------------------------------
TraceError                                Traceback (most recent call last)
Input In [3], in <cell line: 13>()
     11 concrete_args = {p.name: None for p in sig.parameters.values() if p.name not in input_names}
     12 tracer = fx.HFTracer()
---> 13 traced_graph = tracer.trace(model, concrete_args=concrete_args)
     14 traced = torch.fx.GraphModule(model, traced_graph)

File ~/anaconda3/lib/python3.9/site-packages/transformers/utils/fx.py:475, in HFTracer.trace(self, root, concrete_args, method_names)
    471 self._autowrap_function_ids.update(set([id(f) for f in autowrap_functions]))
    473 self._patch_leaf_functions_for_root(root)
--> 475 self.graph = super().trace(root, concrete_args=concrete_args)
    477 self._patch_leaf_functions_for_root(root, restore=True)
    479 _reset_tensor_methods(self.original_methods)

File ~/anaconda3/lib/python3.9/site-packages/torch/fx/_symbolic_trace.py:615, in Tracer.trace(self, root, concrete_args)
    613 for module in self._autowrap_search:
    614     _autowrap_check(patcher, module.__dict__, self._autowrap_function_ids)
--> 615 self.create_node('output', 'output', (self.create_arg(fn(*args)),), {},
    616                  type_expr=fn.__annotations__.get('return', None))
    618 self.submodule_paths = None
    620 return self.graph

File ~/anaconda3/lib/python3.9/site-packages/transformers/adapters/models/roberta.py:84, in RobertaAdapterModel.forward(self, input_ids, attention_mask, token_type_ids, position_ids, head_mask, inputs_embeds, output_attentions, output_hidden_states, return_dict, head, **kwargs)
     81 pooled_output = outputs[1]
     83 if head or AdapterSetup.get_context_head_setup() or self.active_head:
---> 84     head_outputs = self.forward_head(
     85         head_inputs,
     86         head_name=head,
     87         attention_mask=attention_mask,
     88         return_dict=return_dict,
     89         pooled_output=pooled_output,
     90         **kwargs,
     91     )
     92     return head_outputs
     93 else:
     94     # in case no head is used just return the output of the base model (including pooler output)

File ~/anaconda3/lib/python3.9/site-packages/torch/fx/proxy.py:248, in Proxy.__iter__(self)
    245 if inst.opname == 'UNPACK_SEQUENCE':
    246     return (self[i] for i in range(inst.argval))  # type: ignore[index]
--> 248 return self.tracer.iter(self)

File ~/anaconda3/lib/python3.9/site-packages/torch/fx/proxy.py:161, in TracerBase.iter(self, obj)
    154 @compatibility(is_backward_compatible=True)
    155 def iter(self, obj: 'Proxy') -> Iterator:
    156     """Called when a proxy object is being iterated over, such as
    157     when used in control flow. Normally we don't know what to do because
    158     we don't know the value of the proxy, but a custom tracer can attach more
    159     information to the graph node using create_node and can choose to return an iterator.
    160     """
--> 161     raise TraceError('Proxy object cannot be iterated. This can be '
    162                      'attempted when the Proxy is used in a loop or'
    163                      ' as a *args or **kwargs function argument. '
    164                      'See the torch.fx docs on pytorch.org for a '
    165                      'more detailed explanation of what types of '
    166                      'control flow can be traced, and check out the'
    167                      ' Proxy docstring for help troubleshooting '
    168                      'Proxy iteration errors')

TraceError: Proxy object cannot be iterated. This can be attempted when the Proxy is used in a loop or as a *args or **kwargs function argument. See the torch.fx docs on pytorch.org for a more detailed explanation of what types of control flow can be traced, and check out the Proxy docstring for help troubleshooting Proxy iteration errors
```
Expected behavior
This happens because a symbolic tensor is passed to the `if` statement in `RobertaAdapterModel.forward` (line 83 of `roberta.py`). So it seems to me that your repo does not support tracing models with heads, while the transformers repo does, because its model `forward` does not contain this `if` statement.
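For context, here is a minimal pure-Python sketch of the failure mode (no torch needed; `FakeProxy` is a hypothetical stand-in for `torch.fx.Proxy`, not the real class). During symbolic tracing every tensor is a `Proxy` whose concrete value is unknown, so using it in an `if` (which calls `__bool__`) or unpacking it as `*args`/`**kwargs` (which calls `__iter__`) cannot be recorded into the graph and raises instead, which is what the traceback above reports:

```python
# FakeProxy is a hypothetical stand-in for torch.fx.Proxy, which raises
# torch.fx.proxy.TraceError from the same dunder hooks during tracing.
class FakeProxy:
    def __bool__(self):
        # A symbolic value has no concrete truth value while the
        # computation graph is still being recorded.
        raise TypeError("Proxy object cannot be evaluated as a bool during tracing")

    def __iter__(self):
        # Same story for unpacking, e.g. as a *args/**kwargs argument.
        raise TypeError("Proxy object cannot be iterated")


def forward(head, active_head):
    # Mirrors the shape of the branch in RobertaAdapterModel.forward:
    # `if head or ...` forces Python to call bool() on `head` --
    # a FakeProxy here, a real Proxy under fx tracing.
    if head or active_head:
        return "head path"
    return "base path"


try:
    forward(FakeProxy(), None)
except TypeError as exc:
    print(f"trace-time failure: {exc}")
```

With concrete inputs the same `forward` runs fine (`forward(None, True)` returns `"head path"`); only the symbolic trace-time call fails, which is why the plain transformers models, whose `forward` avoids such branches on traced values, remain traceable.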