Open Jackycheng0808 opened 7 months ago
```python
@onnx_op(op_type="GPT2Tokenizer",
         inputs=[PyCustomOpDef.dt_string],
         outputs=[PyCustomOpDef.dt_int64, PyCustomOpDef.dt_int64],
         attrs={"padding_length": PyCustomOpDef.dt_int64})
def bpe_tokenizer(s, **kwargs):
    padding_length = kwargs["padding_length"]
    input_ids, attention_mask = cls.tokenizer.tokenizer_sentence([s[0]], padding_length)
    return input_ids, attention_mask
```
The `op_type` should match the function name; change the decorator to `@onnx_op(op_type="bpe_tokenizer", ...)`.
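For reference, here is a minimal sketch of what the corrected registration plus inference setup could look like. Only the decorator arguments come from the snippet above; the model filename, the graph input name `text`, and the `tokenizer` object are placeholders, not taken from the issue.

```python
import numpy as np
import onnxruntime as ort
from onnxruntime_extensions import onnx_op, PyCustomOpDef, get_library_path


# op_type matches the function name, as suggested in the reply above.
@onnx_op(op_type="bpe_tokenizer",
         inputs=[PyCustomOpDef.dt_string],
         outputs=[PyCustomOpDef.dt_int64, PyCustomOpDef.dt_int64],
         attrs={"padding_length": PyCustomOpDef.dt_int64})
def bpe_tokenizer(s, **kwargs):
    padding_length = kwargs["padding_length"]
    # 'tokenizer' stands in for the issue's cls.tokenizer wrapper; it must
    # return two int64 numpy arrays (input_ids, attention_mask).
    input_ids, attention_mask = tokenizer.tokenizer_sentence([s[0]], padding_length)
    return input_ids, attention_mask


# Register the custom-op library on the session options so ONNX Runtime can
# resolve the Python op (Python custom ops live in the "ai.onnx.contrib" domain).
so = ort.SessionOptions()
so.register_custom_ops_library(get_library_path())
sess = ort.InferenceSession("model_with_bpe_tokenizer.onnx", so,
                            providers=["CPUExecutionProvider"])
print(sess.run(None, {"text": np.array(["hello world"])}))
```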