Open 32209056 opened 5 months ago
Change it to token_ids, mask = self.tokenizer.encode(d['text'], maxlen=self.max_len)
and give that a try.
Thanks, that fixed it.
You're welcome~
A question: the line token_ids, mask = self.tokenizer.encode(d['text'], max_length=self.max_len) throws an error. How should I fix it?

  File "/ddnstor/imu_tzt1/sulisha/GRTE-main/main.py", line 48, in iter
    d['text'], max_length=self.max_len
TypeError: encode() got an unexpected keyword argument 'max_length'
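The TypeError above is a keyword mismatch between tokenizer APIs: bert4keras's Tokenizer.encode takes the length limit as maxlen, while HuggingFace-style tokenizers use max_length. If the code has to work with either style, one option is to probe the function's signature before calling it. Below is a minimal sketch of that idea; encode_with_len and fake_encode are hypothetical names for illustration, not part of the repo.

```python
import inspect

def encode_with_len(tokenizer_encode, text, max_len):
    """Pass the length limit under whichever keyword encode() actually accepts.

    Probing the signature with inspect avoids the
    'unexpected keyword argument' TypeError seen in the traceback.
    """
    params = inspect.signature(tokenizer_encode).parameters
    if 'maxlen' in params:            # bert4keras-style keyword
        return tokenizer_encode(text, maxlen=max_len)
    if 'max_length' in params:        # HuggingFace-style keyword
        return tokenizer_encode(text, max_length=max_len)
    raise TypeError("encode() accepts neither 'maxlen' nor 'max_length'")

# Hypothetical stand-in for a bert4keras-style encode():
# returns (token_ids, segment_ids), truncated to maxlen tokens.
def fake_encode(text, maxlen=None):
    ids = list(range(len(text)))[:maxlen]
    return ids, [0] * len(ids)

token_ids, segment_ids = encode_with_len(fake_encode, 'hello world', 5)
```

This keeps the call site unchanged when switching tokenizer libraries; the simpler fix, as noted above, is just to rename the keyword to maxlen.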