Closed Shun14 closed 5 years ago
@Shun14 Well done
Great! Thanks!
@Shun14 When I use your upgraded code to test a file, I happen to hit this error:

```
Traceback (most recent call last):
  File "main.py", line 447, in <module>
    decode_results = load_model_decode(model_dir, data, 'raw', gpu, seg)
  File "main.py", line 354, in load_model_decode
    speed, acc, p, r, f, pred_results = evaluate(data, model, name)
  File "main.py", line 152, in evaluate
    gaz_list, batch_word, batch_biword, batch_wordlen, batch_wordrecover, batch_char, batch_charlen, batch_charrecover, batch_label, mask = batchify_with_label(instance, data.HP_gpu, True)
  File "main.py", line 195, in batchify_with_label
    mask[idx, :seqlen] = torch.Tensor([1]*seqlen.cpu().numpy().tolist())
AttributeError: 'int' object has no attribute 'cpu'
```

Could you help me find out where it goes wrong? Thanks!
Maybe your PyTorch version is not 0.4.1. Because of the difference between the two versions of PyTorch, I changed the original code

```
[1]*seqlen
```

to

```
[1]*seqlen.cpu().numpy().tolist()
```
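The version difference can be sketched in plain Python (no PyTorch needed). `to_int` below is a hypothetical helper, not part of the repo's code: it accepts `seqlen` whether it arrives as a plain int or as a 0-d tensor exposing `.item()`.

```python
def to_int(seqlen):
    """Return a plain int whether seqlen is an int or a 0-d tensor."""
    if hasattr(seqlen, "item"):      # torch tensors expose .item()
        return int(seqlen.item())
    return int(seqlen)

# Under some PyTorch versions the batch length is already a plain int,
# so seqlen.cpu() raises AttributeError: 'int' object has no attribute 'cpu'.
seqlen = 7
row = [1] * to_int(seqlen)           # safe either way
print(row)                           # [1, 1, 1, 1, 1, 1, 1]
```

Guarding like this avoids pinning the code to one exact PyTorch release.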
@Shun14 This time I'm sure I'm using PyTorch 0.4.1, but with Python 3.6. However, I get plenty of warnings as well as an error like this:

```
fout.write(content_list[idx][0][idy].encode('utf-8') + " " + predict_results[idx][idy] + '\n')
TypeError: can't concat str to bytes
```

Are you sure there is nothing wrong when you run the code?
Oh, I only ran the code in the train and test modes. The error is about encoding. You can modify that part like this to solve it:

```
fout.write(str(content_list[idx][0][idy].encode('utf-8')) + " " + predict_results[idx][idy] + '\n')
```

or like this:

```
fout.write(content_list[idx][0][idy] + " " + predict_results[idx][idy] + '\n')
```
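For context, in Python 3 `str.encode('utf-8')` returns `bytes`, and concatenating `bytes` with `str` raises exactly the `TypeError` reported above. A minimal sketch (the token, label, and file name are made-up examples, not values from the repo):

```python
content = "中文"          # stand-in for content_list[idx][0][idy]
predicted = "B-LOC"       # stand-in for predict_results[idx][idy]

# Reproduce the reported error:
try:
    content.encode("utf-8") + " "
except TypeError as e:
    print(type(e).__name__)  # TypeError

# Fix: keep everything as str and let the text-mode file handle encoding.
with open("out.txt", "w", encoding="utf-8") as fout:
    fout.write(content + " " + predicted + "\n")
```

This is why the second suggested fix (dropping `.encode('utf-8')` entirely) is usually the cleaner one on Python 3: the file object already encodes on write.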
@Shun14 Thanks! I also used `warnings.filterwarnings` to filter the warnings, which makes the output look much cleaner. Anyhow, thank you for your help!
As the title shows, I upgraded the code. If you want to use the code on PyTorch 0.4.1, please refer to this: new_version