Closed yuan-shuo closed 5 months ago
Hello! I have a question about the directly downloaded re_robert.pth model. After placing it in the directory, I ran predict.py directly, and the output looks off. Do I need to first run run.py following the steps in the documentation before the model works properly?
```
warnings.warn("dropout option adds dropout after all but last "
[2024-05-04 20:52:26,483][main][INFO] - model name: lm
[2024-05-04 20:52:26,484][main][INFO] - LM(
  (bert): BertModel(
    (embeddings): BertEmbeddings(
      (word_embeddings): Embedding(21128, 768, padding_idx=0)
      (position_embeddings): Embedding(512, 768)
      (token_type_embeddings): Embedding(2, 768)
      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
      (dropout): Dropout(p=0.1, inplace=False)
    )
    (encoder): BertEncoder(
      (layer): ModuleList(
        (0): BertLayer(
          (attention): BertAttention(
            (self): BertSelfAttention(
              (query): Linear(in_features=768, out_features=768, bias=True)
              (key): Linear(in_features=768, out_features=768, bias=True)
              (value): Linear(in_features=768, out_features=768, bias=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
            (output): BertSelfOutput(
              (dense): Linear(in_features=768, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (intermediate): BertIntermediate(
            (dense): Linear(in_features=768, out_features=3072, bias=True)
            (intermediate_act_fn): GELUActivation()
          )
          (output): BertOutput(
            (dense): Linear(in_features=3072, out_features=768, bias=True)
            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
        )
      )
    )
    (pooler): BertPooler(
      (dense): Linear(in_features=768, out_features=768, bias=True)
      (activation): Tanh()
    )
  )
  (bilstm): RNN(
    (rnn): LSTM(768, 50, batch_first=True, dropout=0.3, bidirectional=True)
  )
  (fc): Linear(in_features=100, out_features=51, bias=True)
  (dropout): Dropout(p=0.3, inplace=False)
)
[2024-05-04 20:52:29,070][main][INFO] - "男人的爱" 和 "人生长路" 在句中关系为:"毕业院校",置信度为1.00。
```
(The last log line says the predicted relation between "男人的爱" and "人生长路" is "毕业院校" (graduated from), with confidence 1.00.)
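For context on where the relation name and confidence come from: the final `fc` layer maps the BiLSTM features to 51 relation logits, and a predict step typically takes the softmax argmax as the label index and its probability as the confidence. Below is a minimal stdlib sketch of that last step only; the logit values and the dominant class index are made up for illustration, and this is not DeepKE's actual predict.py code:

```python
import math

def softmax(logits):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# hypothetical raw scores for 51 relation classes (matches fc's out_features=51)
logits = [0.0] * 51
logits[7] = 12.0  # one strongly dominant class, chosen arbitrarily

probs = softmax(logits)
pred = max(range(len(probs)), key=lambda i: probs[i])  # argmax -> class index
confidence = probs[pred]

print(pred, round(confidence, 2))  # → 7 1.0
```

Note that the index only becomes a relation name through a separate label list; if the labels the script loads do not line up with the checkpoint's 51 output classes, the argmax can map to the wrong relation name and produce confident but nonsensical predictions like the one in the log above.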
Hello, following the tutorial in README_CNSCHEMA_CN.md, there is no need to run run.py; you can run predict.py directly.
Has your issue been resolved?
More or less, thank you. I probably entered a parameter incorrectly.