wuyifan18 / DeepLog

PyTorch implementation of DeepLog.
MIT License

RuntimeError: required keyword attribute 'name' has the wrong type #36

Closed Huhu-ooo closed 4 years ago

Huhu-ooo commented 4 years ago

```
runfile('/Users/mandevay/LogKey_Model_Train.py', wdir='/Users/mandevay')
Number of sessions(hdfs_train.txt): 4856
Number of seqs(hdfs_train.txt): 46573
Traceback (most recent call last):
  File "/Users/mandevay/LogKey_Model_Train.py", line 99, in <module>
    writer.add_graph(model, seq)
  File "/anaconda2/envs/py36/lib/python3.6/site-packages/torch/utils/tensorboard/writer.py", line 707, in add_graph
    self._get_file_writer().add_graph(graph(model, input_to_model, verbose))
  File "/anaconda2/envs/py36/lib/python3.6/site-packages/torch/utils/tensorboard/_pytorch_graph.py", line 295, in graph
    list_of_nodes = parse(graph, trace, args)
  File "/anaconda2/envs/py36/lib/python3.6/site-packages/torch/utils/tensorboard/_pytorch_graph.py", line 219, in parse
    attr_name = node.s('name')
RuntimeError: required keyword attribute 'name' has the wrong type
```

Settings: python==3.6, pytorch==1.4.0, tensorboard==2.0.0

When I first ran the code, I ran into this error and could not find any solution via Google. Could you give me any suggestions? Thank you.

wuyifan18 commented 4 years ago

Sorry, I tried the same versions as you, but I have not encountered this problem...
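Since the error is raised from the TensorBoard graph export, one possible workaround (just a sketch, I have not reproduced the error myself) is to guard the `writer.add_graph` call so that training continues even when graph tracing fails; `writer`, `model` and `seq` here are the same objects as in your traceback:

```python
# Graph export is only for visualization; if tracing fails on this
# PyTorch/TensorBoard combination, skip it instead of crashing the run.
try:
    writer.add_graph(model, seq)
except RuntimeError as e:
    print('Skipping add_graph, graph tracing failed:', e)
```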

Huhu-ooo commented 4 years ago

Thanks for your response! In fact, I still have no idea why it happened, and after I ran the code on another computer, it worked... Then I tried to understand the details of the output, so I added some code and printed the results quoted below. In the paper, the output of the log key model is a conditional probability, but the output of the code is just some decimals. Could you give me some guidance? Thank you very much!

ADDED CODE

```python
# Test the model
start_time = time.time()
with torch.no_grad():
    for line in test_normal_loader:
        for i in range(len(line) - window_size):
            seq = line[i:i + window_size]
            label = line[i + window_size]
            seq = torch.tensor(seq, dtype=torch.float).view(-1, window_size, input_size).to(device)
            label = torch.tensor(label).view(-1).to(device)
            output = model(seq)
            print("this is the output of every normal sample:")
            print(output)
            print(output.shape)
            # indices of the num_candidates largest scores
            predicted = torch.argsort(output, 1)[0][-num_candidates:]
            print("this is the predicted of every normal sample:")
            print(predicted)
            for idx in predicted:
                print(output[0][idx])
            print("\n")
            if label not in predicted:
                FP += 1
                break
```

OUTPUT

```
this is the output of every normal sample:
tensor([[-4.3343,  6.6010,  6.7188,  5.5074, -3.1904, -2.5582, -3.9527, -3.7992,
         -5.8223, -3.8012, -2.9275, -4.1057, -3.0611, -3.4921, -4.2829, -0.1479,
         -4.0095, -1.6966, -3.3788, -4.1235, -3.4248, -1.4330,  7.0985, -3.5261,
         -1.2776, -0.5570, -4.3929, -3.8631]])
torch.Size([1, 28])
this is the predicted of every normal sample:
tensor([17, 21, 24, 25, 15,  3,  1,  2, 22])
tensor(-1.6966)
tensor(-1.4330)
tensor(-1.2776)
tensor(-0.5570)
tensor(-0.1479)
tensor(5.5074)
tensor(6.6010)
tensor(6.7188)
tensor(7.0985)
```

wuyifan18 commented 4 years ago

@Mandevay you can use this code to get the probability. Actually, it won't affect the results.

```python
prob = torch.nn.functional.softmax(output, dim=1)
print(prob)
```
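To make the "it won't affect the results" point concrete, here is a small sketch (the scores are copied from the output printed above, and `num_candidates = 9` is assumed to match the nine candidates shown there): softmax is strictly increasing, so sorting the raw scores or the probabilities selects the same candidate set.

```python
import torch

# Raw scores ("decimals") from the final linear layer, as printed above.
output = torch.tensor([[-4.3343,  6.6010,  6.7188,  5.5074, -3.1904, -2.5582, -3.9527, -3.7992,
                        -5.8223, -3.8012, -2.9275, -4.1057, -3.0611, -3.4921, -4.2829, -0.1479,
                        -4.0095, -1.6966, -3.3788, -4.1235, -3.4248, -1.4330,  7.0985, -3.5261,
                        -1.2776, -0.5570, -4.3929, -3.8631]])
num_candidates = 9

# Conditional probabilities over the 28 log keys (each row sums to 1).
prob = torch.nn.functional.softmax(output, dim=1)

# The same top candidates come out whether we rank scores or probabilities.
top_by_score = torch.argsort(output, 1)[0][-num_candidates:]
top_by_prob = torch.argsort(prob, 1)[0][-num_candidates:]
assert torch.equal(top_by_score, top_by_prob)

print(top_by_prob)           # tensor([17, 21, 24, 25, 15,  3,  1,  2, 22])
print(prob[0][top_by_prob])  # probabilities of the nine candidates
```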

Huhu-ooo commented 4 years ago

OK, thank you very much ^^