Open · yunhao-qian opened 6 months ago
`depth_limit` is set in `hector train`, but by a somewhat convoluted path: a `Predictor` is constructed with the parameter `embedding_steps` set from the command-line option `--embedding-steps` (default value: 4), and then `predictor.depth_limit` is passed to `TrainingData.from_parameters`. In the `Predictor`, the `depth_limit` attribute is defined to just expose the `embedding_steps` parameter.

The `BatchNorm` module was left in by mistake.
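The wiring described above can be sketched roughly like this (class and parameter names follow the discussion in this thread; the actual HECTOR source may differ in details):

```python
import argparse


class TrainingData:
    """Stub standing in for HECTOR's training-data builder."""

    @classmethod
    def from_parameters(cls, depth_limit: int):
        # In HECTOR, depth_limit would bound sub-ePDG depth here.
        data = cls()
        data.depth_limit = depth_limit
        return data


class Predictor:
    def __init__(self, embedding_steps: int = 4):
        self.embedding_steps = embedding_steps

    @property
    def depth_limit(self) -> int:
        # depth_limit is not an independent setting: it simply
        # exposes embedding_steps.
        return self.embedding_steps


def train(argv=None):
    # `hector train`-style entry point with the --embedding-steps option.
    parser = argparse.ArgumentParser(prog="hector train")
    parser.add_argument("--embedding-steps", type=int, default=4)
    args = parser.parse_args(argv)

    predictor = Predictor(embedding_steps=args.embedding_steps)
    # predictor.depth_limit (== embedding_steps) flows into data building.
    return TrainingData.from_parameters(depth_limit=predictor.depth_limit)
```

In other words, the only way to change the effective depth limit at training time is through `--embedding-steps`.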
Hi, thank you for the excellent paper and for open-sourcing the project. I am working on a new project that modifies and uses your code, and I have two questions.
1. There is a `depth_limit` option for controlling the depth of sub-ePDGs at runtime, but this option is unused in the `hector train` command line, which means that the sub-ePDG of a manifestation point always includes all of its predecessor nodes. There might be situations where a malicious sub-ePDG is a subgraph of another sub-ePDG labelled benign. Are these situations expected, or should we avoid them early in the data preprocessing step?

2. `model.py` has an orphan `BatchNorm` module that is left unused. Was there any reason behind that change? Or has it been replaced because the normalization performed by `hector feature_stats` achieves a similar goal?
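For context on the second question: if per-feature statistics are precomputed over the whole dataset and inputs are standardized with them up front, an input-side `BatchNorm` layer becomes largely redundant. A minimal sketch of that kind of normalization (function names and behavior here are illustrative assumptions about what a command like `hector feature_stats` might provide, not HECTOR's actual API):

```python
def feature_stats(rows):
    """Compute per-feature mean and population std over a dataset.

    `rows` is a list of equal-length feature vectors.
    """
    n = len(rows)
    dims = len(rows[0])
    means = [sum(r[d] for r in rows) / n for d in range(dims)]
    stds = [
        # Fall back to 1.0 for constant features to avoid division by zero.
        (sum((r[d] - means[d]) ** 2 for r in rows) / n) ** 0.5 or 1.0
        for d in range(dims)
    ]
    return means, stds


def standardize(row, means, stds):
    """Apply the precomputed statistics to a single feature vector."""
    return [(x - m) / s for x, m, s in zip(row, means, stds)]
```

Unlike `BatchNorm`, which estimates statistics per mini-batch during training, this uses fixed dataset-level statistics, so it behaves identically at training and inference time.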