arnabkmondal opened 1 year ago
Thanks for the question. https://github.com/QueuQ/CGLB/blob/f51401600d869778d482caf98a151fd4fbe83ad6/NCGL/Backbones/utils.py#L136-L137 In the lines above, `remove_edge` is used to remove the edges from the retrieved subgraphs, so the inter-task edges do not participate in methods like ERGNN.
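The effect of that edge-removal step can be sketched in plain Python. Note this is a hedged illustration, not the repo's implementation (which operates on DGL graphs); the task-to-node mapping and function name below are hypothetical:

```python
def drop_inter_task_edges(edges, node_task):
    """Keep only edges whose two endpoints belong to the same task.

    edges: list of (src, dst) pairs from the retrieved subgraph
    node_task: dict mapping node id -> task id (hypothetical mapping)
    """
    return [(u, v) for (u, v) in edges if node_task[u] == node_task[v]]

# Toy subgraph: nodes 0-1 belong to task 0, nodes 2-3 to task 1.
node_task = {0: 0, 1: 0, 2: 1, 3: 1}
edges = [(0, 1), (1, 2), (2, 3)]  # (1, 2) crosses the task boundary
print(drop_inter_task_edges(edges, node_task))  # [(0, 1), (2, 3)]
```

After this filtering, the inter-task edge `(1, 2)` is gone, so message passing during replay cannot use it.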
Besides, it is true that the entire graph should not be available during training. The code you quoted only retrieves the stored node ids, which means only these buffered node ids are available for memory replay. Therefore, although the model retrieves the nodes from the dataset each time it performs memory replay, since it only gets the previously stored ids, it is the same as storing the nodes themselves and does not access the entire dataset anymore.
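A minimal sketch of why storing ids is equivalent to storing the nodes (the class and method names here are illustrative, not CGLB's API): the buffer records only the sampled ids, and replay indexes the dataset with exactly those ids, so no other node is ever touched.

```python
import random

class IDReplayBuffer:
    """Stores node ids only; replay looks up features by those ids."""
    def __init__(self):
        self.ids = []

    def store(self, task_node_ids, budget):
        # Sample a fixed budget of ids from the current task's nodes.
        self.ids.extend(random.sample(task_node_ids, budget))

    def replay_batch(self, features):
        # Index the full feature store with the buffered ids only;
        # every other entry in `features` is never accessed.
        return [features[i] for i in self.ids]

# Stand-in for node features of the whole dataset.
features = {i: f"x_{i}" for i in range(10)}
buf = IDReplayBuffer()
buf.store([0, 1, 2, 3], budget=2)  # only task-0 ids can enter the buffer
batch = buf.replay_batch(features)
print(len(batch))  # 2
```

Because `replay_batch` only dereferences the buffered ids, retrieving from the dataset at replay time and physically copying the nodes into the buffer yield the same set of visible nodes.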
In the `observe_class_IL_batch` function of `ergnn_model.py`, while sampling the subgraph corresponding to a task id t > 0, the code seems to sample the subgraph from the entire dataset. Therefore, this code seems to take an extra advantage in the class-incremental setting without inter-task edge connections in `pipeline_class_IL_no_inter_edge_minibatch`. Can you please clarify our concern?
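The concern can be made concrete with a small sketch (hypothetical function, not the repo's sampler): if 1-hop neighbors are drawn from the full-graph adjacency without restricting the candidate set, a seed node can pull in a neighbor from another task, which the no-inter-task-edge setting is supposed to forbid.

```python
def sample_neighbors(adj, seed_ids, allowed=None):
    """Return the 1-hop neighborhood of seed_ids.

    adj: dict node -> list of neighbors (full-graph adjacency)
    allowed: if given, neighbors outside this set are discarded,
             as the no-inter-task-edge setting requires.
    """
    out = set(seed_ids)
    for s in seed_ids:
        for n in adj[s]:
            if allowed is None or n in allowed:
                out.add(n)
    return out

# Toy graph: node 1 (task 0) is linked to node 5 (task 1).
adj = {0: [1], 1: [0, 5], 5: [1]}
print(sample_neighbors(adj, [0, 1]))                  # pulls in node 5
print(sample_neighbors(adj, [0, 1], allowed={0, 1}))  # stays within task 0
```

The unrestricted call leaks node 5 across the task boundary; restricting the candidate set (or removing inter-task edges afterwards, as in the `remove_edge` step above) keeps the sample within the current task.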