zou-longkun opened this issue 4 years ago
Sorry about the delay, and thanks for your questions @zou-longkun!
According to the paper, the source domain is the remaining relationships from the top 50 in VG, which would mean more than a thousand relationships. So shouldn't these decision trees be designed to be deep?
To clarify, the "remaining relationships" over which we learn the DTs are the set of 50 relationships minus chosen relationships (see Figure A1 in the appendix).
Moreover, I want to know whether the Few Labeled Relationships cover all relationships from the source domain.
Yes, they do. Some classes have very few labels, but each of them should have at least a handful of ground-truth labels.
Last, I can't understand the meaning of the threshold "2 * random"; could you explain it in more detail?
We empirically found that a threshold of around 2x random chance performance works well.
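In case a concrete illustration helps: with C relationship classes, random chance accuracy is 1/C, so a 2x-random threshold is roughly 2/C. Below is a minimal sketch of one way such a cutoff could be applied, assuming a heuristic abstains whenever its top confidence falls below the cutoff; the function names and the abstain behaviour here are illustrative, not code from this repo.

```python
import numpy as np

def two_times_random_threshold(num_classes):
    """Random-chance accuracy for a uniform guess over num_classes classes
    is 1 / num_classes; the cutoff is set to twice that value."""
    return 2.0 / num_classes

def heuristic_vote(probs, num_classes):
    """Hypothetical helper: return the heuristic's predicted class only when
    its top confidence clears the 2x-random cutoff, otherwise abstain (-1)."""
    probs = np.asarray(probs)
    cutoff = two_times_random_threshold(num_classes)
    return int(np.argmax(probs)) if probs.max() >= cutoff else -1

# With 10 classes the cutoff is 0.2: a peak probability of 0.35 yields a
# label, while a nearly uniform distribution leads to an abstention.
print(heuristic_vote([0.35, 0.15] + [0.0625] * 8, num_classes=10))  # -> 0
print(heuristic_vote([0.1] * 10, num_classes=10))                   # -> -1
```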
I have some questions about the heuristics in the paper. According to the paper, the source domain is the remaining relationships from the top 50 in VG, which would mean more than a thousand relationships. So shouldn't these decision trees be designed to be deep? Then, you learn a generative model to combine J different heuristics, which seems like ensemble learning, right? Moreover, I want to know whether the Few Labeled Relationships cover all relationships from the source domain. Last, I can't understand the meaning of the threshold "2 * random"; could you explain it in more detail?
Looking forward to your reply