LechengKong / OneForAll

A foundational graph learning framework that solves cross-domain and cross-task classification problems with a single model.
MIT License

About zero shot task #17

Closed W-rudder closed 4 months ago

W-rudder commented 4 months ago

I think this is fantastic work, and it's been incredibly interesting for me! However, I have a question I'd like to trouble you with. I'm curious about the transferability of the model: for instance, training the model on the arXiv training set in a supervised setup, then testing it on the PubMed and Cora test sets. Can the existing code do this? Specifically, how should I modify the configuration file? Looking forward to your reply! Wishing you smooth work and a happy life!

W-rudder commented 4 months ago

Is this the correct way to modify it? For `config/task_config.yaml`:


```yaml
arxiv_transfer: &arxiv_transfer
  <<: *E2E-node
  dataset: arxiv
  eval_set_constructs:
    - stage: train
      split_name: train
      dataset: arxiv
    - stage: valid
      split_name: valid
      dataset: arxiv
    - stage: test
      split_name: test
      dataset: arxiv
    - stage: test
      split_name: test
      dataset: pubmed_node
    - stage: test
      split_name: test
      dataset: cora_node
```
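For clarity, here is a plain-Python sketch of what I understand the config to mean: the YAML merge key `<<: *E2E-node` copies the base task's fields (the base fields shown here are illustrative placeholders, not the repo's actual values), and the entries with `stage: test` are the splits the trained model is evaluated on, including the unseen PubMed and Cora datasets.

```python
# Illustrative stand-in for the *E2E-node anchor; field values are assumptions.
base_task = {"dataset": "base", "task_level": "e2e_node"}

arxiv_transfer = {
    **base_task,  # YAML's `<<: *E2E-node` merge key behaves like this dict unpacking
    "dataset": "arxiv",  # later keys override the merged base fields
    "eval_set_constructs": [
        {"stage": "train", "split_name": "train", "dataset": "arxiv"},
        {"stage": "valid", "split_name": "valid", "dataset": "arxiv"},
        {"stage": "test", "split_name": "test", "dataset": "arxiv"},
        {"stage": "test", "split_name": "test", "dataset": "pubmed_node"},
        {"stage": "test", "split_name": "test", "dataset": "cora_node"},
    ],
}

# The model trains only on the single `train` stage and is then evaluated on
# every `test` entry, which is what makes PubMed and Cora zero-shot targets.
test_datasets = [
    e["dataset"]
    for e in arxiv_transfer["eval_set_constructs"]
    if e["stage"] == "test"
]
print(test_datasets)  # ['arxiv', 'pubmed_node', 'cora_node']
```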
LechengKong commented 4 months ago

Hi @W-rudder , yes, this is the right config.

W-rudder commented 4 months ago

> Hi @W-rudder , yes, this is the right config.

Thanks for your reply! I learned a lot from your paper and codes. Best wishes to you^_^