So, according to the paper and to my understanding, label embeddings are useful for giving features to the labels, so that the graph can more easily capture the relations between them. It is like giving pre-trained word embeddings to words: the model hopefully learns the characteristics of each word through its vector, and so classifies the classes better.
What is the purpose of label embedding in this context? Does label embedding mean finding the class names within the text and creating embeddings for those found class names? I am trying the same model on a different dataset, but I am unable to figure out the purpose of the label embedding. Could you clarify?
I am trying to do it with a BERT model.
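For reference, here is a minimal sketch of what I currently assume label embedding means: encoding the class names themselves with BERT to get one feature vector per label. The class names below are just placeholders for my dataset, not from the paper. Please correct me if this is not what is intended:

```python
# Sketch of my assumption: embed each class name with BERT and use the
# resulting vectors as the initial label features. Class names are made up.
import torch
from transformers import BertTokenizer, BertModel

label_names = ["sports", "politics", "technology"]  # hypothetical labels

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

with torch.no_grad():
    inputs = tokenizer(label_names, padding=True, return_tensors="pt")
    outputs = bert(**inputs)
    # Mean-pool the token embeddings of each class name
    # -> shape (num_labels, hidden_size)
    mask = inputs["attention_mask"].unsqueeze(-1)
    label_embeddings = (outputs.last_hidden_state * mask).sum(1) / mask.sum(1)

print(label_embeddings.shape)  # torch.Size([3, 768])
```

Is this roughly how the label embeddings in the paper are built, or are they learned/obtained some other way?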
Thank you