-
SHIP is a very interesting work with wide applications. Thank you for making the code public.
May I know how to generate the GZSL results shown in the paper? Can you provide the code base for the sa…
-
![image](https://user-images.githubusercontent.com/129525/63389681-d9c9ec80-c3ef-11e9-8297-73fd8162143e.png)
https://arxiv.org/pdf/1908.05832.pdf
silky updated
5 years ago
-
A survey paper on Generalized Zero-Shot Learning, the task of correctly assigning test data even to labels that do not appear in the training data.
https://arxiv.org/abs/2011.08641
-
-
Hi, I ran some experiments on MIT-States.
From my understanding, this is basically a kind of zero-shot learning, branded as an "unseen combination" recognition task by the Red Wine paper.
I have a…
-
Why does `unsenn` appear in the loss function even though it is not used during the training phase to mitigate bias?
-
Only unseen=8.8357, seen=87.9500, h=16.0581,
unseen class accuracy = 64.2205472255
May I ask which hyperparameter would influence the result so much?
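For context, the `h` reported above is the harmonic mean of the seen and unseen accuracies, which is dominated by the lower of the two; a minimal sketch (not from the repo) verifying the numbers:

```python
# Harmonic mean H used in GZSL evaluation: H = 2 * S * U / (S + U).
# It is pulled toward the smaller of the two accuracies, which is why
# a low unseen accuracy drags H down despite a high seen accuracy.
def harmonic_mean(seen: float, unseen: float) -> float:
    return 2.0 * seen * unseen / (seen + unseen)

print(harmonic_mean(87.9500, 8.8357))  # ≈ 16.058, matching h above
```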
-
When executing zero-shot, the following `probs` is empty:

```python
topics, probs = topic_model.fit_transform(docs, embeddings)
```

with the following configuration:

```python
from umap import UMAP
umap_model = UMAP(n_neighbo…
```
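Not part of the original report, but one common cause worth ruling out: BERTopic's `fit_transform` returns `probs` as `None` unless probability calculation is enabled on the model. A configuration sketch under that assumption (the UMAP parameter values below are the usual BERTopic-documentation defaults, not taken from the truncated issue):

```python
from umap import UMAP
from bertopic import BERTopic

# Illustrative UMAP settings (assumed; the issue's own values are truncated).
umap_model = UMAP(n_neighbors=15, n_components=5,
                  min_dist=0.0, metric="cosine", random_state=42)

# calculate_probabilities=True asks the model to return per-document
# topic probabilities instead of leaving `probs` empty.
topic_model = BERTopic(umap_model=umap_model, calculate_probabilities=True)
# topics, probs = topic_model.fit_transform(docs, embeddings)
```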
-
Suggest a paper you would like us to discuss in our next meeting. You can upvote a paper using the :thumbsup: emoji.
The paper with the most upvotes by the end of the week (Friday) will be chosen.
-
# Vision Transformer Adapter for Dense Predictions
Info.
- ICLR 2023 spotlight
- https://github.com/czczup/ViT-Adapter
- https://arxiv.org/abs/2205.08534
### Summary
- plain ViT
- whi…