-
Our laboratory has limited computing resources (two 3090s). Given this constraint, do you have any recommended research directions, such as unsupervised learning or knowledge distillation, as my research begi…
-
There's a bunch of examples in the Python version, with documentation here: https://jxnl.github.io/instructor/examples/
Here are all the examples; only a few of them are interesting, so use the c…
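For orientation, here is a minimal sketch of the core pattern those examples build on: a Pydantic model passed as `response_model` so the patched client returns a validated object instead of raw text. The schema, model name, and prompt below are placeholders of mine, and this assumes the v1-style `instructor.from_openai` entry point (and an `OPENAI_API_KEY` in the environment):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel

# Hypothetical schema for illustration; the linked examples define their own.
class UserInfo(BaseModel):
    name: str
    age: int

# Wrap the OpenAI client so completions are parsed into the Pydantic model.
client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; swap in whatever chat model you use
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 25 years old."}],
)
print(user.name, user.age)
```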
-
New features planned for v4 will be added here.
### v4.0.x
- [x] Implement restore shaders for 1080p anime.
### v4.1.x
- [x] General exploration on whether using GANs can improve real-time upsc…
-
As the comments in [LINE_looped_runner_yolo.sh](https://github.com/NVlabs/DIODE/blob/yolo/scripts/LINE_looped_runner_yolo.sh) show, the authors use 28 GPUs to generate a dataset in 48 hours.
Can yo…
-
Hello, is this knowledge distillation implementation complete? I'm having some problems with it, and it doesn't seem to behave normally.
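For reference, a minimal PyTorch sketch of what a complete Hinton-style distillation objective usually contains: a hard-label cross-entropy term plus a temperature-softened KL term. The temperature and weighting below are placeholder values, not this repo's settings:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD: CE on hard labels + KL between softened distributions."""
    ce = F.cross_entropy(student_logits, labels)
    # T**2 rescales the soft term so its gradient magnitude matches the hard term.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1.0 - alpha) * kl
```

Comparing the repo's loss against a reference like this is one way to check whether the soft-target term is actually being applied.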
-
### Title: [Curriculum Learning for Dense Retrieval Distillation](https://dblp.org/pid/279/6463.html)
### Year: 2022
### Venue: SIGIR
### Main Problem
This research aims to improve the performan…
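Since the note is truncated, here is only a generic sketch of the curriculum idea the title names: order distillation examples from easy to hard and grow the training pool over epochs. The pacing scheme and the difficulty proxy (e.g., the teacher's score margin) are my assumptions, not the paper's method:

```python
def curriculum_pools(examples, difficulty, n_epochs):
    """Yield per-epoch training pools that grow from easiest to the full set.

    `difficulty(ex)` is any scalar proxy, e.g. a teacher score margin.
    """
    ordered = sorted(examples, key=difficulty)
    for epoch in range(1, n_epochs + 1):
        # Linear pacing: epoch 1 sees the easiest slice, the last epoch sees all.
        cutoff = max(1, len(ordered) * epoch // n_epochs)
        yield ordered[:cutoff]

# usage: for pool in curriculum_pools(data, lambda ex: ex["margin"], 5): train(pool)
```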
-
I am very interested in the paper Multi-Level Knowledge Distillation for Out-of-Distribution Detection in Text, but when I opened the code link provided in the paper, I found a 404 error. Could you pl…
-
https://wandb.ai/balthazarneveu/geosciences-segmentation
- [x] LR scheduler (plateau); see the sketch after this list
- [x] Validation accuracy
- [x] Validation Metric IoU, dice coeff
- [x] Augmentations (vertical reverse, hor…
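A minimal PyTorch sketch of how the plateau scheduler and the IoU/Dice validation metrics in this checklist typically fit together; the stand-in model and hyperparameters here are mine, not the repo's actual code:

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

def dice_coeff(pred, target, eps=1e-6):
    """Soft Dice on binary masks; pred and target are {0,1} tensors of equal shape."""
    inter = (pred * target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)

def iou(pred, target, eps=1e-6):
    inter = (pred * target).sum()
    union = pred.sum() + target.sum() - inter
    return (inter + eps) / (union + eps)

model = torch.nn.Conv2d(1, 1, kernel_size=3, padding=1)  # stand-in for the real net
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# Drop the LR by half when the validation metric plateaus for 5 epochs.
scheduler = ReduceLROnPlateau(optimizer, mode="max", factor=0.5, patience=5)

# per epoch, after computing val_iou on the validation set:
# scheduler.step(val_iou)
```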
-
Thank you for sharing the code.
I have a small question of my own.
Which networks serve as the teacher and the student in the distillation, respectively?
Transferring network kno…
-
## Summary
Non-autoregressive decoding that iteratively refines the existing translation. The first round is pure NAT; then, starting from that result, the N lowest-confidence tokens are selected and masked, the masked content is re-predicted, and this repeats for X rounds. The recovery of masked tokens works much like BERT. The paper reads smoothly, the approach sounds reasonable, and the experimental section is also thorough.
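A rough sketch of that refinement loop, assuming a hypothetical `model(src, tokens)` that returns per-position token probabilities and a `mask_id` for the mask token; this illustrates the mask-lowest-confidence-then-repredict idea, not the paper's exact implementation:

```python
import torch

def mask_predict(model, src, tgt_len, mask_id, iterations=10):
    """Iterative NAT decoding: fill in masks, then re-mask low-confidence tokens."""
    tokens = torch.full((tgt_len,), mask_id)           # round 1: fully masked (pure NAT)
    conf = torch.zeros(tgt_len)
    for t in range(iterations):
        masked = tokens == mask_id
        probs = model(src, tokens)                     # (tgt_len, vocab) distributions
        new_conf, pred = probs.max(dim=-1)
        tokens = torch.where(masked, pred, tokens)     # only masked slots are rewritten
        conf = torch.where(masked, new_conf, conf)
        if t == iterations - 1:
            break
        # Linear decay: re-mask fewer low-confidence tokens each round.
        n_mask = int(tgt_len * (iterations - 1 - t) / iterations)
        if n_mask > 0:
            worst = conf.topk(n_mask, largest=False).indices
            tokens[worst] = mask_id
    return tokens
```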
## Paper Info
* Author: F…