-
LightHuBERT is only available in the small size, as the lighthubert upstream code shows:
https://github.com/s3prl/s3prl/blob/540a4f86e9f099b240c314a56ff80b069e91d4cd/s3prl/upstream/lighthubert/expert.py#L1…
-
Hello, I have a question about 10-hour ASR fine-tuning in your paper.
Could you give me the procedure for this experiment (or a link I can refer to)?
I just want to conduct my own experiments for…
-
Hi,
I'm trying to reproduce lighthubert_stage1 and lighthubert_small, but I got a large performance gap... Could you please share more details of your training process (such as the lr, scheduler, or loss fun…
-
Hi guys,
Thank you for this amazing repository and all the work that has been done!
I was looking at issue #268 and saw that some models were fine-tuned w.r.t. the learning rate parameter.
Do yo…
-
Hello Mr. Wang!
First of all, I would like to thank you for your work and effort to make it open source.
I've been working on the robustness of SRL models and I'm trying to reproduce the downstrea…
-
On the SUPERB leaderboard, I found that these LightHuBERT models did not report MACs or (1)~(4), and I want to report them with the [official profiling tool](https://superbbenchmark.org/challenge-slt2022/subm…
-
```
python run_downstream.py -m train -n ./frank_test -u hubert -d speech_commands
################################################################################
### WARNING, path does not exist: KAL…
```
-
Hello!
Thanks for the great work!
My colleague @edward0804 and I are thinking about integrating LightHuBERT into S3PRL to enable more research.
Instead of copying all the lighthubert code into S3…