-
-
I'm not sure of the best way to describe this succinctly, but in PyTorch there are modules with a `num_layers` parameter, e.g. `nn.GRU` and `nn.LSTM`. However, the helpers in torch_helpers seem to ass…
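For context, `num_layers` in these modules stacks that many recurrent layers on top of each other; a minimal sketch (the sizes are my own illustration):

```python
import torch
import torch.nn as nn

# num_layers=3 stacks three GRU layers; the returned final hidden state
# has one slice per stacked layer.
gru = nn.GRU(input_size=8, hidden_size=16, num_layers=3, batch_first=True)
x = torch.randn(4, 10, 8)  # (batch, seq_len, input_size)
out, h_n = gru(x)
print(out.shape)  # (4, 10, 16): per-step outputs of the top layer only
print(h_n.shape)  # (3, 4, 16): final hidden state for each of the 3 layers
```

So any helper that assumes a single layer would only see the last slice of `h_n`.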
-
First off, I want to say thank you for this incredible work.
I do have a question regarding the EgoVLPv2 weights and the EgoVLPv2 video features: could you please tell me where I can find them?
-
Hi,
In your paper you mention that you allocated 2, 3, or 4 bits to each layer of the model using a criterion. But in Fig. 1(d): Construct LUT and Query & Add, the binary weights are shown to be 8-bit…
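For what it's worth, one common reason sub-byte codes look 8-bit in diagrams is that the k-bit indices are held in byte-wide storage and dequantized by a LUT query; a sketch under that assumption (the names here are hypothetical, not taken from the paper):

```python
import numpy as np

# Hypothetical LUT-based dequantization: each weight is a k-bit index
# into a per-layer lookup table of centroids. With k=2 there are only
# 2**2 = 4 centroids, but the codes themselves may sit in uint8 words.
def lut_dequantize(codes: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Query the LUT with the stored codes to recover float weights."""
    return lut[codes]

lut = np.array([-1.0, -0.25, 0.25, 1.0])        # 2-bit layer: 4 entries
codes = np.array([0, 3, 1, 2], dtype=np.uint8)  # codes in 8-bit storage
weights = lut_dequantize(codes, lut)            # [-1.0, 1.0, -0.25, 0.25]
```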
-
```
Traceback (most recent call last):
  File "scripts/inference.py", line 100, in <module>
    run()
  File "scripts/inference.py", line 39, in run
    net = HyperInverter(opts)
  File "./models/hyper_inver…
```
-
The density at a discrete point in the mixture should be `Inf`; the CDF is fine.
```r
library(ggdist)
library(distributional)
library(ggplot2)
ggplot(NULL, aes(xdist = dist_mixture(dist_normal(), d…
-
The Leela master weights are no longer available at the Google Drive link.
Do you have any other weights, or just the one in the baduk megapack?
-
Hello, and thank you for the great work on this package. I would like to perform covariate balancing from inverse probability weights constructed with the `ipw` package to perform a marginal structura…
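For reference, the general construction behind such weights (a propensity model, then inverse-probability weights per unit) can be sketched as follows; this is a generic Python illustration on simulated data, not the `ipw` package's API:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated data: two baseline covariates and a binary treatment whose
# assignment depends on them, so the raw groups are unbalanced.
rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=(n, 2))
p_true = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))
a = rng.binomial(1, p_true)

# Fit a propensity-score model, then form inverse-probability weights:
# treated units get 1/e(x), controls get 1/(1 - e(x)).
ps = LogisticRegression().fit(x, a).predict_proba(x)[:, 1]
w = np.where(a == 1, 1 / ps, 1 / (1 - ps))

# After weighting, covariate means should be closer between groups.
treated_mean = np.average(x[a == 1, 0], weights=w[a == 1])
control_mean = np.average(x[a == 0, 0], weights=w[a == 0])
```

These weights would then be passed to the outcome model (e.g. via a `weights` argument) to fit the marginal structural model.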
-
Thank you for your excellent work. I'd like to know whether the model weights can be provided for pre-training.
-
Hello, I have to train an ST model on a very specific task. Unfortunately my dataset is not big enough, so I was thinking of augmenting my data using an LLM. As my task is quite specific and demand…