-
Hi, would it be possible to release a smaller version of the ReID model, or at least the training code so I can train it myself?
I've been trying to reduce the inference time but to no avail.
-
Dear author, I want to express my appreciation for this project. Your work has been incredibly valuable to me.
I learned that you have plans to release smaller v2 versions of the model from REA…
-
Trying to set one of 'vgg16', 'vgg16bn', or 'resnet50' by changing self.bb to 0, 1, or 2, but getting `RuntimeError: Given groups=1, weight of size [64, 3712, 3, 3], expected input[1, 1856, 128, 128] to have…
-
Which smaller models can be used? The weights are too big for Colab, especially the Stable Diffusion one.
-
The inference time is way too high; we should try a much smaller model from Ollama:
* dolphin-phi (3B uncensored dolphin model)
-
Hello,
could we please have 13B and 7B models with the updated architecture that includes grouped-query attention? A lot of people are running these models on machines with low memory and this woul…
-
Right now, all of our explorers fail if the initialization state has infinite target logpotential. When this is not due to coding issues, the most likely cause is that the support of the likelihood is…
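A minimal sketch of the kind of guard that could fail fast at initialization instead of letting every explorer crash later. This assumes a Python codebase; the function name, signature, and error message are hypothetical and not taken from the project's actual API:

```python
import math

def validate_initial_state(log_potential, state):
    """Hypothetical guard: reject initial states whose target
    log-potential is non-finite before any explorer runs.

    `log_potential` is assumed to be a callable returning a float;
    neither name comes from this project's real interface.
    """
    lp = log_potential(state)
    if not math.isfinite(lp):
        raise ValueError(
            f"initial state has non-finite log-potential ({lp}); "
            "the state likely lies outside the support of the likelihood"
        )
    return lp
```

With a valid state, the guard just returns the log-potential, e.g. `validate_initial_state(lambda s: -0.5 * s * s, 2.0)` gives `-2.0`; with a state outside the support it raises a `ValueError` naming the likely cause rather than failing opaquely inside an explorer.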
-
I own a Prusa Mini+, whose build volume of ~180mm x 180mm x 180mm is smaller than the required 210mm.
If this model and guide could be updated to also support a smaller model, I could print and set…
-