-
### Description of the bug
Since updating PrusaSlicer to 2.7 I'm having problems slicing big models. I press "Slice Now", it gets to maybe 20% and then crashes without any message. It works just …
-
```
It would be nice to have bigger models (with more accuracy) and also models for
lowercase text.
```
Original issue reported on code.google.com by `tbr...@gmail.com` on 26 Jan 2011 at 5:37
-
While working on owlv2 training/fine-tuning, I ran into the errors below; everything was run under Google Colab.
```
!rm -rf *
!rm -rf .config
!rm -rf .git
!git clone https://github.com/google-research/scen…
-
# 🚀 Feature request
This is a discussion issue for training/fine-tuning very large transformer models. Recently, model parallelism was added for gpt2 and t5. The current implementation is for PyTor…
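For context on what the thread is discussing: the model parallelism added for gpt2 and t5 in transformers worked by mapping transformer blocks to GPUs via a `device_map` dict. A minimal sketch of building such a map, assuming two visible GPUs and the 12-block GPT-2 small configuration (the `parallelize()` call itself is shown commented out, since it needs CUDA and a transformers version that still ships that legacy API):

```python
# Hypothetical sketch: the device_map that the legacy model.parallelize()
# API for GPT-2/T5 expects is a dict mapping each GPU id to the list of
# transformer block indices it should host.
num_layers = 12   # GPT-2 small has 12 transformer blocks
num_gpus = 2      # assumption for this sketch: two visible GPUs
per_gpu = num_layers // num_gpus

# Split the blocks into consecutive, equal-sized groups, one group per GPU.
device_map = {
    gpu: list(range(gpu * per_gpu, (gpu + 1) * per_gpu))
    for gpu in range(num_gpus)
}
print(device_map)  # {0: [0, 1, 2, 3, 4, 5], 1: [6, 7, 8, 9, 10, 11]}

# On a multi-GPU machine with a transformers version that still has parallelize():
# model = GPT2LMHeadModel.from_pretrained("gpt2")
# model.parallelize(device_map)
```

This places the embedding and the first half of the blocks on GPU 0 and the rest on GPU 1, so activations cross devices only once per forward pass.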
-
Hello, thanks for your work. I have two questions about the default model:
1. It can only recognize English. Where do I need to set it to recognize Chinese? Or do I need to r…
-
When specifying the number of GPUs during inference, is it only for parallelism, or is the model loaded piece-wise across multiple GPUs when it is bigger than any single GPU? For example, I'd like to use X…
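The "loaded piece-wise" behavior the question asks about is typically a layer-to-device placement pass, as done by e.g. Hugging Face Accelerate's `device_map="auto"`. A minimal sketch of the idea, with all sizes made up for illustration (this is not the actual placement algorithm of any particular library):

```python
# Hypothetical sketch of piece-wise (layer-wise) model placement:
# greedily fill each GPU with consecutive layers until its memory
# budget is exhausted, then spill onto the next GPU.
def assign_layers(layer_sizes_gb, gpu_capacity_gb, num_gpus):
    """Return a dict mapping layer index -> GPU index."""
    device_map = {}
    gpu, used = 0, 0.0
    for i, size in enumerate(layer_sizes_gb):
        if used + size > gpu_capacity_gb and gpu < num_gpus - 1:
            gpu += 1     # current GPU is full; move on to the next one
            used = 0.0
        device_map[i] = gpu
        used += size
    return device_map

# A 40 GB model (8 layers x 5 GB) spread over three 16 GB GPUs:
print(assign_layers([5] * 8, 16, 3))
# {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1, 6: 2, 7: 2}
```

Under this scheme the GPU count is not just for parallelism: a model larger than one GPU's memory is split layer-wise, with each forward pass moving activations from device to device.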