-
`Couldn't find activation function mish, going with ReLU`
I've been getting this error whenever I try to run the new YOLOv4.
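For context on why a silent fallback to ReLU matters: Mish is smooth and passes a small negative signal for x < 0, so the two activations produce genuinely different outputs. A plain-Python sketch of the textbook definitions (not darknet's implementation):

```python
import math

def relu(x):
    return max(0.0, x)

def mish(x):
    # Mish(x) = x * tanh(softplus(x)), softplus(x) = ln(1 + e^x)
    softplus = math.log1p(math.exp(-abs(x))) + max(x, 0.0)  # stable form
    return x * math.tanh(softplus)

# Unlike ReLU, Mish is non-zero for negative inputs.
for x in (-2.0, -0.5, 0.5, 2.0):
    print(f"x={x:+.1f}  mish={mish(x):+.4f}  relu={relu(x):+.4f}")
```

A network trained with Mish and run with ReLU will therefore not reproduce its trained accuracy.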
-
@AlexeyAB
-
You could increase SMIM accuracy by using Ranger, which combines state-of-the-art optimizers with gradient centralization:
https://github.com/lessw2020/Ranger-Deep-Learning-Optimizer
Orthogonally, you …
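For reference, the gradient-centralization ingredient of Ranger simply re-centers each weight gradient before the optimizer step. A minimal plain-Python sketch, treating a rank-2 gradient as a list of rows (the function name here is illustrative, not Ranger's actual API):

```python
def centralize_gradient(grad):
    # grad: a rank-2 weight gradient as a list of rows.
    # Gradient centralization subtracts each row's mean,
    # so every row of the returned gradient sums to zero.
    out = []
    for row in grad:
        mean = sum(row) / len(row)
        out.append([g - mean for g in row])
    return out
```

In Ranger this is applied to gradients of weights with rank >= 2 (conv and linear layers), not to biases.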
-
https://en.wikipedia.org/wiki/Painter%27s_algorithm
Would be nice to see a video on Painter's Algorithm, one of the ways in which we determine which surfaces are visible and which are hidden to the…
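As a quick illustration of the idea: the painter's algorithm sorts surfaces by depth and draws them back to front, so nearer surfaces overwrite farther ones. A toy sketch on a one-row grid (all names made up for illustration; a real renderer would rasterize polygons):

```python
def paint(polygons, width, height):
    # polygons: (depth, color, cells) tuples; larger depth = farther away.
    canvas = [[None] * width for _ in range(height)]
    # Draw farthest-first, letting nearer polygons overpaint.
    for depth, color, cells in sorted(polygons, key=lambda p: -p[0]):
        for x, y in cells:
            canvas[y][x] = color
    return canvas

polys = [
    (5.0, "B", [(0, 0), (1, 0)]),  # far polygon
    (1.0, "A", [(1, 0), (2, 0)]),  # near polygon, overlaps at (1, 0)
]
print(paint(polys, 3, 1)[0])  # near polygon wins where they overlap
```

Its classic failure cases (cyclically overlapping or intersecting polygons) are what motivate depth buffers.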
-
Hi everyone,
**Proposal:**
I would like to propose the addition of several other activation functions to the framework:
- [ ] LeakyReLU
- [ ] PReLU
- [ ] ReLU6
- [ ] Tanh
- [ ] Softplus
…
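For reference, the proposed functions are all simple elementwise formulas. A plain-Python sketch of their textbook scalar definitions (not the framework's API; PReLU's `alpha` is a learned parameter in practice):

```python
import math

def leaky_relu(x, alpha=0.01):
    # Small fixed slope for negative inputs.
    return x if x > 0 else alpha * x

def prelu(x, alpha):
    # Same shape as LeakyReLU, but alpha is learned per channel.
    return x if x > 0 else alpha * x

def relu6(x):
    # ReLU clipped at 6, common in mobile architectures.
    return min(max(0.0, x), 6.0)

def tanh(x):
    return math.tanh(x)

def softplus(x):
    # ln(1 + e^x), written in a numerically stable form.
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)
```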
-
I have successfully trained a model on a Google Colab runtime, but when I try to export it, I get an error about `MishCudaFunction()`.
This is what I have done:
- `pip install torch==1.7.1+cu101…
-
When I run darknet_video.py with:
`./darknet detector demo data/obj.data cfg/yolov4-tiny-mish-custom.cfg myResults/yolov4-tiny-mish-custom_best.weights data/test50.mp4`
The video is too fast and fin…
-
Hi there,
I would like to experiment with deploying YOLOv4 on an embedded Linux device with an NPU.
Unfortunately, the model conversion tool does not support the mish activation function and turns it into …
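One common workaround, assuming the converter does support softplus, tanh, and multiply: replace the single unsupported mish node with an equivalent three-op subgraph, since mish(x) = x · tanh(softplus(x)). A sketch that also sanity-checks the substitution against the direct definition:

```python
import math

def softplus(x):
    # Numerically stable ln(1 + e^x).
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish_via_supported_ops(x):
    # Three ops most converters handle: softplus -> tanh -> multiply.
    return x * math.tanh(softplus(x))

# Verify the substitution matches the direct definition before
# committing to the converted graph.
for x in (-4.0, -1.0, 0.0, 1.0, 4.0):
    ref = x * math.tanh(math.log(1.0 + math.exp(x)))
    assert abs(mish_via_supported_ops(x) - ref) < 1e-9
```

Whether this helps depends on which primitive ops your particular NPU toolchain accepts.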
-
### 🐛 Describe the bug
RuntimeError occurs during `torch.allclose` assertion, indicating a size mismatch between tensors at dimension 3 (size 9 vs. size 3) in a custom model's forward method. The err…
-
"new networks with attention policy head (such as T79 or ap-mish networks)". The new networks T78, T79, and T80 don't work for me, but the older ones work fine.