-
Note that the mathematically correct interval should be
```
:math:`(-\infty,0)`
```
https://github.com/jax-ml/jax/blob/6790b90f91b0fbb8d818de8878f99eb3c4a871c2/jax/_src/nn/functions.py#L505
…
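For reference, a quick numerical check (a standalone sketch, not JAX's implementation) shows why the open interval :math:`(-\infty, 0)` is right: log(sigmoid(x)) is strictly negative for every finite x and only approaches 0 in the limit.

```python
import math

def log_sigmoid(x: float) -> float:
    # Numerically stable log(sigmoid(x)) = -log(1 + exp(-x)) = -log1p(exp(-x)).
    # For very negative x, exp(-x) would overflow, so use the equivalent
    # x - log1p(exp(x)) on that branch.
    if x >= 0:
        return -math.log1p(math.exp(-x))
    return x - math.log1p(math.exp(x))

# Strictly negative for all finite inputs: the interval is open at 0.
for x in (-50.0, -1.0, 0.0, 1.0, 50.0):
    assert log_sigmoid(x) < 0.0
```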
-
Is there a particular reason why softmax activation functions are not supported? I understand they can be useful for a resource allocation scenario. As far as I can tell, adding it only requires a slig…
-
Hi, when running LayoutLM for document classification, how can I get the softmax output with the probability of each class, instead of just the name of the class with the highest probability? This is the c…
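In the Transformers API, the classification head returns raw logits (e.g. `outputs.logits`); applying a softmax over them yields one probability per class rather than just the argmax label. A minimal sketch of that last step (the logit values below are made up):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical per-class logits from a document classifier.
logits = [2.0, 0.5, -1.0]
probs = softmax(logits)
assert abs(sum(probs) - 1.0) < 1e-9
assert probs.index(max(probs)) == 0  # argmax is unchanged by softmax
```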
-
https://github.com/collabora/WhisperSpeech/blob/80b268b74900b2f7ca7a36a3c789607a3f4cd912/whisperspeech/vq_stoks.py#L344
Why use log softmax on the model logits, but softmax on the teacher logits?
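One common reason (a general convention, not a claim about this repo's specific intent): KL-divergence losses such as PyTorch's `F.kl_div` expect the student input as log-probabilities and the teacher target as probabilities, i.e. KL(p‖q) = Σ p·(log p − log q), and `log_softmax` on the student side is more numerically stable than `log(softmax(...))`. A plain-Python sketch of that pattern:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def log_softmax(xs):
    # log_softmax(x) = x - max(x) - log(sum(exp(x - max(x)))): stable, avoids log(0).
    m = max(xs)
    lse = math.log(sum(math.exp(x - m) for x in xs))
    return [x - m - lse for x in xs]

def kl_div(teacher_logits, student_logits):
    # KL(teacher || student): teacher as probabilities, student as log-probabilities.
    p = softmax(teacher_logits)
    log_q = log_softmax(student_logits)
    return sum(pi * (math.log(pi) - lqi) for pi, lqi in zip(p, log_q))

loss = kl_div([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])
assert loss >= 0.0  # KL divergence is non-negative
```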
-
flashdecoding++ paper: https://arxiv.org/abs/2311.01282
- Q3 Collaboration Plan of Infra and IaaS Labs: https://bytedance.us.larkoffice.com/docx/HKXfdRh1noMrbAxcgL2ureGasdQ
- FlashDecoding++ Su…
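A key observation in the FlashDecoding++ paper is that softmax is shift-invariant: subtracting any fixed constant (a "unified max") instead of the true per-row max yields the same probabilities, which removes the synchronization needed to compute the row max, as long as the shifted exponents stay in range. A toy check of the invariance (pure Python, not the paper's kernel):

```python
import math

def softmax_shifted(xs, shift):
    # Softmax is invariant to subtracting any constant before exponentiating,
    # provided exp(x - shift) neither overflows nor underflows to 0.
    exps = [math.exp(x - shift) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

xs = [1.0, 3.5, 2.0, 0.5]
exact = softmax_shifted(xs, max(xs))   # conventional per-row max
unified = softmax_shifted(xs, 4.0)     # fixed "unified max", no reduction needed
assert all(abs(a - b) < 1e-12 for a, b in zip(exact, unified))
```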
-
Hello, Professor. I'm opening this issue out of simple curiosity, even though it's exam day.
While looking at the formula for InfoNCE Loss, you mentioned that tau was set to 1 for simplici…
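For context, tau in InfoNCE scales the similarity scores before the softmax: L = −log(exp(s_pos/τ) / Σᵢ exp(sᵢ/τ)), so setting τ = 1 simply drops the division. A small sketch (the similarity scores below are made up; index 0 is the positive pair):

```python
import math

def info_nce(scores, tau=1.0, positive=0):
    # InfoNCE = cross-entropy over temperature-scaled similarities,
    # with the positive pair as the correct "class".
    scaled = [s / tau for s in scores]
    m = max(scaled)
    lse = m + math.log(sum(math.exp(s - m) for s in scaled))
    return -(scaled[positive] - lse)

scores = [0.9, 0.2, 0.1, -0.3]  # made-up similarities; scores[0] is the positive
assert info_nce(scores, tau=1.0) > 0.0
# A smaller tau sharpens the distribution, lowering the loss when the positive leads.
assert info_nce(scores, tau=0.1) < info_nce(scores, tau=1.0)
```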
-
`ZE_FLAT_DEVICE_HIERARCHY=FLAT`
![softmax-performance](https://github.com/user-attachments/assets/a108a666-f100-4ad2-b70c-12f4e5709ab2)
`ZE_FLAT_DEVICE_HIERARCHY=COMPOSITE`
![softmax-performance]…
-
### Search before asking
- [X] I have searched the YOLOv8 [issues](https://github.com/ultralytics/ultralytics/issues) and found no similar feature requests.
### Description
In the results object f…
-
# Bug Report
### Describe the bug
I use the quantize_static and convert_float_to_float16 functions in onnx to convert an fp32 model to an fp16 + int8 model. The fp16 model can run inference through onnxruntime …
Ai-ZL updated
1 month ago