-
I need to use tristate optionals for my mutations and ran into invalid code generation when enabling them.
I recreated the issue in a [simple project](https://github.com/thomas7D/ferry_tristate_bug).…
-
I've tried to use this package, but it shows different symbols for repeated footnotes. Image:
![image](https://user-images.githubusercontent.com/5805111/216783428-0a827a66-ff40-4901-8216-b5…
-
The documentation in README.md for subscribing to events differs from the implementation.
README.md suggests:
`// Register callback
MusicControls.subscribe(events);`
index.d.ts documents `@usage` as:
…
-
Hello, I am wondering whether the inference code can run at mixed precision.
For example, if I modify `run_pretrained_openfold.py` line 254 from
`out = run_model(model, processed_feature_dict, tag, …
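In PyTorch, one way to experiment with this is to wrap the forward pass in an autocast context. The sketch below is a generic illustration only: the toy model and tensors are placeholders, not OpenFold's `run_model`, and whether OpenFold's kernels stay numerically stable under reduced precision is exactly the open question here.

```python
import torch

# Generic sketch of mixed-precision inference with torch.autocast.
# The toy model below is a placeholder, not OpenFold's run_model.
model = torch.nn.Sequential(torch.nn.Linear(16, 32), torch.nn.ReLU(),
                            torch.nn.Linear(32, 4))
x = torch.randn(8, 16)

device_type = "cuda" if torch.cuda.is_available() else "cpu"
with torch.no_grad(), torch.autocast(device_type=device_type):
    out = model(x)  # matmuls run in reduced precision where supported
print(out.shape)  # torch.Size([8, 4])
```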
-
This proposal follows BigInt in TypeErrors on mixed operands, e.g., BigInt + BigDecimal -> TypeError. Although it is possible to losslessly convert any Number or BigInt to a BigDecimal, the strictness…
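For comparison, Python's `decimal` module takes the same strict stance on lossy mixing: `Decimal` arithmetic accepts `int` operands (which convert losslessly) but raises `TypeError` when combined with a binary `float`. A quick illustration:

```python
from decimal import Decimal

d = Decimal("1.10")
print(d + 1)  # ints convert losslessly -> 2.10

try:
    d + 1.5  # mixing with binary floats raises, mirroring the proposal
except TypeError as err:
    print("TypeError:", err)
```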
-
Hi,
my general question is whether there is a difference in how training is set up depending on the type of thing being trained.
I am trying to train a style (old method) and multiple subjects (then took the …
-
It seems that JAX isn't doing the bf16 conversion just-in-time. Currently in Levanter, we do something like this:
```python
def loss(m, x):
    m = convert(m, bf16)  # produces a sharded bf16 model
    …
```
dlwh updated 8 months ago
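A minimal sketch of that pattern, with an illustrative `convert` (not Levanter's actual implementation) that casts only the floating-point leaves of the parameter pytree, applied inside the jitted function:

```python
import jax
import jax.numpy as jnp

def convert(params, dtype):
    # Cast floating leaves only; integer state (e.g. step counters) is left alone.
    # Illustrative helper, not Levanter's actual API.
    return jax.tree_util.tree_map(
        lambda x: x.astype(dtype) if jnp.issubdtype(x.dtype, jnp.floating) else x,
        params,
    )

@jax.jit
def loss(params, x):
    p = convert(params, jnp.bfloat16)  # cast inside jit so XLA can fuse it
    return jnp.sum(p["w"] @ x)

params = {"w": jnp.ones((2, 3), jnp.float32), "step": jnp.array(0, jnp.int32)}
```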
-
When exporting RDF from a metadata record, non-RDF XML apparently ends up inside the RDF.
This happens in multiple places.
I suspect a bug in the XSLT here.
-
Hi, I'm trying out Grokfast in an LLM scenario. Mixed-precision training is a commonly used technique to reduce GPU memory usage and speed up training. The following code is an example of FP16 training.
…
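Since the snippet above is truncated, here is a generic AMP-style training step for context (the model and optimizer are placeholders, not Grokfast's code); a Grokfast-style filter on `p.grad` would naturally sit between unscaling and the optimizer step:

```python
import torch

# Generic AMP training step; placeholders, not Grokfast's actual example.
model = torch.nn.Linear(8, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
use_cuda = torch.cuda.is_available()
scaler = torch.cuda.amp.GradScaler(enabled=use_cuda)  # loss scaling matters for FP16

x, y = torch.randn(16, 8), torch.randn(16, 1)
with torch.autocast(device_type="cuda" if use_cuda else "cpu"):
    loss = torch.nn.functional.mse_loss(model(x), y)

scaler.scale(loss).backward()  # scale to avoid FP16 gradient underflow
scaler.unscale_(opt)           # a Grokfast-style filter on p.grad would go here
scaler.step(opt)
scaler.update()
opt.zero_grad()
```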
-
Hi,
while looking into the `pkcs7` crate for a CMS use case in the Apple world, I discovered that their detached signatures use BER indefinite-length encoding for some of the `SEQUENCE`s, which…
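For readers unfamiliar with the distinction: in BER, a first length octet of `0x80` means indefinite length, with the contents terminated by two end-of-contents zero octets, whereas DER requires the definite-length form. A small byte-level illustration (in Python rather than Rust, purely to show the encoding difference):

```python
# BER indefinite-length vs. DER definite-length encoding of the same value:
# a SEQUENCE containing BOOLEAN TRUE.
ber = bytes([0x30, 0x80,         # SEQUENCE, indefinite length
             0x01, 0x01, 0xFF,   # BOOLEAN TRUE
             0x00, 0x00])        # end-of-contents octets
der = bytes([0x30, 0x03,         # SEQUENCE, definite length 3
             0x01, 0x01, 0xFF])  # BOOLEAN TRUE

def is_indefinite_length(tlv: bytes) -> bool:
    # tlv[0] is the tag; a first length octet of 0x80 marks the indefinite form
    return len(tlv) >= 2 and tlv[1] == 0x80

print(is_indefinite_length(ber), is_indefinite_length(der))  # True False
```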