-
I don't quite understand the metric: why is the depth map multiplied by 1000 and normalized when the data is loaded, while MAE and MSE are converted to "cm" at the end? This seems to be different from the calculati…
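For reference, a minimal sketch of how such a pipeline is often wired up; the scale factor, cap, and function names below are my assumptions, not the repository's actual code.

```python
import numpy as np

# Hypothetical loader: depth stored in meters, scaled to millimeters and
# normalized for training (mirrors the *1000 mentioned above).
def load_depth(depth_m, max_depth_mm=10000.0):
    depth_mm = depth_m * 1000.0          # meters -> millimeters
    return depth_mm / max_depth_mm       # normalize to [0, 1] for the network

# Hypothetical evaluation: undo the normalization, then report the error in cm.
def mae_cm(pred_norm, gt_norm, max_depth_mm=10000.0):
    pred_mm = pred_norm * max_depth_mm
    gt_mm = gt_norm * max_depth_mm
    return np.mean(np.abs(pred_mm - gt_mm)) / 10.0   # millimeters -> centimeters
```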
-
# MOON
* **Title:** Model-Contrastive Federated Learning
* **Venue:** CVPR 2021
* **Link to paper:** https://openaccess.thecvf.com/content/CVPR2021/html/Li_Model-Contrastive_Federated_Learning_CVPR_2…
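For context, a minimal sketch of the model-contrastive loss the paper proposes, using the current local model's representation, the global model's representation as the positive, and the previous local model's representation as the negative; the function and variable names are mine.

```python
import torch
import torch.nn.functional as F

def model_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """MOON-style contrastive term: pull the local representation toward the
    global model's representation and away from the previous local model's."""
    pos = F.cosine_similarity(z_local, z_global, dim=-1) / tau
    neg = F.cosine_similarity(z_local, z_prev, dim=-1) / tau
    logits = torch.stack([pos, neg], dim=1)   # the positive pair is class 0
    labels = torch.zeros(z_local.size(0), dtype=torch.long, device=z_local.device)
    return F.cross_entropy(logits, labels)
```

In the paper, each client's objective combines this term with the usual supervised cross-entropy loss, weighted by a coefficient mu.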
-
### Please check that this issue hasn't been reported before.
- [X] I searched previous [Bug Reports](https://github.com/OpenAccess-AI-Collective/axolotl/labels/bug) and didn't find any similar reports.
…
-
bag3++ claims to be fully open source, but it looks like this header file is closed source.
-
This issue is a placeholder for the whole branch of the workflow that will occur after we've identified the versions and permissions for all of a publication's files and determined that we don't have …
-
Proposal: the code has been written to accept any Prompter. We should make this configurable via a cfg option or kwarg.
https://github.com/OpenAccess-AI-Collective/axolotl/blob/bbfc333a0136bfcf3f21299…
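A minimal sketch of what that could look like, assuming a cfg key such as `prompter` holding a dotted class path; the key name and helper are hypothetical, not existing axolotl API.

```python
import importlib

def resolve_prompter(cfg, default_cls=None, **kwargs):
    """Instantiate a Prompter from a dotted path in the config.

    `cfg["prompter"]` is a hypothetical key, e.g. "my_pkg.prompters.MyPrompter".
    Falls back to `default_cls` when the key is absent.
    """
    path = cfg.get("prompter")
    if not path:
        return default_cls(**kwargs) if default_cls else None
    module_name, _, class_name = path.rpartition(".")
    prompter_cls = getattr(importlib.import_module(module_name), class_name)
    return prompter_cls(**kwargs)
```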
-
-
### Please check that this issue hasn't been reported before.
- [X] I searched previous [Bug Reports](https://github.com/OpenAccess-AI-Collective/axolotl/labels/bug) and didn't find any similar reports…
-
The paper says: "The proof of the initialization is provided in the supplementary material.", but I didn't find it there.
-
Hello,
As we discussed on Discord, pretraining is usually done on large chunks of text (books, entire web pages, articles) that are larger than the context size. With this in mind, I propose that we…
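Since the proposal is cut off here, the following is only a guess at the direction: a minimal sketch of packing long pretraining documents into fixed context-length blocks (the names and block size are illustrative, not the proposed implementation).

```python
def pack_into_blocks(tokenized_docs, block_size=2048):
    """Concatenate tokenized documents and yield fixed-size blocks.

    `tokenized_docs` is an iterable of token-id lists; `block_size` stands in
    for the model's context length. Leftover tokens shorter than one block are dropped.
    """
    buffer = []
    for ids in tokenized_docs:
        buffer.extend(ids)
        while len(buffer) >= block_size:
            yield buffer[:block_size]
            buffer = buffer[block_size:]
```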