-
# What is JAX (just in case, for other readers)
[jax](https://github.com/google/jax) offers full NumPy acceleration and autodiff together with functional neural networks, and is essentially autograd 2.0. (from [Ita…
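
A minimal sketch of what that means in practice, with a made-up function and data purely for illustration: `jax.grad` gives autograd-style derivatives of plain NumPy-style code, and `jax.jit` compiles it for acceleration.

```python
import jax
import jax.numpy as jnp

# An ordinary numpy-style function; the names and data are illustrative only.
def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

grad_loss = jax.grad(loss)      # autodiff, as in autograd
fast_grad = jax.jit(grad_loss)  # XLA compilation for the "acceleration" part

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([1.0, 2.0])
print(fast_grad(w, x, y))       # gradient of the loss w.r.t. w
```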
-
**Describe the bug**
Since you can only add new names and emails and cannot delete old ones if they are still in use in some of your papers, and since these are also revealed externally (all names an…
-
-
All feedback is important to us for improving our project. We appreciate the time the instructor, TAs, and peer reviewers took to provide this valuable feedback.
Below we put together comments from reviewe…
-
I wanted to know the expected test perplexity for the lm1b base model. It would be especially great if you could upload the training log file. I wanted to include the results in my ICML pa…
-
-
This is a free recruitment service for programmers.
If your team is hiring, you are welcome to post your job listing in this thread. Please briefly describe the position title, work location, requirements, a short team introduction, contact information, and so on.
**Note: if the same team is hiring for multiple positions, please list them together in a single post rather than splitting them into separate entries.**
Readers may ask questions, but please do not post anything unrelated to recruiting; comments or complaints about companies or positions are not allowed. If you want to apply, contact the poster directly.
Posts from agencies and headhunters are not accepted; violators will be blocked.
…
-
It would be good to add https://sites.google.com/view/icml-2022-big-model/ to the website, because this good resource is hard to find otherwise.
Ideally, I would like to put it here (
)
but I am not sure how to …
-
Is there a scheduled workshop at ICML for this, where the paper will be released and discussed?
-
This is a set of edits we need to make in order to properly benchmark the OCaml code - e.g. inserting timing checks in the right places, logging intermediate results, etc.
We'll be using this f…