-
* [deep compression](https://arxiv.org/pdf/1605.07678.pdf): pruning and quantization, 35x reduction
* [squeezenet](https://arxiv.org/pdf/1602.07360.pdf): its [openreview](https://openreview.net/foru…
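The prune-then-quantize pipeline behind the deep-compression line of work can be illustrated on a toy weight vector. This is a minimal sketch, not the paper's exact method: the threshold and number of quantization levels here are made up for illustration, and real pipelines prune iteratively with retraining and use weight-sharing codebooks.

```python
# Toy sketch: magnitude pruning followed by uniform quantization.
# Threshold and level count are illustrative, not from the paper.

def prune(weights, threshold):
    """Zero out weights whose magnitude falls below the threshold."""
    return [w if abs(w) >= threshold else 0.0 for w in weights]

def quantize(weights, levels):
    """Uniformly quantize the surviving (nonzero) weights to `levels` values."""
    nonzero = [w for w in weights if w != 0.0]
    if not nonzero:
        return weights
    lo, hi = min(nonzero), max(nonzero)
    step = (hi - lo) / (levels - 1) if hi > lo else 1.0
    return [0.0 if w == 0.0 else lo + round((w - lo) / step) * step
            for w in weights]

weights = [0.9, -0.02, 0.45, 0.01, -0.6, 0.3]
sparse = prune(weights, threshold=0.05)   # small-magnitude weights removed
compact = quantize(sparse, levels=4)      # survivors share at most 4 values
```

The compression comes from storing the sparse matrix in an indexed format and the quantized weights as small codebook indices rather than full floats.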
-
Is there a process for training this engine to play custom variants not included in the python-chess module or in Stockfish's supported variants? I'm looking to write a variant where the only rule mo…
-
Both tool participants and outsiders such as industry partners can propose benchmarks. All benchmarks must be in .onnx format and use .vnnlib specifications, as was done last year. Each benchmark must…
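For reference, a .vnnlib property file is a small SMT-LIB-style text file over input variables `X_i` and output variables `Y_i`. The sketch below generates a minimal one; the bounds and the single-output property are illustrative only and not taken from any actual benchmark, and real competition specs typically encode the property to be disproved rather than the desired behavior.

```python
# Sketch: emit a minimal VNN-LIB property. Inputs X_i are box-bounded and a
# single output constraint on Y_0 is asserted. All numbers are illustrative.

def make_vnnlib(input_bounds, output_lower):
    """input_bounds: list of (lo, hi) per input; asserts Y_0 >= output_lower."""
    lines = []
    for i, _ in enumerate(input_bounds):
        lines.append(f"(declare-const X_{i} Real)")
    lines.append("(declare-const Y_0 Real)")
    for i, (lo, hi) in enumerate(input_bounds):
        lines.append(f"(assert (>= X_{i} {lo}))")
        lines.append(f"(assert (<= X_{i} {hi}))")
    lines.append(f"(assert (>= Y_0 {output_lower}))")
    return "\n".join(lines)

spec = make_vnnlib([(0.0, 1.0), (-1.0, 1.0)], output_lower=0.5)
```

The generated text can be written to a `.vnnlib` file and paired with the corresponding `.onnx` network to form a benchmark instance.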
-
Post a link for a "possibility" reading of your own on the topic of Network & Table Learning [for week 6], accompanied by a 300-400 word reflection that: 1) briefly summarizes the article (e.g., as we…
lkcao updated
2 years ago
-
The paper "Likelihood-free parameter estimation with neural Bayes estimators" (Sainsbury-Dale, Zammit-Mangion, & Huser, 2023) enables neural amortized *point* estimation, which is generally faster tha…
-
Used the following PDF: https://arxiv.org/pdf/1706.03762
The result looks OK; however, the order of pages is incorrect.
Setup
```
git clone ...
pip install -e .
```
```python
import asyn…
-
We want to compare the performance of a deblender, for instance scarlet, on parametric light profiles and light profiles generated using our model, hopefully demonstrating that the results from the ge…
EiffL updated
5 years ago
-
[GoNN](https://github.com/lightvector/GoNN) is a sandbox for agents similar to the Leela projects. Notable ideas tested in GoNN, at the time of creating this issue, are the following:
| "Cosme…
-
Hi,
@VainF Thank you very much for this project, great work!
I was wondering if you are planning on adding support for conv layers with an arbitrary `groups` parameter (currently there is only suppor…
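The constraint that makes grouped convs tricky to prune is that every group must keep the same number of output channels, or the layer becomes unbalanced. A minimal sketch of that consistency check, independent of any particular pruning library (these function names are made up for illustration, not the project's API):

```python
# Sketch: validate a set of pruned output-channel indices for a grouped conv.
# Each of `groups` groups owns an equal contiguous slice of output channels;
# pruning must remove the same count from every group to keep them balanced.
# Names are illustrative, not the Torch-Pruning API.

def pruned_per_group(out_channels, groups, pruned_idxs):
    """Return how many channels would be pruned from each group."""
    assert out_channels % groups == 0
    size = out_channels // groups
    counts = [0] * groups
    for idx in pruned_idxs:
        counts[idx // size] += 1
    return counts

def is_valid_group_pruning(out_channels, groups, pruned_idxs):
    """Valid iff every group loses the same number of channels."""
    counts = pruned_per_group(out_channels, groups, pruned_idxs)
    return len(set(counts)) == 1

# 8 output channels, 2 groups: one channel removed per group is balanced,
is_valid_group_pruning(8, 2, [1, 5])
# but removing two from the first group only would leave the groups uneven.
is_valid_group_pruning(8, 2, [1, 2])
```

A pruner supporting arbitrary `groups` would have to enforce this per-group balance (and, for depthwise convs, tie input-channel pruning to output-channel pruning as well).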
-
* paper: https://arxiv.org/abs/1711.06368
* +5 AP for the small model and +2.8 AP for the big model
* Implements a convolutional Bottleneck-LSTM that gives +3-5 AP at very low cost (~+1% BFLOPS)
![image](https://u…