-
https://arxiv.org/pdf/1705.10667.pdf
Adversarial learning has been successfully embedded into deep networks to learn transferable features for domain adaptation, which reduce distribution discrepan…
leo-p updated 7 years ago
-
### Module 1: Neural Networks
The files to be translated for each lecture note are as follows.
- Image Classification: Data-driven Approach, k-Nearest Neighbor, train/val/test splits
- classification.md
- Linear classification: S…
-
Recently, there have been two papers from [Ilya Loshchilov](http://ml.informatik.uni-freiburg.de/people/loshchilov/index.html) and [Frank Hutter](http://www2.informatik.uni-freiburg.de/~hutter/).
[SGDR: St…
-
Using CNTK BrainScript, I am training a feedforward DNN with one output node under the mean square error (MSE) criterion. The "SquareError" block is used in NDL. When calculating stochastic gradient descent, I wou…
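The truncated question concerns the gradient that SGD uses under a squared-error criterion. As a hedged sketch in plain Python (not CNTK/NDL, whose "SquareError" block may apply a different scaling), the gradient of the per-sample squared error E = (y - t)^2 with respect to the network output y is 2(y - t):

```python
# Sketch: squared-error criterion and its gradient for a single-output network.
# Illustrative only -- not the actual CNTK "SquareError" implementation.

def square_error(y, t):
    """Per-sample squared error E = (y - t)**2."""
    return (y - t) ** 2

def square_error_grad(y, t):
    """dE/dy = 2 * (y - t); SGD backpropagates this through the network."""
    return 2.0 * (y - t)

# Example: prediction 0.8 against target 1.0.
y, t = 0.8, 1.0
print(square_error(y, t))       # approx. 0.04
print(square_error_grad(y, t))  # approx. -0.4, pushing y toward t
```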
-
In the optimizer, parameter naming is not standardized because the namedtuple fields and the class constructor arguments use different names. There should be an easier way to create parameters.
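As a hypothetical illustration of the mismatch described above (all names below are assumptions, not taken from the project), a small factory function is one possible "easier way" to create parameters without exposing the inconsistent naming:

```python
from collections import namedtuple

# Hypothetical: hyperparameters stored in a namedtuple whose field names
# (lr, mom) differ from the optimizer class's constructor arguments.
SGDParams = namedtuple("SGDParams", ["lr", "mom"])

class SGD:
    def __init__(self, learning_rate, momentum):
        self.learning_rate = learning_rate
        self.momentum = momentum

def make_sgd(params: SGDParams) -> SGD:
    """Factory hiding the namedtuple-vs-constructor naming mismatch."""
    return SGD(learning_rate=params.lr, momentum=params.mom)

opt = make_sgd(SGDParams(lr=0.01, mom=0.9))
print(opt.learning_rate, opt.momentum)  # 0.01 0.9
```

Standardizing the field names themselves would remove the need for the mapping entirely; the factory is just the least invasive fix.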
-
I'm getting the error "Invalid labels: target data is not either 0.0 or 1.0", which implies to me that only classification is supported.
devyn updated 8 years ago
-
I think an updated README will help show how the different components are connected, even if the structure of the project might change. I'll volunteer to do this since it will help me get a better sense…
-
Explanation of cost and loss functions:
https://medium.com/@vinodhb95/what-is-loss-in-neural-nets-is-cost-function-and-loss-function-are-same-ef069a570e95
Explanation of Linear regression regarding…
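The linked article distinguishes the loss (computed per example) from the cost (an aggregate over the dataset). A minimal sketch of that distinction, using squared error as the per-example loss:

```python
# Sketch: "loss" is per example; "cost" averages the losses over a dataset.

def loss(y_pred, y_true):
    """Per-example squared-error loss."""
    return (y_pred - y_true) ** 2

def cost(preds, targets):
    """Cost = mean of per-example losses over the whole dataset."""
    return sum(loss(p, t) for p, t in zip(preds, targets)) / len(preds)

preds = [0.9, 0.2, 0.6]
targets = [1.0, 0.0, 1.0]
print([loss(p, t) for p, t in zip(preds, targets)])  # three per-example losses
print(cost(preds, targets))                          # one scalar for the dataset
```

With this convention, gradient descent minimizes the cost, while each SGD step estimates its gradient from the loss of one example or minibatch.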
-
This issue proposes creating delayed `pyro.param` via `log_joint` and then using `funsor.adam.Adam` to optimize parameters. This would allow writing the optimization part in `minipyro` in a way that is mor…
-
depends on https://github.com/Daniel-Mietchen/ideas/issues/640 and https://github.com/Daniel-Mietchen/ideas/issues/641