-
Hello, we are trying to check the Bose-Hubbard model for some trivial cases using the NetKet implementation and the example code provided. For example, for zero hopping and just one boson we e…
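For a trivial case like this, the expected result can be checked by hand: with zero hopping the Bose-Hubbard Hamiltonian (at zero chemical potential) is diagonal in the Fock basis with energy (U/2) Σᵢ nᵢ(nᵢ−1), so a single boson always gives energy 0. A minimal NumPy sketch of that independent check (not NetKet code; function name is just for illustration):

```python
import itertools
import numpy as np

def bose_hubbard_zero_hopping_energies(n_sites, n_bosons, U):
    """Diagonal energies of the Bose-Hubbard model with J = 0 (and mu = 0):
    E = (U/2) * sum_i n_i (n_i - 1), over all Fock states with
    n_bosons particles distributed on n_sites sites."""
    energies = []
    for occ in itertools.product(range(n_bosons + 1), repeat=n_sites):
        if sum(occ) == n_bosons:
            e = 0.5 * U * sum(n * (n - 1) for n in occ)
            energies.append(e)
    return np.array(energies)

# With a single boson every on-site interaction term n(n-1) vanishes,
# so the whole spectrum at J = 0 collapses to zero:
e = bose_hubbard_zero_hopping_energies(n_sites=3, n_bosons=1, U=4.0)
print(e)  # -> [0. 0. 0.]
```

Any ground-state energy NetKet reports for this setting should therefore converge to 0.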
miedc updated
4 years ago
-
Hi, I have two questions about your work:
1. What is the difference between OWM and the [GEM algorithm](http://papers.nips.cc/paper/7225-gradient-episodic-memory-for-continual-learning.pdf)? GEM projects…
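For context on the GEM side of the comparison (an illustrative sketch, not the GEM authors' code): with a single episodic-memory gradient, GEM's quadratic program reduces to a closed-form half-space projection of the current gradient:

```python
import numpy as np

def gem_project_single(g, g_ref):
    """GEM-style projection with a single memory constraint.
    If the proposed gradient g conflicts with the memory gradient g_ref
    (i.e. <g, g_ref> < 0, so the update would increase the memory loss),
    project g onto the half-space where <g', g_ref> >= 0. With one
    constraint, GEM's QP admits this closed form."""
    dot = g @ g_ref
    if dot >= 0:
        return g  # no conflict: keep the gradient unchanged
    return g - (dot / (g_ref @ g_ref)) * g_ref

g = np.array([1.0, -1.0])      # proposed gradient on the current task
g_ref = np.array([0.0, 1.0])   # gradient on the episodic memory
g_proj = gem_project_single(g, g_ref)
print(g_proj)  # -> [1. 0.]  (conflicting component removed)
```

OWM instead projects updates onto the orthogonal complement of the input subspace of previous tasks, so the two methods constrain updates with respect to different quantities (past gradients vs. past inputs).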
-
Dear @grosenberger , @uweschmitt , @hroest ,
After LDA is fitted on the training data, we need to score the test data using the fitted LDA model parameters. As far as I know, there are two methods to calculate the scores:
…
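As a reference point for the discussion (a generic two-class LDA sketch with hypothetical names, not the implementation in this repository): the two usual scoring methods are the raw projection x·w and the full decision function x·w + b, and for binary LDA they differ only by a constant, so they produce identical rankings:

```python
import numpy as np

def fit_lda_binary(X, y):
    """Fit two-class LDA with a pooled covariance. Returns the
    discriminant direction w and bias b such that the decision score
    is s(x) = w @ x + b. Illustrative sketch only."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-class covariance
    cov = (np.cov(X0, rowvar=False) * (len(X0) - 1)
           + np.cov(X1, rowvar=False) * (len(X1) - 1)) / (len(X) - 2)
    w = np.linalg.solve(cov, mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1) + np.log(len(X1) / len(X0))
    return w, b

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(2, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
w, b = fit_lda_binary(X, y)

X_test = rng.normal(1, 1, (20, 3))
scores_proj = X_test @ w        # method 1: projection only
scores_dec = X_test @ w + b     # method 2: full decision function
# The two differ by the constant b, so argsort(scores_proj) ==
# argsort(scores_dec): the ranking of test samples is the same.
```

So if only the ordering of test scores matters (e.g. for downstream ranking or FDR estimation), the two methods are interchangeable; the bias matters only when an absolute threshold is applied.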
-
Hi, congrats on the nice work.
I'm wondering if it's possible for you to release the details of Zero-Shot Retrieval Protocol on ImageNet100.
In Table 2 of the paper, the protocol refers to [28…
kunhe updated
5 years ago
-
Hello,
I'm thinking about adding a plot.BTM function to my BTM package using ggraph. BTM is good for clustering text (https://cran.r-project.org/web/packages/BTM/index.html).
In order to have a go…
-
Hi,
I'm a master's student from China. Recently I ported this library to [RT-Thread](http://github.com/rt-thread/rt-thread), a burgeoning RTOS in China, so that it can be used on STM32, RIS…
-
I would be glad if you could add the paper published at ECML 2018 (https://link.springer.com/chapter/10.1007/978-3-030-10928-8_50) to your list.
The algorithm name is "GCA".
The title of the paper…
-
> The coefficient matrix is saved in the variable "C" in a .mat file in the models_DSC folder (see the last line of the code). As to the digits, have you changed the kernel size, and hidden units to m…
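Once the coefficient matrix "C" is loaded from the .mat file (e.g. with `scipy.io.loadmat(...)["C"]`), the usual subspace-clustering post-processing is to symmetrize it into an affinity matrix for spectral clustering. A minimal sketch of that step (the common recipe, which may differ in detail from this repo's exact pipeline):

```python
import numpy as np

def affinity_from_coefficients(C):
    """Turn a self-expressive coefficient matrix C into a symmetric,
    non-negative affinity matrix W = (|C| + |C|^T) / 2, the standard
    input to spectral clustering in subspace-clustering methods."""
    A = np.abs(C)
    return 0.5 * (A + A.T)

# Example with a random stand-in for the saved C:
C = np.random.default_rng(0).normal(size=(5, 5))
W = affinity_from_coefficients(C)
# W is symmetric and non-negative by construction.
```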
-
I trained on the digit dataset, and the result is bad:
![image](https://user-images.githubusercontent.com/20330704/59171069-de626100-8b73-11e9-8082-8f322649bcd4.png)
the result decreases as the epoch gro…
-
A paper on gradient-based meta-learning
Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace
Yoonho Lee, Seungjin Choi
ICML 2018
paper: https://arxiv.org/abs/1801.05558
code:…