automl / nas_benchmarks

BSD 3-Clause "New" or "Revised" License

UCI dataset .npy used for fcnet results #20

Closed rubinxin closed 4 years ago

rubinxin commented 4 years ago

Dear Dr. Klein,

Many thanks for the benchmark datasets for the CNN and fully connected networks. I'm trying to reproduce some of the fully connected network results on the UCI protein-structure dataset using your script "train_fcnet.py". However, I get quite different performance (valid_mse = 0.5) from the results reported in the tabular benchmark dataset (valid_mse = 0.3) for the same configuration/hyperparameters.

Would it be possible for you to share the train/valid/test .npy files you used for the fcnet tabular benchmark? The data is the only variable in my script (by the way, I also removed the redundant feature and normalised the data as instructed in your paper).
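For concreteness, the preprocessing described above amounts to something like the following. This is a minimal sketch, not the repo's actual pipeline: the target column name follows the UCI CASP file, but the dropped column index, the 60/20/20 split, and the output file names are assumptions.

```python
# Sketch: load the UCI protein-structure (CASP) data, drop a redundant
# column, normalise to zero mean / unit variance using training-split
# statistics, and save the splits as .npy files.
import numpy as np
import pandas as pd

df = pd.read_csv("CASP.csv")  # UCI "Physicochemical Properties of Protein Tertiary Structure"
y = df["RMSD"].to_numpy(dtype=np.float64)
X = df.drop(columns=["RMSD"]).to_numpy(dtype=np.float64)

# Hypothetical: drop one redundant feature column (index 0 is a placeholder).
X = np.delete(X, 0, axis=1)

# Shuffle and split 60/20/20 (assumed ratios) into train/valid/test.
rng = np.random.RandomState(0)
idx = rng.permutation(len(X))
n_train, n_valid = int(0.6 * len(X)), int(0.2 * len(X))
tr, va, te = np.split(idx, [n_train, n_train + n_valid])

# Normalise with statistics computed on the training split only;
# whether the targets were also normalised is an assumption here.
mu, sigma = X[tr].mean(axis=0), X[tr].std(axis=0)
X = (X - mu) / sigma
y_mu, y_sigma = y[tr].mean(), y[tr].std()
y = (y - y_mu) / y_sigma

np.save("train_x.npy", X[tr]); np.save("train_y.npy", y[tr])
np.save("valid_x.npy", X[va]); np.save("valid_y.npy", y[va])
np.save("test_x.npy",  X[te]); np.save("test_y.npy",  y[te])
```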

I'd be really grateful for your reply. Once again, thanks for the datasets.

Best, Robin Oxford

aaronkl commented 4 years ago

Are these numbers averaged over multiple trials? Due to the intrinsic randomness of neural network training, some configurations exhibit high variance across training runs. The datasets will follow by mail.
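In other words, a single retraining run is not directly comparable to the benchmark's averaged numbers (the paper reports several independent runs per configuration). A minimal sketch of what averaging over seeds looks like; `train_and_eval` is a hypothetical wrapper around train_fcnet.py, not a function from this repo:

```python
import numpy as np

def train_and_eval(config, seed):
    # Hypothetical wrapper: run train_fcnet.py with `config` and `seed`
    # and return the final validation MSE. Returns a dummy value here so
    # the sketch executes; replace with a real training run.
    rng = np.random.RandomState(seed)
    return 0.3 + 0.1 * rng.rand()

config = {}  # the hyperparameter configuration being reproduced
scores = np.array([train_and_eval(config, seed) for seed in range(4)])
print(f"valid MSE over {len(scores)} runs: "
      f"{scores.mean():.3f} +/- {scores.std():.3f}")
```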

rubinxin commented 4 years ago

Many thanks for the datasets!

horsehour commented 3 years ago

> Are these numbers averaged over multiple trials? Due to the intrinsic randomness of neural network training, some configurations exhibit high variance across training runs. The datasets will follow by mail.

Hi Dr. Klein,

I encountered the same issue. On the protein-structure dataset, I took the best configuration and retrained the model from scratch for 100 epochs. This resulted in a validation error of 0.33, while the final validation error reported in the benchmark is 0.2. Could you please send me a copy of the .npy datasets?
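For comparison, the number reported in the benchmark can be read directly from the released HDF5 file. A minimal sketch, assuming the layout read by this repo's FCNetBenchmark class (groups keyed by the JSON-encoded configuration, each containing a "valid_mse" array indexed by repeat and epoch); check tabular_benchmarks/fcnet_benchmark.py for the exact schema before relying on it:

```python
import json
import h5py
import numpy as np

data = h5py.File("fcnet_protein_structure_data.hdf5", "r")
config = {}  # fill in the benchmark configuration's hyperparameters
key = json.dumps(config, sort_keys=True)
valid_mse = np.array(data[key]["valid_mse"])  # assumed shape: (n_repeats, n_epochs)
print("final valid MSE per repeat:", valid_mse[:, -1])
print("mean over repeats:", float(valid_mse[:, -1].mean()))
```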

Best, Chunheng Jiang

aaronkl commented 3 years ago

Can you ping me via email?

horsehour commented 3 years ago

> Can you ping me via email?

My email is jiangchunheng@gmail.com. Thank you very much.