oliverhvidsten closed this issue 1 year ago
@oliverhvidsten does this problem persist on smaller data sets (e.g., 100 data points per property)? If so, would you be willing to share the smaller data set with me so I can reproduce and investigate?
After doing some testing, it appears that the information in graph_feats may not be accounted for while the model trains: the predictions are invariant across all of the non-chemical features.
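A quick way to check this kind of issue is to perturb only graph_feats and see whether the predictions change at all. The sketch below uses hypothetical stand-in predict functions (not the actual polygnn model) purely to illustrate the diagnostic:

```python
import numpy as np

def buggy_predict(node_feats, graph_feats):
    # Hypothetical stand-in for a model that ignores graph_feats (the suspected bug).
    return node_feats.sum(axis=1)

def fixed_predict(node_feats, graph_feats):
    # Hypothetical stand-in for a model that actually uses graph_feats.
    return node_feats.sum(axis=1) + graph_feats.sum(axis=1)

rng = np.random.default_rng(0)
node_feats = rng.normal(size=(8, 4))   # per-atom (chemical) features
gf_a = rng.normal(size=(8, 2))         # graph-level (non-chemical) features
gf_b = gf_a + 1.0                      # same inputs, perturbed graph_feats

def invariance_gap(predict):
    # Max absolute change in predictions when only graph_feats change.
    return float(np.max(np.abs(predict(node_feats, gf_a) - predict(node_feats, gf_b))))

print(invariance_gap(buggy_predict))  # 0.0 -> graph_feats have no effect
print(invariance_gap(fixed_predict))  # nonzero -> graph_feats are used
```

If the gap is exactly zero for every perturbation, the graph-level features cannot be reaching the loss.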
Wow, you are totally right. There's a bug in the code. Above I linked a PR. Can you test it to see if it solves your issue?
Looks a lot better after the fix!
This is a multi-task model with 9,100 points for one property and 6,200 points for the other. I have previously trained on this dataset with polygnn_trainer and got much better results.
I have attached:
Do you know why this behavior might occur? I got very similar results on my first test, so I increased some of the constants defined at the beginning; those changes did not improve prediction accuracy.