awslabs / dgl-lifesci

Python package for graph neural networks in chemistry and biology
Apache License 2.0

How to use the F.softmax and F.log_softmax activation functions now that implicit dimension choice is deprecated #214

Closed BJWiley233 closed 1 year ago

BJWiley233 commented 1 year ago

If I want to use, for instance, the NLLLoss loss function with a softmax output activation (see "The Negative Log-Likelihood Loss function (NLL) is applied only on models with the softmax function as an output activation layer"), how do I deal with the deprecation warning going forward?

import torch.nn.functional as F
from dgllife.model import GCNPredictor

# define a GCN net with 2 GCN layers
gcn_net = GCNPredictor(in_feats=n_feats,
                       hidden_feats=[128, 64],
                       n_tasks=2,
                       predictor_hidden_feats=10,
                       dropout=[0.5, 0.2],
                       activation=[F.relu, F.log_softmax])
...
# example forward pass on the first batch
i, (bg, labels) = list(enumerate(train_loader))[0]
atom_feats = bg.ndata.pop('h').to(device)
labels = labels.to(device)
pred_train = gcn_net(bg, atom_feats)
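
For reference, NLLLoss expects log-probabilities as input, which is why log_softmax is used as the output activation. A minimal sketch of that pairing, using hypothetical tensors with 2 classes matching n_tasks=2 above:

import torch
import torch.nn.functional as F

# hypothetical log-probabilities for a batch of 4 graphs and 2 classes
logits = torch.randn(4, 2)
log_probs = F.log_softmax(logits, dim=1)  # dim=1 is the class dimension
targets = torch.tensor([0, 1, 1, 0])
loss = F.nll_loss(log_probs, targets)  # same as nn.NLLLoss()(log_probs, targets)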

Warning

/home/coyote/miniconda3/envs/develop/lib/python3.8/site-packages/dgl/nn/pytorch/conv/graphconv.py:450: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
  rst = self._activation(rst)
/home/coyote/miniconda3/envs/develop/lib/python3.8/site-packages/dgllife/model/gnn/gcn.py:103: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
  res_feats = self.activation(self.res_connection(feats))
mufeili commented 1 year ago

Try something like the following:

from functools import partial

partial(F.log_softmax, dim=X)
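
Applied to the call above, that would look like the sketch below; dim=-1 (normalizing over the last dimension) is just an assumption here, so pick the dimension that matches your tensors:

from functools import partial
import torch.nn.functional as F
from dgllife.model import GCNPredictor

gcn_net = GCNPredictor(in_feats=n_feats,
                       hidden_feats=[128, 64],
                       n_tasks=2,
                       predictor_hidden_feats=10,
                       dropout=[0.5, 0.2],
                       # log_softmax now carries an explicit dim, silencing the warning
                       activation=[F.relu, partial(F.log_softmax, dim=-1)])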
BJWiley233 commented 1 year ago

Thanks, I forgot to think about that.