teddykoker / torchsort

Fast, differentiable sorting and ranking in PyTorch
https://pypi.org/project/torchsort/
Apache License 2.0

Using TorchSort for TSP Solver #61

Closed: kayuksel closed this issue 1 year ago

kayuksel commented 1 year ago

Hello,

I am using torchsort.soft_rank to get ranked indices from the model's output logits, and then calculate a loss function for the TSP (Travelling Salesman Problem) as follows. (I had previously tried argsort, but switched to soft_rank because argsort is not differentiable.)

import torch
import torchsort

# pdist, logits, and args come from elsewhere in my code; pdist computes the
# row-wise Euclidean distance (e.g. torch.nn.functional.pairwise_distance).

points = torch.rand(args.funcd, 2).cuda().requires_grad_()

def path_reward(rank):
    # a = points[rank]
    a = points.index_select(0, rank)            # reorder the points by the predicted rank
    b = torch.cat((a[1:], a[0].unsqueeze(0)))   # shift by one to pair consecutive points (wrapping around)
    return pdist(a, b).sum()                    # total tour length

rank = torchsort.soft_rank(logits, regularization_strength=0.001).floor().long() - 1

However, the network weights are still not being updated by the optimizer. I suspect this is because of index_select, as I have read that it is non-differentiable with respect to the index. Could you recommend an alternative way to solve TSP via soft_rank?
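
Here is a minimal, self-contained illustration of what I mean (the tensors and sizes are just placeholders, not my actual model): gradients reach the values being indexed, but there is no gradient path through the integer index itself.

import torch

points = torch.rand(5, 2, requires_grad=True)
idx = torch.tensor([3, 1, 4, 0, 2])       # LongTensor: integer indices cannot carry gradients
loss = points.index_select(0, idx).sum()
loss.backward()

print(points.grad is not None)  # True  - gradients flow to the indexed values
print(idx.requires_grad)        # False - no gradient ever reaches whatever produced idx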

Happy new year.

Sincerely, Kamer

teddykoker commented 1 year ago

Hi @kayuksel, it's not clear to me exactly what you are trying to do; however, you definitely won't be able to differentiate through the indexing operation. Furthermore, using floor() and long() will destroy the "soft"-ness of the ranking as well. One possible way to structure your loss function would be to pre-compute the optimal ranks and then use a differentiable Spearman rank correlation as the loss between the predicted and optimal ranks.
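
Something along these lines (a minimal sketch following the Spearman correlation example pattern in the torchsort README; the function name, the negation for minimization, and the (batch, n) shapes are illustrative):

import torch
import torchsort

def spearman_loss(pred, target, **kw):
    # Soft-rank both the predictions and the pre-computed optimal ranks,
    # then compute the Pearson correlation between the two rank vectors.
    pred = torchsort.soft_rank(pred, **kw)
    target = torchsort.soft_rank(target, **kw)
    pred = pred - pred.mean()
    pred = pred / pred.norm()
    target = target - target.mean()
    target = target / target.norm()
    # Correlation lies in [-1, 1]; negate it so minimizing the loss maximizes it.
    return -(pred * target).sum()

# Usage sketch: loss = spearman_loss(logits, optimal_ranks, regularization_strength=0.001)
# where logits are the model outputs and optimal_ranks the pre-computed targets,
# both of shape (batch, n).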

I'm going to close this issue as it is not an actual issue with the torchsort library, but feel free to continue to reach out if you have more questions!

Happy new year :)

kayuksel commented 1 year ago

Thanks a lot for your response. Yes, pre-computing the optimal ranks and predicting them seems to be the only viable option. Alternatively, I could train a surrogate model and obtain the gradients for the TSP solver from it.
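
Roughly what I have in mind (a hypothetical sketch only; the city count, network size, and all names are placeholders):

import torch
import torch.nn as nn

n_cities = 16
points = torch.rand(n_cities, 2)

def tour_length(logits):
    # The true, non-differentiable objective: hard ranking plus path length.
    order = logits.argsort()
    a = points[order]
    b = torch.cat((a[1:], a[:1]))
    return (a - b).norm(dim=1).sum()

# A small surrogate learns to predict tour_length(logits) from logits;
# once fitted, the solver can be trained by backpropagating through it.
surrogate = nn.Sequential(nn.Linear(n_cities, 64), nn.ReLU(), nn.Linear(64, 1))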

Best wishes. Sincerely, Kamer