KevinMusgrave / pytorch-metric-learning

The easiest way to use deep metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
https://kevinmusgrave.github.io/pytorch-metric-learning/

PNPLoss #673

Closed · celsofranssa closed this 9 months ago

celsofranssa commented 10 months ago

On the PNPLoss:

    losses.PNPLoss(b=2, alpha=1, anneal=0.01, variant="O", **kwargs)

I could not find an explanation of the parameters b, alpha, and anneal in the documentation, the code, related issues (#589), or even the referenced paper.

It would be great if you could give me some insights about these parameters and how to tune them.

celsofranssa commented 10 months ago

And why can't I use another distance with PNPLoss?

    AssertionError('PNPLoss requires the distance metric to be CosineSimilarity')
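
For context, here is a minimal snippet that should reproduce the error (a sketch, assuming the standard pytorch_metric_learning.distances import path and its LpDistance class):

    from pytorch_metric_learning import losses
    from pytorch_metric_learning.distances import LpDistance

    # Passing any distance other than CosineSimilarity trips the assertion
    # in PNPLoss.__init__ (shown later in this thread):
    loss_func = losses.PNPLoss(distance=LpDistance())
    # AssertionError: PNPLoss requires the distance metric to be CosineSimilarity
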
KevinMusgrave commented 10 months ago

Sorry about that.

@interestingzhuo we should add an explanation of the parameters.

interestingzhuo commented 10 months ago

> Sorry about that.
>
> @interestingzhuo we should add an explanation of the parameters.

ok

interestingzhuo commented 10 months ago

> And why can't I use another distance with PNPLoss?
>
>     AssertionError('PNPLoss requires the distance metric to be CosineSimilarity')

We only implemented cosine similarity, but you can add another distance.

interestingzhuo commented 10 months ago

> On the PNPLoss:
>
>     losses.PNPLoss(b=2, alpha=1, anneal=0.01, variant="O", **kwargs)
>
> I could not find an explanation of the parameters b, alpha, and anneal in the documentation, the code, related issues (#589), or even the referenced paper.
>
> It would be great if you could give me some insights about these parameters and how to tune them.

The best setting is b=2, alpha=8, anneal=0.01, variant="Dq".
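
A minimal instantiation with these recommended settings (a sketch, assuming the standard pytorch_metric_learning import path) would be:

    from pytorch_metric_learning import losses

    # Recommended settings from this thread: the PNP-Dq variant with alpha=8
    loss_func = losses.PNPLoss(b=2, alpha=8, anneal=0.01, variant="Dq")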

KevinMusgrave commented 10 months ago

What is the meaning of each parameter?

KevinMusgrave commented 10 months ago

Or what is the effect of each parameter?

interestingzhuo commented 10 months ago

- **variant**: one of ["Ds", "Dq", "Iu", "Ib", "O"]; it selects among the "Variants of PNP" from the paper. The best is "Dq".
- **alpha**: the hyperparameter of PNP-Dq. The best value is 8.
- **anneal**: the best value is 0.01, which was studied in the Smooth-AP paper.
- **b**: the boundary of PNP-Ib.
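
To illustrate (a quick sketch): every value in that list constructs, and anything else raises the ValueError from the __init__ shown later in this thread:

    from pytorch_metric_learning import losses

    # All five variants are accepted:
    for v in ["Ds", "Dq", "Iu", "Ib", "O"]:
        loss_func = losses.PNPLoss(variant=v)

    # Anything else is rejected at construction time:
    try:
        losses.PNPLoss(variant="Dx")
    except ValueError as e:
        print(e)  # message lists the allowed variants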

interestingzhuo commented 10 months ago

> Sorry about that.
>
> @interestingzhuo we should add an explanation of the parameters.

How do I edit "https://kevinmusgrave.github.io/pytorch-metric-learning/losses/"?

KevinMusgrave commented 10 months ago

Edit this file: https://raw.githubusercontent.com/KevinMusgrave/pytorch-metric-learning/master/docs/losses.md

Search for `## PNPLoss` and add some parameter descriptions like:

**Parameters**

- **variant**: ...

- **alpha**: ...
celsofranssa commented 10 months ago

> And why can't I use another distance with PNPLoss?
>
>     AssertionError('PNPLoss requires the distance metric to be CosineSimilarity')

> We only implemented cosine similarity, but you can add another distance.

Sure, that's the idea behind pytorch-metric-learning. However, someone has placed an assertion in the PNPLoss initialization to ensure that the distance is CosineSimilarity:

    def __init__(self, b=2, alpha=1, anneal=0.01, variant="O", **kwargs):
        super().__init__(**kwargs)
        c_f.assert_distance_type(self, CosineSimilarity)
        self.b = b
        self.alpha = alpha
        self.anneal = anneal
        self.variant = variant
        if self.variant not in self.VARIANTS:
            raise ValueError(f"variant={variant} but must be one of {self.VARIANTS}")
celsofranssa commented 10 months ago

> Sure, that's the idea behind pytorch-metric-learning. However, someone has placed an assertion in the PNPLoss initialization to ensure that the distance is CosineSimilarity [...]

Hello @interestingzhuo and @KevinMusgrave, any workaround here?

KevinMusgrave commented 10 months ago

Accommodating other distance functions might require special logic in the loss computation. With some loss functions like ContrastiveLoss or TripletMarginLoss this is very easy.

I don't know enough about PNPLoss to say how much it currently depends on the assumption that the distance function is CosineSimilarity.
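
One identity worth noting here: for L2-normalized embeddings, squared Euclidean distance is a monotone transform of cosine similarity, since ||a - b||^2 = 2 - 2*cos(a, b) for unit vectors, so the two agree on rankings. A quick numerical check (a sketch in plain PyTorch):

    import torch
    import torch.nn.functional as F

    # Two batches of L2-normalized embeddings
    a = F.normalize(torch.randn(8, 128), dim=1)
    b = F.normalize(torch.randn(8, 128), dim=1)

    cos = (a * b).sum(dim=1)             # cosine similarity of unit vectors
    sq_dist = ((a - b) ** 2).sum(dim=1)  # squared Euclidean distance

    # For unit vectors: ||a - b||^2 = 2 - 2 * cos(a, b)
    assert torch.allclose(sq_dist, 2 - 2 * cos, atol=1e-5)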

interestingzhuo commented 10 months ago

> Sure, that's the idea behind pytorch-metric-learning. However, someone has placed an assertion in the PNPLoss initialization to ensure that the distance is CosineSimilarity [...]
>
> Hello @interestingzhuo and @KevinMusgrave, any workaround here?

We have only tested CosineSimilarity for PNP Loss; other distances will be added soon.

KevinMusgrave commented 9 months ago

The documentation has been updated to provide a bit more context: https://kevinmusgrave.github.io/pytorch-metric-learning/losses/#pnploss

Let us know if you have more questions.