Shengjiewang-Jason / EfficientZeroV2

[ICML 2024, Spotlight] EfficientZero V2: Mastering Discrete and Continuous Control with Limited Data
GNU General Public License v3.0

Suggestion to Modify Cosine Similarity Loss Function for Improved Consistency #8

Open DAigory opened 1 month ago

DAigory commented 1 month ago

Hi, I’ve been using the cosine_similarity_loss function from your repository, and I’d like to suggest a small modification to make its loss values more consistent and interpretable.

Current Implementation:

import torch.nn.functional as F

def cosine_similarity_loss(f1, f2):
    # L2-normalize along the feature dimension, then take the dot product.
    f1 = F.normalize(f1, p=2., dim=-1, eps=1e-5)
    f2 = F.normalize(f2, p=2., dim=-1, eps=1e-5)
    return -(f1 * f2).sum(dim=1)

Proposed Modification:

def cosine_similarity_loss(f1, f2):
    f1 = F.normalize(f1, p=2., dim=-1, eps=1e-5)
    f2 = F.normalize(f2, p=2., dim=-1, eps=1e-5)
    return 1.0 - (f1 * f2).sum(dim=1)

Rationale: The proposed modification changes the loss to 1.0 - (f1 * f2).sum(dim=1). Since the added constant does not affect gradients, optimization behavior is unchanged; what changes is that the loss is now non-negative and reaches zero exactly when f1 and f2 are perfectly aligned (cosine similarity of 1), making logged loss values more intuitive to read and more consistent with the convention that lower loss means better fit.
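To illustrate the point about gradients, here is a small sanity check (a sketch assuming PyTorch; the `shift` parameter is added purely for this demonstration and is not part of the repository's function). It shows that the original and proposed forms produce identical gradients, so only the reported loss value changes:

```python
import torch
import torch.nn.functional as F

def cosine_similarity_loss(f1, f2, shift=0.0):
    # shift=0.0 reproduces the original form; shift=1.0 the proposed one.
    f1 = F.normalize(f1, p=2., dim=-1, eps=1e-5)
    f2 = F.normalize(f2, p=2., dim=-1, eps=1e-5)
    return shift - (f1 * f2).sum(dim=1)

torch.manual_seed(0)
f1 = torch.randn(4, 8, requires_grad=True)
f2 = torch.randn(4, 8)

# Original form: per-sample loss lies in [-1, 1].
loss_old = cosine_similarity_loss(f1, f2, shift=0.0).mean()
grad_old, = torch.autograd.grad(loss_old, f1)

# Proposed form: per-sample loss lies in [0, 2].
f1b = f1.detach().clone().requires_grad_(True)
loss_new = cosine_similarity_loss(f1b, f2, shift=1.0).mean()
grad_new, = torch.autograd.grad(loss_new, f1b)

# The constant offset shifts the value but leaves gradients identical.
assert torch.allclose(grad_old, grad_new)
assert abs(loss_new.item() - loss_old.item() - 1.0) < 1e-6
```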

I believe this adjustment will enhance the clarity and effectiveness of the loss function.

Thank you for considering this suggestion. I’m happy to discuss further if needed.

Shengjiewang-Jason commented 1 month ago

That is a good idea. Thank you for your kind suggestion. We will run some experiments to verify it.