Hi, I’ve been using the cosine_similarity_loss function from your repository, and I’d like to suggest a small modification to make its reported values more consistent and intuitive.
Current Implementation:
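(The original snippet isn’t reproduced here; the sketch below is only my reading of the current behaviour, assuming f1 and f2 are batches of feature vectors that get L2-normalized before the row-wise dot product, with a .mean() reduction over the batch.)

```python
import torch
import torch.nn.functional as F

def cosine_similarity_loss(f1, f2):
    # L2-normalize each row so the row-wise dot product equals
    # cosine similarity in [-1, 1].
    f1 = F.normalize(f1, dim=1)
    f2 = F.normalize(f2, dim=1)
    # Negative cosine similarity: reaches -1 (not 0) when f1 and f2
    # are perfectly aligned.
    return -(f1 * f2).sum(dim=1).mean()
```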
Proposed Modification:
1.0 - (f1 * f2).sum(dim=1)
Rationale: This change makes the loss approach zero as the features align, which is more intuitive to read and consistent with how losses are usually interpreted during gradient-based optimization. When f1 and f2 are perfectly aligned, their cosine similarity is 1, so the loss is 0, which makes convergence easy to recognize.
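For concreteness, here is a minimal sketch of the function with the proposed change applied; the normalization step and the .mean() reduction are assumptions about the surrounding code rather than quotes from the repository:

```python
import torch
import torch.nn.functional as F

def cosine_similarity_loss(f1, f2):
    # L2-normalize so the row-wise dot product is the cosine similarity.
    f1 = F.normalize(f1, dim=1)
    f2 = F.normalize(f2, dim=1)
    # 1 - cosine similarity: 0 when perfectly aligned, 2 when opposite.
    return (1.0 - (f1 * f2).sum(dim=1)).mean()

# Quick check: identical features give a loss of ~0.
f1 = torch.randn(8, 128)
print(cosine_similarity_loss(f1, f1.clone()))  # tensor close to 0
```

If the current version is simply the negative cosine similarity, adding the constant 1.0 leaves the gradients with respect to f1 and f2 unchanged; only the reported value shifts so that it bottoms out at zero.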
I believe this adjustment will enhance the clarity and effectiveness of the loss function.
Thank you for considering this suggestion. I’m happy to discuss further if needed.