Navidfoumani / ConvTran

This is a PyTorch implementation of ConvTran
MIT License

Similarity graphs #3

Closed: mpuu00001 closed this issue 8 months ago

mpuu00001 commented 9 months ago

Dear authors,

Thank you for your work! It is very insightful, and I very much love your graphs. In the publication, you mentioned, "Figure 1a shows the dot product between two sinusoidal positional embedding whose distance is K using Eq. (5) with various embedding dimensions."

I am curious about the criteria you follow when selecting the two sinusoidal positional embeddings with the same embedding dimension. Additionally, would it be possible for you to share the plotting code publicly? If not, could you please provide more detailed steps for reproducing the graphs in Figures 1 and 2?

Thank you very much!

Navidfoumani commented 8 months ago

Hi,

Thank you for your message, and I'm glad you find our work insightful! Below is the code snippet to generate the graph for Figure 1 in our publication:

import matplotlib.pyplot as plt
import numpy as np
import torch
import math

max_len = 1001                 # positions 0..1000; position 500 is used as the reference
x = np.arange(-500, 501)       # offset K between the reference position and every other position
plt.figure(figsize=[10, 7])
plot_style = zip([64, 128, 256, 512], [0.6, 0.7, 1, 1.5])  # (embedding dimension, line width)

for d_model, style in plot_style:
    # Standard sinusoidal positional encoding table of shape (max_len, d_model)
    pe = torch.zeros(max_len, d_model)
    position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
    div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    # Dot products between all pairs of positions; row 500 holds the similarity
    # between the reference position and positions 500 + K for K = -500..500
    similarity = torch.matmul(pe, pe.transpose(1, 0))
    plt.plot(x, similarity[500] / d_model, linewidth=style)  # normalise by d_model

plt.xlabel("K", size=20)
plt.ylabel("Dot Product (Similarity)", rotation=90, size=16)
plt.legend(['$d_{model}=64$', '$d_{model}=128$', '$d_{model}=256$', '$d_{model}=512$'], prop={'size': 14})
plt.savefig('Sim-sin_normal.eps')
plt.show()
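
For context, here is a short sketch of why each curve depends only on the offset K (and why a single reference row of the similarity matrix is enough). For the standard sinusoidal encodings with frequencies $\omega_i = 10000^{-2i/d_{model}}$, the dot product between the embeddings at positions $t$ and $t+K$ is

$PE_t \cdot PE_{t+K} = \sum_{i=0}^{d_{model}/2-1} \left[\sin(t\,\omega_i)\sin((t+K)\,\omega_i) + \cos(t\,\omega_i)\cos((t+K)\,\omega_i)\right] = \sum_{i=0}^{d_{model}/2-1} \cos(K\,\omega_i)$

which is independent of $t$, so row 500 traces the whole curve as K runs from -500 to 500. Dividing by $d_{model}$ simply puts the curves for different embedding dimensions on a comparable scale.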

Feel free to reach out if you have any further questions or need additional clarification!