zkkli / RepQ-ViT

[ICCV 2023] RepQ-ViT: Scale Reparameterization for Post-Training Quantization of Vision Transformers
Apache License 2.0
102 stars · 8 forks

How to understand the concept of channel before layernorm? #6

Closed xyhe1996 closed 7 months ago

xyhe1996 commented 7 months ago

Hi~ The concept of channel-wise in Conv is quite intuitive, but how should one understand "channel" in a tensor with shape [batch_size, num_seq, num_hidden]? Does the channel refer to the num_seq dim or not?

zkkli commented 7 months ago

Hi, the channel dimension refers to the last dimension of the tensor, i.e., num_hidden.

xyhe1996 commented 7 months ago

> Hi, The channel dimension refers to the last dimension of the tensor, i.e., num_hidden.

Appreciate your quick reply. So, taking a tensor with shape (1, 197, 768) as an example: the x-axis of the boxplot figure in the paper should have 768 xticks for channels 0~767, the y-axis represents the values in each channel, and there are 197 values within each box for a specific channel. Am I right? If I've missed something, please let me know.

zkkli commented 7 months ago

Yeah, you're right.
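For concreteness, the convention confirmed above can be sketched in a few lines of NumPy: given activations of shape (1, 197, 768), each of the 768 channels collects 1 × 197 values, and per-channel statistics are taken over the flattened batch and sequence dims. (The tensor here is random dummy data, not actual model activations.)

```python
import numpy as np

# Dummy activation tensor with shape (batch, num_seq, num_hidden),
# standing in for the post-LayerNorm activations discussed above.
x = np.random.randn(1, 197, 768)

# "Channel" is the last dimension (num_hidden = 768).
# Flatten batch and sequence dims so each channel has 1*197 = 197 values.
per_channel = x.reshape(-1, x.shape[-1])  # shape (197, 768)

# Per-channel statistics: one value per channel, 768 in total --
# each box in the paper's boxplot summarizes one such column of 197 values.
ch_min = per_channel.min(axis=0)  # shape (768,)
ch_max = per_channel.max(axis=0)  # shape (768,)
print(per_channel.shape, ch_min.shape, ch_max.shape)
```

This is also the axis along which channel-wise quantization scales would be assigned: one (min, max) pair per hidden channel.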

xyhe1996 commented 7 months ago

Thanks again for your help~