AImageLab-zip / alveolar_canal

This repository contains the material from the paper "Improving Segmentation of the Inferior Alveolar Nerve through Deep Label Propagation"

The length setting issue of #13

Closed lgh010319 closed 5 days ago

lgh010319 commented 6 days ago

Hello author, I'm sorry to bother you. I would like to ask about the length of `mem_len`. I saw that your code sets `mem_len=128`. Is this setting related to `patch_shape`? What exactly does the length of a memory unit refer to, and what is stored in a memory unit? If my `patch_shape=(98, 98, 98)`, what should `mem_len` be set to?

LucaLumetti commented 5 days ago

Hello,

The memory length is not restricted by the patch shape; it can be any non-negative integer, regardless of the patch shape you select. The memory consists of a set of `memory_length` tokens, which are concatenated with the tokens from the patch before being passed into the transformer module. These tokens are learned through standard backpropagation and are discarded afterward. A memory token has the same shape as a patch token. What exactly they learn is hard to tell, but they should store some general representation that is then useful to the transformer. If you haven't already, I suggest you read the IEEE Access paper and the paper that proposed the Memory Transformer.
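To illustrate the mechanism described above, here is a minimal PyTorch sketch (not the repository's actual code; the class name, dimensions, and single-layer encoder are assumptions for illustration) showing `mem_len` learnable tokens being concatenated with patch tokens before the transformer and discarded afterward:

```python
import torch
import torch.nn as nn

class MemoryTransformerBlock(nn.Module):
    """Hypothetical sketch: learnable memory tokens prepended to patch tokens."""

    def __init__(self, embed_dim=96, mem_len=128, num_heads=8):
        super().__init__()
        # mem_len learnable tokens; each has the same shape as a patch token.
        self.memory = nn.Parameter(torch.zeros(1, mem_len, embed_dim))
        nn.init.trunc_normal_(self.memory, std=0.02)
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)
        self.mem_len = mem_len

    def forward(self, patch_tokens):
        # patch_tokens: (B, N, C) where N depends on the patch shape,
        # but mem_len is chosen independently of it.
        b = patch_tokens.shape[0]
        mem = self.memory.expand(b, -1, -1)        # (B, mem_len, C)
        x = torch.cat([mem, patch_tokens], dim=1)  # (B, mem_len + N, C)
        x = self.encoder(x)
        return x[:, self.mem_len:]                 # discard memory tokens

block = MemoryTransformerBlock(embed_dim=96, mem_len=128)
tokens = torch.randn(2, 64, 96)  # batch of 2, 64 patch tokens, dim 96
out = block(tokens)
print(out.shape)                 # same as the input patch tokens
```

Note that the output keeps the same shape as the input patch tokens, which is why `mem_len` places no constraint on the patch shape you choose.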

If you have any other questions or you need further explanation, do not hesitate to reach out again.

Luca