DaiShiResearch / TransNeXt

[CVPR 2024] Code release for TransNeXt model
Apache License 2.0

About attention_cuda.py and attention_native.py files #13

Open logicvanlyf opened 3 months ago

logicvanlyf commented 3 months ago

Hello, I would like to ask: what are the attention_cuda.py and attention_native.py files in the classification folder, and are they modules? I would be very grateful if your team could answer.

DaiShiResearch commented 2 months ago

The files attention_cuda.py and attention_native.py in the classification folder contain the CUDA and the native PyTorch implementations of Aggregated Pixel-focused Attention, respectively. The two implementations are functionally equivalent.
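Since the two implementations are meant to match numerically, one way to convince yourself is a quick parity check. The sketch below is hypothetical and not part of the repository: the constructor arguments and the forward signature are placeholders, so check the actual class definitions in attention_cuda.py and attention_native.py before running it.

import torch
from attention_cuda import AggregatedAttention as CudaAttention
from attention_native import AggregatedAttention as NativeAttention

# Placeholder hyperparameters -- consult the real __init__ signatures.
kwargs = dict(dim=64, num_heads=2)

cuda_attn = CudaAttention(**kwargs).cuda().eval()
native_attn = NativeAttention(**kwargs).cuda().eval()
native_attn.load_state_dict(cuda_attn.state_dict())  # identical weights

x = torch.randn(2, 56 * 56, 64, device='cuda')  # (batch, tokens, dim)
with torch.no_grad():
    # Forward arguments are assumed; the real modules may also expect
    # spatial sizes or precomputed relative-position tensors.
    diff = (cuda_attn(x) - native_attn(x)).abs().max()
print(f'max abs difference: {diff.item():.2e}')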

At the beginning of the model code, there is logic that determines which implementation to use:

# Prefer the custom CUDA kernels when the swattention extension is
# available; otherwise fall back to the pure-PyTorch implementation.
if is_installed('swattention'):
    print('swattention package found, loading CUDA version of Aggregated Attention')
    from attention_cuda import AggregatedAttention
else:
    print('swattention package not found, loading PyTorch native version of Aggregated Attention')
    from attention_native import AggregatedAttention

This means that if the swattention package is installed, the CUDA version will be used. Otherwise, the native PyTorch version will be loaded.
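As for is_installed: it is a small availability helper defined near the top of the model file. The exact implementation in the repository may differ, but a minimal sketch of such a probe looks like this:

import importlib.util

def is_installed(package_name: str) -> bool:
    # True if the package is discoverable on the current Python path,
    # without importing it (importing a CUDA extension can fail on
    # machines without a GPU toolchain).
    return importlib.util.find_spec(package_name) is not None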

I hope this helps!