This repository contains the source code and dataset link mentioned in the WWW 2022 accepted paper "TRACE: A Fast Transformer-based General-Purpose Lossless Compressor".
It looks different from the standard self-attention mechanism #2
Are you applying self-attention in 'numerator_and_denominator.py'? That seems puzzling. Could you explain it?
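For context on what the question is pointing at, here is a minimal sketch (not this repository's code) of why such a file would look different from standard self-attention. Files named `numerator_and_denominator.py` typically appear in Performer-style linear-attention code, where the softmax attention matrix is never formed: a kernel feature map lets the output's numerator and denominator be accumulated separately. The feature map `phi` and all function names below are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch only: contrasting standard softmax self-attention
# with kernelized "linear" attention computed as a separate numerator
# and denominator (the Performer-style pattern the filename suggests).
import numpy as np

def softmax_attention(Q, K, V):
    # Standard self-attention: materializes an explicit L x L matrix.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Kernelized attention: softmax(QK^T)V is approximated as
    #   phi(Q) (phi(K)^T V)  /  phi(Q) (phi(K)^T 1),
    # so the L x L attention matrix is never built; the numerator and
    # denominator are computed as two separate accumulations.
    Qp, Kp = phi(Q), phi(K)
    numerator = Qp @ (Kp.T @ V)         # (L, d) via a d x d intermediate
    denominator = Qp @ Kp.sum(axis=0)   # (L,) per-position normalizer
    return numerator / denominator[:, None]

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 16)) for _ in range(3))
print(softmax_attention(Q, K, V).shape, linear_attention(Q, K, V).shape)
```

The separate-accumulation form is what makes linear attention fast: cost scales linearly in sequence length L instead of quadratically, which matters for a compressor that processes long byte streams.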