I've noticed some differences from the original ResNet code at https://github.com/cauchyturing/UCR_Time_Series_Classification_Deep_Learning_Baseline/blob/master/ResNet.py.

You're missing a BatchNorm right at the start of the network, and the residual should be added before the final ReLU of each ResNet block, not after it.

In my experiments, making these two changes produces results that better match the original ones.

Just to add a bit more to this: I don't think the BatchNorm right at the start is necessary if the data is z-normalised, but adding the residual before the ReLU definitely improves performance.
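For anyone wondering what the suggested ordering looks like, here's a minimal NumPy sketch (not the actual Keras code from the repo) of a residual block with the two fixes: a BatchNorm applied to the input, and the shortcut added before the final ReLU. The per-channel matrix multiplies stand in for the Conv1D layers, and the parameter-free `batch_norm` is a simplification of a real (learnable) BatchNorm layer.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def batch_norm(x, eps=1e-5):
    # Simplified BatchNorm: zero mean / unit variance per channel over the batch,
    # without the learnable scale and shift of a real layer.
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)

def resnet_block(x, w1, w2, w3):
    """Simplified residual block: convs reduced to per-channel linear maps."""
    shortcut = x  # assumes input and output channel counts match
    h = relu(batch_norm(x @ w1))
    h = relu(batch_norm(h @ w2))
    h = batch_norm(h @ w3)
    # Key point: add the residual BEFORE the final ReLU, not after it.
    return relu(h + shortcut)

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))                 # (batch, channels)
x = batch_norm(x)                            # BatchNorm right at the start
ws = [rng.normal(size=(16, 16)) for _ in range(3)]
out = resnet_block(x, *ws)
```

With the other ordering, `relu(h) + shortcut`, the block's output can go negative and the shortcut path never passes through the block's final non-linearity, which is the behavioural difference being pointed out above.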