lab-cosmo / nice

MIT License

scaling c_{k}_{lambd} #2

Open spozdn opened 3 years ago

spozdn commented 3 years ago

When importances are normalized, i.e. do not change under a uniform scaling of the separate lambda channels, c_{k}_{lambd} has no effect at all.
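A minimal sketch of that argument, assuming (hypothetically; the actual definition in nice may differ) that importances are per-component variances normalized within a lambda channel: multiplying the whole channel by a constant scales every variance by the same factor, which cancels in the normalization.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalized_importances(features):
    # per-component variances, normalized to sum to 1 within the channel
    var = features.var(axis=0)
    return var / var.sum()

# hypothetical block of covariants for a single lambda channel:
# shape (n_samples, n_components)
channel = rng.normal(size=(100, 5))

base = normalized_importances(channel)
# uniform scaling of the channel, playing the role of c_{k}_{lambd}
scaled = normalized_importances(3.7 * channel)

# the scaling factor cancels, so the importances are unchanged
assert np.allclose(base, scaled)
```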

Including c_{k}_{lambd} in the iteration would increase the computational cost.

c_{k}_{lambd}, where k is l2, is not symmetric with respect to changing the order of the covariants being contracted. It would therefore cause the single contraction function to lose its general form, where the arguments are arbitrary covariants.

ceriottm commented 3 years ago

This is an interesting issue, but one that needs testing. Scaling different lambda channels might affect model accuracy in real-life scenarios - I have anecdotal evidence that it does.