Hanawh closed this issue 3 years ago.
Hi, no, the lambda is used to balance the two terms in the rate-distortion objective: the estimated bit-rate and the distortion. The lower the lambda, the less the distortion term is weighted relative to the rate term, which results in a higher compression ratio (lower bit-rate) but lower reconstruction quality. You may need to try several lambdas to reach different target bit-rates, for example spaced on a logarithmic scale.
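For the logarithmic spacing mentioned above, a minimal helper to generate candidate lambdas could look like this (the endpoint values are illustrative placeholders, not official CompressAI settings):

```python
import math

def log_spaced_lambdas(lo: float = 1e-3, hi: float = 1e-1, n: int = 6):
    """Return n candidate lambdas evenly spaced on a log scale.

    The default endpoints are illustrative only; pick a range that
    brackets the bit-rates you care about.
    """
    if n == 1:
        return [lo]
    step = (math.log(hi) - math.log(lo)) / (n - 1)
    return [math.exp(math.log(lo) + i * step) for i in range(n)]
```

Each candidate is then used for a separate training run, giving one point on the rate-distortion curve per lambda.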
Closing this for now. Please let us know if you encounter any bug or issue.
Hello, can you share the different lambda values for qualities 1 to 8?
Can the different lambda values for qualities 1 to 8 be provided in compressai/losses/__init__.py? For example:
from compressai.losses import q2lambda, RateDistortionLoss
loss_function = RateDistortionLoss(q2lambda(q))
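A q2lambda helper like the one sketched above could be backed by a simple lookup table. The values below are the MSE lambdas I believe are listed in the CompressAI training documentation for qualities 1 to 8; please verify them against the docs before relying on them:

```python
# Hypothetical helper, not part of the library. The numbers are the
# per-quality MSE lambdas reported in the CompressAI training docs
# (double-check against the documentation before using them).
_Q2LAMBDA = {
    1: 0.0018, 2: 0.0035, 3: 0.0067, 4: 0.0130,
    5: 0.0250, 6: 0.0483, 7: 0.0932, 8: 0.1800,
}

def q2lambda(q: int) -> float:
    """Map a quality index in 1..8 to its training lambda."""
    if q not in _Q2LAMBDA:
        raise ValueError(f"quality must be in 1..8, got {q}")
    return _Q2LAMBDA[q]
```

Note that these lambdas assume the library's convention of scaling the MSE term by 255^2 in the loss.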
Also, could users be allowed to choose the output type of RateDistortionLoss? For example:

from typing import Literal

import torch.nn as nn

class RateDistortionLoss(nn.Module):
    def __init__(self, lmbda: float = 1e-2,
                 return_type: Literal["bpp_loss", "mse_loss", "loss", "all"] = "all"):
        super().__init__()
        self.lmbda = lmbda
        self.return_type = return_type

    def forward(self, output, target):
        out = ...  # compute bpp_loss, mse_loss and the combined loss as usual
        if self.return_type == "all":
            return out
        return out[self.return_type]
Also, RateDistortionLoss currently only supports MSE. Can it support MS-SSIM?
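One way to support MS-SSIM is to make the distortion metric injectable. Below is a minimal sketch; MetricRateDistortionLoss is a hypothetical name, and the lambda scaling convention is simplified compared to the library's MSE loss (no 255^2 factor):

```python
import math

import torch
import torch.nn as nn

class MetricRateDistortionLoss(nn.Module):
    """Hypothetical rate-distortion loss with a pluggable distortion metric.

    For MS-SSIM, pass e.g. ``lambda a, b: 1 - ms_ssim(a, b, data_range=1.0)``
    using ``ms_ssim`` from the third-party ``pytorch_msssim`` package.
    Defaults to MSE.
    """

    def __init__(self, lmbda: float = 1e-2, distortion=None):
        super().__init__()
        self.lmbda = lmbda
        self.distortion = distortion if distortion is not None else nn.MSELoss()

    def forward(self, output, target):
        n, _, h, w = target.size()
        num_pixels = n * h * w
        # Estimated bits per pixel from the latent likelihoods.
        bpp_loss = sum(
            torch.log(lk).sum() / (-math.log(2) * num_pixels)
            for lk in output["likelihoods"].values()
        )
        dist_loss = self.distortion(output["x_hat"], target)
        return self.lmbda * dist_loss + bpp_loss
```

Since MS-SSIM is a similarity (higher is better), the injected callable must convert it to a distortion, hence the `1 - ms_ssim(...)` form.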
Currently, a CompressAI model's output is a dict like {"x_hat": torch.Tensor, "likelihoods": XXX}. However, many loss functions expect only a torch.Tensor (i.e., x_hat). Could a decorator be provided to wrap such functions?
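Such a decorator could look like the sketch below. `on_x_hat` is a hypothetical name, not part of the library; it simply unwraps `x_hat` before calling the wrapped loss:

```python
import functools

def on_x_hat(loss_fn):
    """Adapt a loss that expects a plain reconstruction tensor so it
    accepts a CompressAI-style output dict and reads output["x_hat"].
    (Hypothetical helper, not part of the library.)
    """
    @functools.wraps(loss_fn)
    def wrapper(output, target, *args, **kwargs):
        return loss_fn(output["x_hat"], target, *args, **kwargs)
    return wrapper
```

Usage would then be e.g. `wrapped = on_x_hat(nn.L1Loss())` followed by `wrapped(model(x), x)`.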
Sorry to bother you again, but I have another question: what does the value of lambda in the loss function depend on? If my images are 112x112, do I need to adjust the lambda in the config, or do I need to search for a suitable lambda value?
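When a specific bit-rate must be hit, lambda can be searched rather than guessed. A minimal bisection sketch in log-lambda space, assuming a hypothetical user-supplied `measure_bpp(lmbda)` callback that trains (or fine-tunes) at the given lambda and returns the achieved bpp, and assuming bpp grows monotonically with lambda:

```python
import math

def search_lambda(target_bpp, measure_bpp, lo=1e-4, hi=1.0, iters=20):
    """Bisect in log-lambda space for a target bit-rate.

    measure_bpp is a hypothetical callback: train (or fine-tune) with the
    given lambda and return the resulting bits per pixel. The search
    assumes bpp increases monotonically with lambda.
    """
    for _ in range(iters):
        mid = math.exp((math.log(lo) + math.log(hi)) / 2)
        if measure_bpp(mid) < target_bpp:
            lo = mid  # rate too low: increase lambda
        else:
            hi = mid  # rate too high: decrease lambda
    return math.exp((math.log(lo) + math.log(hi)) / 2)
```

In practice each `measure_bpp` call is expensive (a training run), so a coarse log-spaced sweep is usually done first and bisection only refines between the two closest candidates.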