Open NikhilMank opened 5 months ago
I'm also interested in logging multiple loss values
cc @muellerzr @SunMarc
@dkrystki @amyeroberts I have solved this issue. I created a custom trainer class that inherits from `Trainer` and defined a custom loss computation in which I log the individual values using `self.log()`; a rough sketch of this approach is shown below.
Now I also want to log the model's weights, but I cannot get them into the same TensorBoard events file as the losses; I can only do it by writing a separate events file. @muellerzr @SunMarc @amyeroberts
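For readers landing here, a minimal sketch of that approach, assuming a subclassed `Trainer`. The loss names (`ce_loss`, `aux_loss`) and the `aux_loss` output attribute are placeholders for illustration, not the commenter's actual code:

```python
from transformers import Trainer


class MultiLossTrainer(Trainer):
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        # **kwargs absorbs extra arguments (e.g. num_items_in_batch) passed by newer Trainer versions
        outputs = model(**inputs)

        # Placeholder names: assumes the model exposes its individual loss terms
        ce_loss = outputs.loss
        aux_loss = outputs.aux_loss
        total_loss = ce_loss + aux_loss

        # Trainer.log forwards this dict to the configured loggers (TensorBoard here)
        self.log({
            "ce_loss": ce_loss.detach().item(),
            "aux_loss": aux_loss.detach().item(),
        })

        return (total_loss, outputs) if return_outputs else total_loss
```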
I'm training RT-DETR and followed your instructions, but when I open wandb only the training loss appears.
@hoangdangthien try `wandb.log`; I used `self.log` for TensorBoard.
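For example, something along these lines inside the custom loss computation (the key names and loss values are illustrative; when training with `report_to="wandb"` the Trainer's WandbCallback handles `wandb.init` for you):

```python
import torch
import wandb

# wandb.init is normally done by the Trainer's WandbCallback; it is called
# explicitly here (offline mode) only so this snippet runs on its own
wandb.init(project="multi-loss-demo", mode="offline")

# placeholder loss terms; in practice these come from the model outputs
ce_loss, aux_loss = torch.tensor(0.7), torch.tensor(0.1)
wandb.log({"train/ce_loss": ce_loss.item(), "train/aux_loss": aux_loss.item()})
```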
Feature request
Able to log individual losses returned as dict.
Motivation
I have multiple losses that are added together to form a combined loss, and I want to log each of them individually to observe their trends. SemanticSegmenterOutput accepts only a single loss at the moment, and only that combined loss gets logged.
Your contribution
I have modified the Trainer class and SemanticSegmenterOutput as shown below, but it is not working as expected. I added a few print statements to check whether the on_log part is reached, but that code is never executed.
```python
from dataclasses import dataclass
from typing import Dict, Optional

import torch
from transformers import Trainer
from transformers.modeling_outputs import SemanticSegmenterOutput


class CustomTrainer(Trainer):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.additional_losses = {}


@dataclass
class CustomSemanticSegmenterOutput(SemanticSegmenterOutput):
    additional_losses: Optional[Dict[str, torch.FloatTensor]] = None
```
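Note that nothing in the stock training loop reads a custom `additional_losses` field, which may be why the on_log hook never sees those values. One possible way to wire it through is sketched here rather than taken from working code; it assumes the model returns a `CustomSemanticSegmenterOutput` populated with the individual terms, and it reuses the imports from the snippet above:

```python
class CustomTrainer(Trainer):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.additional_losses = {}

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        outputs = model(**inputs)

        # Surface the extra terms carried on the custom output, if present,
        # and push them through Trainer.log so they reach the events file
        extra = getattr(outputs, "additional_losses", None)
        if extra:
            self.additional_losses = {k: v.detach().item() for k, v in extra.items()}
            self.log(dict(self.additional_losses))

        loss = outputs.loss
        return (loss, outputs) if return_outputs else loss
```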