This design was intentional. PCA vs. random only matters when initializing the LoRA weights. However, when you call `initialize_from_log()`, LoRA initialization is handled directly by the saved `state_dict`. Therefore, as long as the rank structure is the same, it doesn't matter whether `self.init_strategy` is `pca` or `random`. Please take a look at the code snippet below for details.
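Roughly, the idea boils down to something like this (a minimal sketch, not the exact code; it assumes a standard PyTorch `state_dict` checkpoint, and the file name is illustrative):

```python
import torch
import torch.nn as nn

def initialize_from_log(model: nn.Module, log_dir: str) -> None:
    # The LoRA modules already exist on `model`; init_strategy only set
    # their *initial* values, which the load below overwrites anyway.
    state_dict = torch.load(f"{log_dir}/lora_state_dict.pt")

    # Only the rank structure matters: load_state_dict raises on any
    # shape mismatch, and otherwise replaces the "pca"/"random" init
    # values with the saved ones.
    model.load_state_dict(state_dict, strict=False)
```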
Yeah, it doesn't affect functionality or correctness. I was just concerned that it might introduce issues in the future.
In `lora.py` we set `init_strategy = "random"` if the covariance state is not provided. This should be avoided when it's called from `initialize_from_log()`, which might use PCA but doesn't have the covariance. See the sketch below.
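Concretely, the pattern I mean is roughly this, together with one possible guard (a sketch only; `resolve_init_strategy` and `from_log` are hypothetical names, not the actual API):

```python
# Illustrative sketch of the fallback in lora.py and one possible guard.
def resolve_init_strategy(init_strategy, covariance_state=None, from_log=False):
    # PCA init needs covariance statistics, so without them the current
    # code falls back to random.
    if init_strategy == "pca" and covariance_state is None:
        if from_log:
            # Called from initialize_from_log(): the saved state_dict
            # overwrites the init anyway, so keep the recorded strategy
            # rather than silently rewriting it to "random".
            return init_strategy
        return "random"
    return init_strategy
```

Today the silent switch to `"random"` is harmless because the saved weights overwrite the initialization, but keeping the recorded strategy intact would avoid masking a real misconfiguration later.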