Open · cssychen opened 1 week ago
Hello, I found your article very beneficial, but I don't quite understand Section 3.3 and would like to ask for some clarification.
In the fusion mode, if a new role is added, is retraining required? If so, which parameters need to be updated?
In the expansion mode, does it mean adding a new gate and a new LoRA A/B pair on top of the existing LoRA blocks?
In the fusion mode, adding a new role does not require any additional training, so no parameters are updated. It only requires providing the new role's configuration file, from which a feature representation is extracted and matched against the existing LoRA blocks.
In the expansion mode, yes, as you described: a new gate and a new LoRA A/B pair are added on top of the existing LoRA blocks. This mode applies to scenarios where the new role has sufficient annotated data.
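To make the two modes concrete, here is a minimal PyTorch sketch, not the paper's actual implementation; the names `MoELoRALinear`, `role_keys`, and `expand` are hypothetical, and the gating (softmax over similarity between the role's feature vector and per-block keys) is only one plausible way to realize the matching described above. In fusion mode nothing is trained; in expansion mode a fresh LoRA A/B pair and key are appended and only those new parameters would be fine-tuned on the new role's data.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELoRALinear(nn.Module):
    """Hypothetical sketch: a frozen linear layer with several LoRA blocks
    and a role-conditioned gate (not the authors' exact implementation)."""

    def __init__(self, in_features, out_features, rank=8, num_roles=3, role_dim=16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)  # frozen pretrained weight
        # One LoRA A/B pair per already-trained role.
        self.lora_A = nn.ParameterList(
            [nn.Parameter(torch.randn(rank, in_features) * 0.01) for _ in range(num_roles)]
        )
        self.lora_B = nn.ParameterList(
            [nn.Parameter(torch.zeros(out_features, rank)) for _ in range(num_roles)]
        )
        # One key per LoRA block; a new role's feature vector is matched to these.
        self.role_keys = nn.Parameter(torch.randn(num_roles, role_dim))

    def forward(self, x, role_feature):
        # Fusion mode: gate weights come from the similarity between the
        # (possibly unseen) role's feature representation and the stored
        # keys -- no parameters are updated for a new role.
        gate = F.softmax(role_feature @ self.role_keys.T, dim=-1)  # (num_blocks,)
        out = self.base(x)
        for i, (A, B) in enumerate(zip(self.lora_A, self.lora_B)):
            out = out + gate[i] * F.linear(F.linear(x, A), B)
        return out

    def expand(self, rank=8):
        # Expansion mode: append a fresh LoRA A/B pair plus a new key, then
        # train only these appended parameters on the new role's data.
        self.lora_A.append(nn.Parameter(torch.randn(rank, self.base.in_features) * 0.01))
        self.lora_B.append(nn.Parameter(torch.zeros(self.base.out_features, rank)))
        self.role_keys = nn.Parameter(
            torch.cat([self.role_keys.data,
                       torch.randn(1, self.role_keys.shape[1])], dim=0)
        )
```

Under these assumptions, a fusion-mode call is simply `layer(x, new_role_feature)` with everything frozen, while expansion calls `layer.expand()` first and then fine-tunes only the newly appended A/B pair and key on the new role's annotated data.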