FederatedAI / FATE

An Industrial Grade Federated Learning Framework
Apache License 2.0

FATE-LLM-pellm #5522

Closed dongdongzhaoUP closed 2 months ago

dongdongzhaoUP commented 7 months ago

A question about pellm in FATE-LLM: when the base_model is replaced with a peft_model that has adapters attached (e.g. LoRA), where is it implemented that, after client-side training, only the adapter parameters are passed to the aggregator during aggregation, while the base_model parameters are not?

mgqa34 commented 7 months ago

Which version of the underlying FATE are you using?

dongdongzhaoUP commented 7 months ago

Thanks for the reply. I'm looking at 2.0.0, but feel free to explain using whichever version you're familiar with.

mgqa34 commented 7 months ago

> Thanks for the reply. I'm looking at 2.0.0, but feel free to explain using whichever version you're familiar with.

You can look at https://github.com/FederatedAI/FATE/blob/master/python/fate/ml/aggregator/base.py#L152 — what gets aggregated are the trainable parameters (those with `requires_grad=True`).
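To illustrate the mechanism described above: a PEFT wrapper such as LoRA freezes the backbone and leaves only the adapter weights with `requires_grad=True`, so filtering on that flag before aggregation naturally excludes the base model. The sketch below is a minimal, self-contained illustration (not FATE's actual code); `ToyPeftModel` and `trainable_state` are hypothetical names, and a plain frozen `nn.Linear` stands in for the real frozen backbone.

```python
import torch
from torch import nn

class ToyPeftModel(nn.Module):
    """Toy stand-in for a peft_model: a frozen 'base' layer plus a
    trainable 'adapter' layer, mimicking how a LoRA wrapper leaves only
    adapter weights with requires_grad=True."""
    def __init__(self):
        super().__init__()
        self.base = nn.Linear(8, 8)
        self.adapter = nn.Linear(8, 8)
        # Freeze the backbone, as a PEFT wrapper does for the base model.
        for p in self.base.parameters():
            p.requires_grad = False

def trainable_state(model: nn.Module) -> dict:
    """Collect only parameters with requires_grad=True -- the same kind of
    filter the aggregator applies, so frozen base weights are never sent."""
    return {n: p.detach() for n, p in model.named_parameters() if p.requires_grad}

model = ToyPeftModel()
to_send = trainable_state(model)
# Only the adapter weights would be shipped to the aggregator.
print(sorted(to_send))  # ['adapter.bias', 'adapter.weight']
```

Because the filter keys off `requires_grad` rather than parameter names, any freezing scheme (LoRA, prefix tuning, etc.) automatically controls what the client uploads.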