Eclectic-Sheep / sheeprl

Distributed Reinforcement Learning accelerated by Lightning Fabric
https://eclecticsheep.ai
Apache License 2.0
305 stars 31 forks

[Question] Is there a way to merge checkpoints? #194

Closed Disastorm closed 8 months ago

Disastorm commented 8 months ago

Just wondering if there is a way to merge checkpoints. For example, if I have a model that can play one part of a level well and another model that can play a different part well, is there a way to merge them so that the final model can play both parts decently?

belerico commented 8 months ago

Hi @Disastorm, there is an interesting article here: it talks about merging LLMs, but I think the same idea can be applied here.
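For concreteness, the simplest form of the idea (sometimes called weight averaging or a "model soup") is to linearly interpolate the parameters of two checkpoints that share the same architecture. This is a minimal sketch, not a sheeprl feature: `merge_state_dicts` is a hypothetical helper, and it assumes the two checkpoints are plain PyTorch `state_dict`s with identical keys and shapes (a real sheeprl checkpoint may nest the model state alongside optimizer state, so you would apply this only to the model sub-dict). There is also no guarantee the merged policy keeps both skills; it is something to try, not a proven recipe for RL.

```python
import torch

def merge_state_dicts(sd_a: dict, sd_b: dict, alpha: float = 0.5) -> dict:
    """Linearly interpolate two state dicts: alpha * A + (1 - alpha) * B.

    Assumes both dicts come from the same architecture (same keys/shapes).
    Non-floating-point entries (e.g. integer buffers) are taken from A as-is.
    """
    merged = {}
    for key, tensor_a in sd_a.items():
        tensor_b = sd_b[key]
        if torch.is_floating_point(tensor_a):
            merged[key] = alpha * tensor_a + (1 - alpha) * tensor_b
        else:
            merged[key] = tensor_a.clone()
    return merged

# Hypothetical usage with two checkpoint files on disk:
# sd_a = torch.load("ckpt_part1.ckpt", map_location="cpu")["model"]
# sd_b = torch.load("ckpt_part2.ckpt", map_location="cpu")["model"]
# model.load_state_dict(merge_state_dicts(sd_a, sd_b, alpha=0.5))
```

You would then evaluate the merged model on both parts of the level; if one skill dominates, sweeping `alpha` between 0 and 1 is a cheap way to trade the two off.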

Disastorm commented 8 months ago

I see, thanks.