Closed jeffkinnison closed 10 months ago
The LLM model type initializes adapter weights and quantization at training time via `LLM.prepare_for_training`. When `LLMEncoder` was added, the ECD model did not have a corresponding `prepare_for_training` method, so adapter initialization occurred at encoder initialization. This PR adds `ECD.prepare_for_training`, which brings `LLMEncoder` adapter initialization to parity with LLM models.
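The deferred-initialization pattern described above can be sketched roughly as follows. Note this is an illustrative sketch only: the class and method names (`ECD`, `LLMEncoder`, `prepare_for_training`) mirror the PR description, but the bodies are hypothetical stand-ins, not Ludwig's actual implementation.

```python
class LLMEncoder:
    """Text encoder wrapping an LLM; adapter setup is deferred (hypothetical sketch)."""

    def __init__(self, adapter_config=None):
        self.adapter_config = adapter_config
        # Before this PR, adapter weights were initialized here, at
        # encoder construction. Now the flag stays False until training.
        self.adapter_initialized = False

    def prepare_for_training(self):
        # Training-time setup: initialize adapter weights (e.g. LoRA)
        # and quantization, matching when LLM.prepare_for_training runs.
        if self.adapter_config is not None:
            self.adapter_initialized = True


class ECD:
    """ECD model that forwards training-time setup to its encoders (hypothetical sketch)."""

    def __init__(self, encoders):
        self.encoders = encoders

    def prepare_for_training(self):
        # The method added by this PR: give each encoder a chance to do
        # training-time setup, bringing ECD to parity with LLM models.
        for encoder in self.encoders:
            if hasattr(encoder, "prepare_for_training"):
                encoder.prepare_for_training()


encoder = LLMEncoder(adapter_config={"type": "lora"})
model = ECD([encoder])
assert not encoder.adapter_initialized  # nothing happens at construction
model.prepare_for_training()
assert encoder.adapter_initialized      # adapter set up at training time
```

The design choice here is simply to move expensive, training-only setup out of `__init__` and behind an explicit hook, so constructing a model for inference or inspection never pays the adapter/quantization cost.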