getao / icae

The repo for In-context Autoencoder
Creative Commons Zero v1.0 Universal

the decoder of ICAE has lora parameters #7

Closed — ZongqianLi closed this issue 3 weeks ago

ZongqianLi commented 7 months ago

In line 157 of llama_icae_modeling.py:

decoder_outputs = self.icae(inputs_embeds=decoder_input_embeddings, output_hidden_states=True)

Here, self.icae has LoRA parameters, because they were added in line 114:

self.icae = get_peft_model(self.icae, lora_config)

and enabled in line 148:

compress_outputs = self.icae(inputs_embeds=autoencoder_input_embedding, output_hidden_states=True, enable_lora=True)

However, the decoder of ICAE should be identical to LLaMA-2-7B and should not use the LoRA parameters. Is this a bug?

getao commented 6 months ago

We modified the model's forward to take an enable_lora argument that controls whether the LoRA part is active, so the decoder pass runs with the LoRA parameters disabled.
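The gating pattern described above can be sketched in a toy, self-contained form. This is a hypothetical illustration, not the actual ICAE code: the class, scalar weights, and numbers are all invented for clarity; the point is only that the LoRA delta is added when enable_lora is true and skipped otherwise, leaving the plain base path.

```python
# Toy sketch (hypothetical, not the ICAE implementation) of gating a
# LoRA delta behind an enable_lora flag in a module's forward pass.

class ToyLoraLayer:
    def __init__(self, weight, lora_a, lora_b):
        self.weight = weight  # frozen base weight (scalar stand-in)
        self.lora_a = lora_a  # LoRA down-projection (scalar stand-in)
        self.lora_b = lora_b  # LoRA up-projection (scalar stand-in)

    def forward(self, x, enable_lora=False):
        out = self.weight * x  # base (plain LLaMA-like) path
        if enable_lora:
            # LoRA delta: only the encoder pass takes this branch
            out += self.lora_b * (self.lora_a * x)
        return out

layer = ToyLoraLayer(weight=2.0, lora_a=0.5, lora_b=4.0)
print(layer.forward(3.0, enable_lora=True))   # encoder pass: 6.0 + 6.0 = 12.0
print(layer.forward(3.0, enable_lora=False))  # decoder pass: base only = 6.0
```

With enable_lora=False the output is exactly the base path, which is what the questioner expects of the decoder.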

In the latest v2 release, we use the official peft disable_adapter to control this behavior instead.
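The v2 approach can be mimicked with a small stand-in. The class below is a hypothetical toy, not peft itself: it only reproduces the shape of a disable_adapter() context manager that temporarily turns the LoRA path off and restores it on exit, so a decoder pass inside the context reuses the bare base weights.

```python
from contextlib import contextmanager

class ToyPeftModel:
    """Hypothetical stand-in for a peft-wrapped model; mimics the shape
    of a disable_adapter() context manager, not peft's real internals."""

    def __init__(self, base_weight, lora_delta):
        self.base_weight = base_weight
        self.lora_delta = lora_delta
        self._adapter_enabled = True  # adapters active by default

    @contextmanager
    def disable_adapter(self):
        # Turn the LoRA path off for the duration of the context,
        # then restore the previous state on exit.
        prev = self._adapter_enabled
        self._adapter_enabled = False
        try:
            yield
        finally:
            self._adapter_enabled = prev

    def forward(self, x):
        out = self.base_weight * x
        if self._adapter_enabled:
            out += self.lora_delta * x
        return out

model = ToyPeftModel(base_weight=2.0, lora_delta=1.0)
print(model.forward(3.0))          # encoder pass: 6.0 + 3.0 = 9.0
with model.disable_adapter():
    print(model.forward(3.0))      # decoder pass: base only = 6.0
print(model.forward(3.0))          # adapters restored: 9.0
```

Using a context manager rather than a per-call flag keeps the model's forward signature unchanged, which is why leaning on the official peft mechanism is tidier than a custom enable_lora argument.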