logix-project / logix

AI Logging for Interpretability and Explainability🔬
Apache License 2.0

add lora when resuming #60

Closed hwijeen closed 9 months ago

hwijeen commented 9 months ago

I was trying to use initialize_from_log with a BERT model and found that we need to add the LoRA modules before loading the saved LoRA state dict. With the current code on the main branch, we run into the error below (a workaround sketch follows the traceback).

Traceback (most recent call last):
  File "/data/tir/projects/tir6/general/hahn2/analog/examples/bert_influence/compute_influence.py", line 106, in <module>
    analog.initialize_from_log()
  File "/data/tir/projects/tir6/general/hahn2/analog/analog/analog.py", line 385, in initialize_from_log
    assert name in self.model.state_dict(), f"{name} not in model!"
AssertionError: model.bert.encoder.layer.0.attention.self.query.analog_lora_A.weight not in model!
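For context, a minimal sketch of the workaround this issue describes: attach the LoRA modules before restoring the saved state dict, so the `analog_lora_*` parameters exist in the model when `initialize_from_log` checks them. The API names follow the traceback and this thread (`AnaLog`, `add_lora`, `initialize_from_log`); the exact constructor and `add_lora` signatures are assumptions, not confirmed from the codebase.

```python
# Assumed usage sketch, not verified against the analog main branch.
from analog import AnaLog
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

analog = AnaLog(project="bert_influence")  # constructor args assumed
analog.watch(model)
analog.add_lora()             # attach analog_lora_A/B modules first ...
analog.initialize_from_log()  # ... so the saved LoRA weights have matching params
```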
sangkeun00 commented 9 months ago

This makes things simpler! However, to handle the case where users call add_lora themselves, can you add a flag that tracks whether LoRA has already been added, and only perform add_lora when it hasn't?
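A rough sketch of the guard proposed above, assuming hypothetical internal names (`lora_added`, `_load_lora_state_dict`) rather than the actual codebase: `initialize_from_log` adds LoRA only if the user has not already done so.

```python
# Sketch only: the flag and helper names are illustrative, not analog's API.
class AnaLog:
    def __init__(self, project):
        self.project = project
        self.model = None
        self.lora_added = False  # flag proposed in the review comment

    def watch(self, model):
        self.model = model

    def add_lora(self):
        # ... attach analog_lora_A / analog_lora_B modules to watched layers ...
        self.lora_added = True

    def initialize_from_log(self):
        state_dict = self._load_lora_state_dict()  # hypothetical helper
        if not self.lora_added:
            # Add LoRA modules first so the saved analog_lora_* weights
            # have matching parameters in the model; skip if the user
            # already called add_lora separately.
            self.add_lora()
        for name in state_dict:
            assert name in self.model.state_dict(), f"{name} not in model!"
```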