yfzhang114 / LLaVA-Align

This is the official repo for Debiasing Large Visual Language Models, including a Post-Hoc debias method and a Visual Debias Decoding (VDD) strategy.
Apache License 2.0

'llava' is already used by a Transformers config, pick another name #5

Closed: luning1217 closed this issue 5 months ago

luning1217 commented 5 months ago

When I try to run it with LLaVA-1.5, I get this error: "'llava' is already used by a Transformers config, pick another name."

yfzhang114 commented 5 months ago

To address the issue, please make sure that a configuration named "llava" is not registered more than once in your codebase. If there are duplicates, consider renaming or removing them to avoid conflicts with the Transformers configuration registry.
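For illustration, the registration can also be guarded rather than left unconditional. A minimal sketch, assuming the classes involved are the `LlavaConfig` / `LlavaLlamaForCausalLM` pair that LLaVA registers at the bottom of `llava_llama.py` (as the traceback below shows):

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Sketch: at the bottom of llava_llama.py, where LlavaConfig and
# LlavaLlamaForCausalLM have just been defined, guard the registration
# instead of calling it unconditionally.
try:
    AutoConfig.register("llava", LlavaConfig)
    AutoModelForCausalLM.register(LlavaConfig, LlavaLlamaForCausalLM)
except ValueError:
    # "llava" is already registered, e.g. by a newer Transformers release
    # that ships a native llava config, or by a previously imported copy
    # of the llava package; keep the existing mapping instead of crashing.
    pass
```

Recent Transformers versions also expose an `exist_ok` argument on `AutoConfig.register` (visible in the traceback's frames below), so passing `exist_ok=True` achieves a similar effect without the `try/except`.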

Additionally, to provide further assistance, could you please provide more detailed information about the error you encountered? Specifically, we would appreciate it if you could share:

- The specific error message you received.
- The context in which the error occurred (e.g., during model training, inference, etc.).
- Any relevant code snippets or configuration files related to the error.

With this information, we'll be better equipped to help you troubleshoot and resolve the issue.

luning1217 commented 5 months ago

In my code:

```python
import json
import os
import torch
import sys
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig, pipeline
from peft import PeftModel, PeftConfig
import math
from llava.model.builder import load_pretrained_model
from llava.mm_utils import get_model_name_from_path
from llava.eval.run_llava import eval_model
from vcd_utils.vcd_sample import evolve_vcd_sampling
evolve_vcd_sampling()
import warnings
from transformers import logging
```

and the error is:

```
ValueError                                Traceback (most recent call last)
Cell In[3], line 15
     13 from llava.eval.run_llava import eval_model
     14 import warnings
---> 15 from vcd_utils.vcd_sample import evolve_vcd_sampling
     16 evolve_vcd_sampling()

File ~/autodl-tmp/LLaVA/vcd_utils/vcd_sample.py:23
     21 from transformers.generation.utils import SampleOutput
     22 from transformers.generation import SampleEncoderDecoderOutput, SampleDecoderOnlyOutput
---> 23 from experiments.llava.constants import IMAGE_TOKEN_INDEX
     25 def sample(
     26     self,
     27     input_ids: torch.LongTensor,
   (...)
     41 ) -> Union[SampleOutput, torch.LongTensor]:
     42     # init values
     43     logits_processor = logits_processor if logits_processor is not None else LogitsProcessorList()

File ~/autodl-tmp/LLaVA/experiments/llava/__init__.py:1
----> 1 from .model import LlavaLlamaForCausalLM

File ~/autodl-tmp/LLaVA/experiments/llava/model/__init__.py:1
----> 1 from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
      2 from .language_model.llava_mpt import LlavaMPTForCausalLM, LlavaMPTConfig

File ~/autodl-tmp/LLaVA/experiments/llava/model/language_model/llava_llama.py:199
    188 model_inputs.update(
    189     {
    190         "past_key_values": past_key_values,
   (...)
    195     }
    196 )
    197 return model_inputs
--> 199 AutoConfig.register("llava", LlavaConfig)
    200 AutoModelForCausalLM.register(LlavaConfig, LlavaLlamaForCausalLM)

File ~/miniconda3/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py:1153, in AutoConfig.register(model_type, config, exist_ok)
   1147 if issubclass(config, PretrainedConfig) and config.model_type != model_type:
   1148     raise ValueError(
   1149         "The config you are passing has a model_type attribute that is not consistent with the model type "
   1150         f"you passed (config has {config.model_type} and you passed {model_type}. Fix one of those so they "
   1151         "match!"
   1152     )
-> 1153 CONFIG_MAPPING.register(model_type, config, exist_ok=exist_ok)

File ~/miniconda3/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py:846, in _LazyConfigMapping.register(self, key, value, exist_ok)
    842 """
    843 Register a new configuration in this mapping.
    844 """
    845 if key in self._mapping.keys() and not exist_ok:
--> 846     raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
    847 self._extra_content[key] = value

ValueError: 'llava' is already used by a Transformers config, pick another name.
```

luning1217 commented 5 months ago

Is this error caused by my importing the llava package? But it seems that LLaVA cannot be used without importing it.

yfzhang114 commented 5 months ago

We already include an llava directory for evaluation here; you do not need to install the llava package.

luning1217 commented 5 months ago

Thanks

luning1217 commented 5 months ago

But I have a question: if I want to use this directly for LLaVA inference, how should I do it?

yfzhang114 commented 5 months ago

If you wish to use our method directly for LLaVA inference, you can do so: obtain the final prediction results from your model, then apply our calibration methods to them independently. This approach lets you integrate our debiasing calibration into your existing workflow.
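As a rough sketch of what "applying the calibration independently" could look like, assume a post-hoc correction that contrasts the logits for the real image with logits from a content-free (e.g. blank) image run of the same prompt. The function name, the `alpha` weight, and the exact formula below are illustrative assumptions, not the repo's actual implementation:

```python
import torch

def posthoc_debias(logits: torch.Tensor,
                   logits_contentfree: torch.Tensor,
                   alpha: float = 1.0) -> torch.Tensor:
    """Hypothetical post-hoc calibration: subtract a scaled copy of the
    logits produced for the same prompt with a meaningless image, so that
    predictions driven purely by language priors are down-weighted."""
    return logits - alpha * logits_contentfree

# Usage sketch: run the model twice at the final prediction step,
# once with the real image and once with a blank one, then calibrate.
# calibrated = posthoc_debias(logits_real_image, logits_blank_image)
# answer = calibrated.argmax(dim=-1)
```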

However, one consideration regarding the implementation: our VDD algorithm revises the sampling process. Because of these modifications, we recommend against using the pip-installed llava package directly within your current environment. Instead, create a new environment as outlined in our documentation; this ensures the calibration process operates smoothly and avoids conflicts with your existing setup.
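To make the caveat concrete, a contrastive decoding step of the following general shape is one way a revised sampler can combine image-conditioned and image-free logits. The names and the combination rule here are assumptions for illustration; the actual implementation lives in `vcd_utils/vcd_sample.py`:

```python
import torch

def debiased_next_token_probs(logits_with_image: torch.Tensor,
                              logits_without_image: torch.Tensor,
                              alpha: float = 1.0) -> torch.Tensor:
    """Hypothetical contrastive step: boost tokens whose evidence depends
    on the image and suppress tokens favored by the language prior alone.
    The actual VDD formulation may differ."""
    contrast = (1 + alpha) * logits_with_image - alpha * logits_without_image
    return torch.softmax(contrast, dim=-1)
```

Because a step like this replaces the plain softmax over logits inside the generation loop, it cannot simply be layered on top of a stock `generate` call, which is why a separately patched environment is recommended.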