huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

BitsAndBytesConfig Accepting Arbitrary Parameters without Throwing Error #29736

Closed · benjaminye closed this issue 8 months ago

benjaminye commented 8 months ago

System Info

Who can help?

@SunMarc @younesbelkada

Information

Tasks

Reproduction

  1. Module Imports

    from transformers import BitsAndBytesConfig, AutoModel
  2. Define BitsAndBytesConfig

    quant_config = BitsAndBytesConfig({"foo":"bar", "foo":"bar"})
    
    # also works with:
    # quant_config = BitsAndBytesConfig(**{"foo":"bar", "foo":"bar"})
    
    quant_config
    """
      >>>
      BitsAndBytesConfig {
      "_load_in_4bit": false,
      "_load_in_8bit": {
        "foo": "bar"
      },
      "bnb_4bit_compute_dtype": "float32",
      "bnb_4bit_quant_type": "fp4",
      "bnb_4bit_use_double_quant": false,
      "llm_int8_enable_fp32_cpu_offload": false,
      "llm_int8_has_fp16_weight": false,
      "llm_int8_skip_modules": null,
      "llm_int8_threshold": 6.0,
      "load_in_4bit": false,
      "load_in_8bit": {
        "foo": "bar"
      },
      "quant_method": "bitsandbytes"
    }
    """
  3. Load the Model with the Quant Config

    AutoModel.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0", 
                               quantization_config=quant_config, 
                               device_map="auto")
  4. The model loads successfully, with no error or warning telling the user that an invalid BitsAndBytesConfig was passed in.
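
For context on why nothing complains in step 2: this is plain Python argument binding rather than anything quantization-specific. The positional dict is bound to the first parameter of BitsAndBytesConfig.__init__ (load_in_8bit, which is exactly where it shows up in the printed config above), and unrecognized keyword arguments are absorbed by **kwargs and never used. A toy sketch of the same pattern (not the real class, just an illustration):

    class ToyBnbConfig:
        """Toy config mimicking the argument handling above; not the real BitsAndBytesConfig."""

        def __init__(self, load_in_8bit=False, load_in_4bit=False, **kwargs):
            # A positional dict binds to load_in_8bit with no type check...
            self.load_in_8bit = load_in_8bit
            self.load_in_4bit = load_in_4bit
            # ...and anything unrecognized is collected here and silently ignored.
            self.unused_kwargs = kwargs

    print(ToyBnbConfig({"foo": "bar"}).load_in_8bit)   # {'foo': 'bar'} instead of a bool
    print(ToyBnbConfig(foo="bar").unused_kwargs)       # {'foo': 'bar'}, silently dropped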

Expected behavior

amyeroberts commented 8 months ago

cc @younesbelkada

benjaminye commented 8 months ago

The easiest way to prevent this would be to add explicit checks in post_init() when instantiating BitsAndBytesConfig:

https://github.com/huggingface/transformers/blob/1248f0925234f97da9eee98da2aa22f7b8dbeda1/src/transformers/utils/quantization_config.py#L303-L309

I don't think there's a need to check arguments when they are passed in as kwargs, i.e. BitsAndBytesConfig(**{"foo":"bar", "foo":"bar"}). Maybe a helpful warning would be nice? (See the sketch below.)
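
For the kwargs path, something along these lines could emit the warning. This is only a sketch of the idea, written as a wrapper class rather than the actual patch; the class name WarningBitsAndBytesConfig and the exact message are made up here for illustration:

    import inspect
    import logging

    from transformers import BitsAndBytesConfig

    logger = logging.getLogger(__name__)


    class WarningBitsAndBytesConfig(BitsAndBytesConfig):
        """Hypothetical wrapper that warns about keyword arguments the base config ignores."""

        def __init__(self, **kwargs):
            # Parameters that BitsAndBytesConfig.__init__ declares explicitly.
            known = set(inspect.signature(BitsAndBytesConfig.__init__).parameters) - {"self", "kwargs"}
            unused = sorted(k for k in kwargs if k not in known)
            if unused:
                logger.warning("Unused kwargs passed to BitsAndBytesConfig and ignored: %s", unused)
            super().__init__(**kwargs)

This only covers keyword arguments; the positional case from the reproduction (a dict bound to load_in_8bit) would still need an explicit type check in post_init().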

younesbelkada commented 8 months ago

Hi @benjaminye, thanks for raising the issue! Yes, adding a warning makes sense. Would you like to open a PR for that? Otherwise happy to do it!

benjaminye commented 8 months ago

Hi @younesbelkada, PR raised: https://github.com/huggingface/transformers/pull/29761

younesbelkada commented 8 months ago

Thanks so much!

benjaminye commented 8 months ago

Thanks everyone! This was a good first issue to work on 😄