huggingface / optimum

🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy to use hardware optimization tools
https://huggingface.co/docs/optimum/main/
Apache License 2.0

open_clip model onnx export not supported #1450

Open · alachyan-ml opened this issue 1 year ago

alachyan-ml commented 1 year ago

System Info

Python 3.9.16
optimum 1.13.2
Ubuntu 20.04.6 LTS

Who can help?

@michaelbenayoun

Reproduction (minimal, reproducible, runnable)


Traceback (most recent call last):
  File "/raid/lingo/alachyan/anaconda3/envs/fan_trt/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/raid/lingo/alachyan/anaconda3/envs/fan_trt/lib/python3.9/site-packages/optimum/commands/optimum_cli.py", line 163, in main
    service.run()
  File "/raid/lingo/alachyan/anaconda3/envs/fan_trt/lib/python3.9/site-packages/optimum/commands/export/onnx.py", line 232, in run
    main_export(
  File "/raid/lingo/alachyan/anaconda3/envs/fan_trt/lib/python3.9/site-packages/optimum/exporters/onnx/__main__.py", line 323, in main_export
    model = TasksManager.get_model_from_task(
  File "/raid/lingo/alachyan/anaconda3/envs/fan_trt/lib/python3.9/site-packages/optimum/exporters/tasks.py", line 1683, in get_model_from_task
    model_class = TasksManager.get_model_class_for_task(
  File "/raid/lingo/alachyan/anaconda3/envs/fan_trt/lib/python3.9/site-packages/optimum/exporters/tasks.py", line 1148, in get_model_class_for_task
    tasks_to_model_loader = TasksManager._LIBRARY_TO_TASKS_TO_MODEL_LOADER_MAP[library]
KeyError: 'open_clip' 

Expected behavior

I get a KeyError when exporting an ONNX model based on the open_clip library. It seems that the `TasksManager._LIBRARY_TO_TASKS_TO_MODEL_LOADER_MAP` map does not have open_clip integrated.
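
To make the failure concrete, here is a minimal sketch of the lookup that raises (map contents abbreviated; the three registered keys are taken from optimum 1.13.2's tasks.py):

# Abbreviated from optimum/exporters/tasks.py (optimum 1.13.2): only these
# three libraries are registered, so any other inferred library name raises.
_LIBRARY_TO_TASKS_TO_MODEL_LOADER_MAP = {
    "transformers": {},  # task -> model loader entries elided for brevity
    "diffusers": {},
    "timm": {},
}

library = "open_clip"  # the library name inferred for an open_clip checkpoint
tasks_to_model_loader = _LIBRARY_TO_TASKS_TO_MODEL_LOADER_MAP[library]  # KeyError: 'open_clip'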

OpenAI CLIP exports as expected, without any problems:

% optimum-cli export onnx -m openai/clip-vit-base-patch32 --framework pt ./clip.onnx

fxmarty commented 1 year ago

Hi @alachyan-ml, thank you for the report!

I ran

optimum-cli export onnx -m openai/clip-vit-base-patch32 --framework pt clip_onnx

with

torch==2.1.0
optimum==1.13.2
transformers==4.34.0

and cannot reproduce this issue. Could you give more details about the issue you are facing?

isaac-chung commented 1 year ago

@fxmarty the model in your example is OpenAI's CLIP, not open_clip. I ran

optimum-cli export onnx -m laion/CLIP-ViT-B-32-laion2B-s34B-b79K --framework pt clip_onnx

with

optimum==1.13.2
torch==2.0.0
transformers==4.34.0

and got the same error message as @alachyan-ml did.

fxmarty commented 1 year ago

Hi @isaac-chung, I used the model id openai/clip-vit-base-patch32 shared by @alachyan-ml: https://github.com/huggingface/optimum/issues/1450#issue-1943412047

Let me have a look at laion/CLIP-ViT-B-32-laion2B-s34B-b79K and get back to you.

isaac-chung commented 1 year ago

@fxmarty it doesn't seem like @alachyan-ml actually provided the command they ran; they only mentioned that the OpenAI one works with no issue, which we confirmed 👌

> Open AI CLIP exporting working as expected without any problems. % optimum-cli export onnx -m openai/clip-vit-base-patch32 --framework pt ./clip.onnx

Thanks for taking a look! I started poking around and tried adding the following (still WIP, no success yet). In class `TasksManager`:

+        _OPEN_CLIP_TASKS_TO_MODEL_LOADERS = {
+            "zero-shot-image-classification": "create_model_and_transforms",
+        }

        _LIBRARY_TO_TASKS_TO_MODEL_LOADER_MAP = {
            "transformers": _TRANSFORMERS_TASKS_TO_MODEL_LOADERS,
            "diffusers": _DIFFUSERS_TASKS_TO_MODEL_LOADERS,
            "timm": _TIMM_TASKS_TO_MODEL_LOADERS,
+            "open_clip": _OPEN_CLIP_TASKS_TO_MODEL_LOADERS,
        }

In `TasksManager.get_model_from_task`:

+        if library_name == "open_clip":
+            model, _, _ = model_class(f"hf-hub:{model_name_or_path}", cache_dir=cache_dir)
+            TasksManager.standardize_model_attributes(
+                model_name_or_path, model, subfolder, revision, cache_dir, library_name
+            )
+            return model
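
For reference, a standalone sketch of what that registered loader boils down to (assuming open_clip is installed; the model id is the one from my command above):

import open_clip

# Same call the WIP branch makes: open_clip's create_model_and_transforms loads
# the checkpoint from the Hub via the "hf-hub:" prefix and returns a
# (model, preprocess_train, preprocess_val) tuple, hence the tuple unpack.
model, _, _ = open_clip.create_model_and_transforms(
    "hf-hub:laion/CLIP-ViT-B-32-laion2B-s34B-b79K"
)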

Here is where I'm at currently:

(v1) isaacchung@Isaacs-MBP optimum % optimum-cli export onnx -m laion/CLIP-ViT-B-32-laion2B-s34B-b79K --framework pt clip_onnx
Traceback (most recent call last):
  File "/Users/isaacchung/virtualenv/v1/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/Users/isaacchung/work/optimum/optimum/commands/optimum_cli.py", line 163, in main
    service.run()
  File "/Users/isaacchung/work/optimum/optimum/commands/export/onnx.py", line 239, in run
    main_export(
  File "/Users/isaacchung/work/optimum/optimum/exporters/onnx/__main__.py", line 354, in main_export
    model_type = "stable-diffusion" if is_stable_diffusion else model.config.model_type.replace("_", "-")
  File "/Users/isaacchung/virtualenv/v1/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'CLIP' object has no attribute 'config'
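
The AttributeError happens because open_clip's CLIP is a plain torch.nn.Module with no HF-style `config` attribute, while `main_export` reads `model.config.model_type`. A rough sketch of one possible direction (the hand-attached config and the "clip" model_type are assumptions for illustration, not anything optimum defines):

from types import SimpleNamespace

import open_clip

model, _, _ = open_clip.create_model_and_transforms(
    "hf-hub:laion/CLIP-ViT-B-32-laion2B-s34B-b79K"
)
# Attach a minimal config by hand so that main_export's
# `model.config.model_type.replace("_", "-")` lookup no longer raises; "clip"
# is a hypothetical model_type chosen only for illustration.
model.config = SimpleNamespace(model_type="clip")
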
isaac-chung commented 1 year ago

I could gather these into a PR, and maybe we could continue the discussion on the fix there?

fxmarty commented 1 year ago

For sure, thank you @isaac-chung!

alachyan-ml commented 1 year ago

Hi, sorry, I was away for a few days, but @isaac-chung is correct. I forgot to include the command that yielded the error, but laion/CLIP-ViT-B-32-laion2B-s34B-b79K is the model I was trying to export. I took a look at the thread and it seems like this is being worked on. Thanks!

nemphys commented 10 months ago

Hi, any news on this one? I was trying to convert the same model (CLIP-ViT-B-32-laion2B-s34B-b79K) and got the exact same error.

sushilkhadkaanon commented 2 months ago

Hi @alachyan-ml, @nemphys, @fxmarty, @isaac-chung, do the code changes work for you?

alachyan-ml commented 1 month ago

Hey, picking this up again. Let me take a look.