Loading HuggingFace model from meta-llama/Llama-2-7b-chat
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\nakersha\AppData\Local\miniconda3\envs\multilora\Scripts\olive.exe\__main__.py", line 7, in <module>
  File "C:\Users\nakersha\AppData\Local\miniconda3\envs\multilora\Lib\site-packages\olive\cli\launcher.py", line 60, in main
    service.run()
  File "C:\Users\nakersha\AppData\Local\miniconda3\envs\multilora\Lib\site-packages\olive\cli\auto_opt.py", line 147, in run
    run_config = self.get_run_config(tempdir)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\nakersha\AppData\Local\miniconda3\envs\multilora\Lib\site-packages\olive\cli\auto_opt.py", line 174, in get_run_config
    to_replace.append(("passes", self._get_passes_config(config["passes"], olive_config)))
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\nakersha\AppData\Local\miniconda3\envs\multilora\Lib\site-packages\olive\cli\auto_opt.py", line 258, in _get_passes_config
    (("model_builder", "metadata_only"), self.args.input_model["type"].lower() == "onnxmodel"),
                                         ^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'Namespace' object has no attribute 'input_model'. Did you mean: 'input_cols'?

Describe the bug
The olive auto-opt CLI command errors out when the --provider option is supplied.

To Reproduce
olive auto-opt -m meta-llama/Llama-2-7b-chat --adapter_path wsvn53/Llama-2-7b-chat-lora-tricky_math -o models\Llama-2-7b-chat-LoRA --use_model_builder --provider DmlExecutionProvider

Expected behavior
The optimized model is generated in models\Llama-2-7b-chat-LoRA without errors.

Olive config
No Olive config was used; the run was driven entirely by CLI options.

Olive logs
See the traceback at the top of this report; a short analysis of the failing attribute lookup is under Additional context below.

Other information
olive-ai 0.7.0
onnx 1.17.0
onnxruntime 1.21.0.dev20241030004
onnxruntime-genai 0.5.0rc1

Additional context
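As a rough sketch of the failure mechanism (this is standard argparse behavior, not a claim about Olive internals beyond what the traceback shows): an argparse.Namespace only carries attributes that were registered on the parser or assigned afterwards, so dereferencing self.args.input_model on a namespace that never defined input_model raises exactly this AttributeError. The --input_cols flag below is hypothetical, chosen only to mirror the "Did you mean: 'input_cols'?" hint:

import argparse

parser = argparse.ArgumentParser()
# Hypothetical flag standing in for whatever auto-opt actually registers;
# note that no --input_model argument is ever added.
parser.add_argument("--input_cols")

args = parser.parse_args([])   # Namespace(input_cols=None)
print(args.input_cols)         # fine: the attribute was registered

# The same kind of lookup that auto_opt.py performs in _get_passes_config
# fails, because the attribute was never added to the namespace. On
# Python 3.10+ the traceback also appends the close-match suggestion:
#   AttributeError: 'Namespace' object has no attribute 'input_model'.
#   Did you mean: 'input_cols'?
print(args.input_model)

If the crash really is keyed to the --provider path, as the behavior suggests, rerunning the command without --provider DmlExecutionProvider may serve as a temporary workaround, at the cost of not targeting DirectML.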