When trying to run the controller following the README instructions, I hit this issue on both Colab and RunPod (PyTorch template).
```
Traceback (most recent call last):
  File "/workspace/miniconda3/envs/llava/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/workspace/miniconda3/envs/llava/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/workspace/BakLLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/workspace/BakLLaVA/llava/model/__init__.py", line 2, in <module>
    from .language_model.llava_mistral import LlavaMistralForCausalLM, LlavaConfig
  File "/workspace/BakLLaVA/llava/model/language_model/llava_mistral.py", line 22, in <module>
    from transformers import AutoConfig, AutoModelForCausalLM, \
ImportError: cannot import name 'MistralConfig' from 'transformers' (/workspace/miniconda3/envs/llava/lib/python3.10/site-packages/transformers/__init__.py)
```
You can see in the image above that the prior transformers version was uninstalled and the library was re-downloaded with the editable llava install specified in the instructions.
Both environments were fresh and used only for this repo. I had been running the base LLaVA project in both earlier today, so I know the platforms can host these projects.
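For context, this kind of `ImportError` usually means the installed `transformers` predates Mistral support (MistralConfig landed in transformers 4.34.0, to the best of my knowledge). A quick sanity check you can run in the environment, with a small hypothetical version-comparison helper:

```python
def parse_version(v: str) -> tuple:
    """Parse a version string like '4.31.0' into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

def supports_mistral(transformers_version: str) -> bool:
    """Assumption: MistralConfig first shipped in transformers 4.34.0."""
    return parse_version(transformers_version) >= (4, 34, 0)

# Example: compare the version pip reports against the minimum.
# In the real environment you would use transformers.__version__ instead.
print(supports_mistral("4.31.0"))  # an older release without MistralConfig
print(supports_mistral("4.34.0"))  # first release expected to work
```

If the check fails, `pip install -U transformers` inside the env (or re-running the editable install after confirming which `pip` the conda env resolves to) may be worth trying; a stale `transformers` from a different site-packages path shadowing the fresh one would produce exactly this traceback.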