henk717 / KoboldAI

KoboldAI is generative AI software optimized for fictional use, but capable of much more!
http://koboldai.com
GNU Affero General Public License v3.0

Loading directly from HF can cause 0_Layers KeyError #514

Open mrseeker opened 7 months ago

mrseeker commented 7 months ago
```
> File "aiserver.py", line 743, in g
    return f(*a, **k)
           │  │    └ {}
           │  └ ({'custom_model_name': 'aurora-m/aurora-m-biden-harris-redteamed', 'class': 'model', 'label': 'Load custom Pytorch model from...
           └ <function UI_2_load_model at 0x7f458b321550>

  File "aiserver.py", line 6341, in UI_2_load_model
    model_backends[data['plugin']].set_input_parameters(data)
    │              │                                    └ {'custom_model_name': 'aurora-m/aurora-m-biden-harris-redteamed', 'class': 'model', 'label': 'Load custom Pytorch model from ...
    │              └ {'custom_model_name': 'aurora-m/aurora-m-biden-harris-redteamed', 'class': 'model', 'label': 'Load custom Pytorch model from ...
    └ {'KoboldAI API': <modeling.inference_models.api.class.model_backend object at 0x7f44cfefa880>, 'KoboldAI Old Colab Method': <...

  File "/opt/koboldai/modeling/inference_models/generic_hf_torch/class.py", line 81, in set_input_parameters
    super().set_input_parameters(parameters)
                                 └ {'custom_model_name': 'aurora-m/aurora-m-biden-harris-redteamed', 'class': 'model', 'label': 'Load custom Pytorch model from ...

  File "/opt/koboldai/modeling/inference_models/hf_torch.py", line 124, in set_input_parameters
    ret = super().set_input_parameters(parameters)
                                       └ {'custom_model_name': 'aurora-m/aurora-m-biden-harris-redteamed', 'class': 'model', 'label': 'Load custom Pytorch model from ...

  File "/opt/koboldai/modeling/inference_models/hf.py", line 174, in set_input_parameters
    if isinstance(parameters["{}_Layers".format(i)], str) and parameters["{}_Layers".format(i)].isnumeric():
                  │                             │             │                             └ 0
                  │                             │             └ {'custom_model_name': 'aurora-m/aurora-m-biden-harris-redteamed', 'class': 'model', 'label': 'Load custom Pytorch model from ...
                  │                             └ 0
                  └ {'custom_model_name': 'aurora-m/aurora-m-biden-harris-redteamed', 'class': 'model', 'label': 'Load custom Pytorch model from ...

KeyError: '0_Layers'
```

When loading a model via "Load Custom model from Huggingface" without waiting for the layer slider to appear, KoboldAI crashes with this error. A workaround is to load the model during initialisation instead.

Expected result: once the config is loaded, the UI should refuse to start the model load until the parameter settings (including the per-device layer counts) are valid.
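For illustration, a minimal defensive sketch of the check that fails in `hf.py` line 174. The helper name `layer_count_is_numeric` is hypothetical; the idea is just that using `dict.get` instead of direct indexing avoids the `KeyError` when the slider values were never submitted. The proper fix suggested above is still UI-side validation, so this only shows how the backend could fail gracefully:

```python
def layer_count_is_numeric(parameters: dict, i: int) -> bool:
    """Hypothetical defensive variant of the check in
    modeling/inference_models/hf.py: returns False instead of
    raising KeyError when the '<i>_Layers' key is missing."""
    value = parameters.get("{}_Layers".format(i))
    return isinstance(value, str) and value.isnumeric()


# The original code indexes parameters["{}_Layers".format(i)] directly,
# which raises KeyError('0_Layers') when the UI never sent the sliders.
```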