Closed · PlanetMacro closed this issue 1 month ago
Considering your configuration, the appropriate modelID should be: ollama:ollama/llama2:latest
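Judging from this answer and the variable names in the traceback below (`modelType`, `modelName`), a modelID appears to take the form `<modelType>:<modelName>`, split at the first colon, so the `:latest` tag stays part of the model name. A minimal sketch of that split; the function is illustrative, not AIlice's actual code:

```python
def parse_model_id(model_id: str) -> tuple[str, str]:
    # Split only at the first ":" so tags like ":latest" survive
    # inside the model name.
    model_type, _, model_name = model_id.partition(":")
    return model_type, model_name

print(parse_model_id("ollama:ollama/llama2:latest"))
# -> ('ollama', 'ollama/llama2:latest')
```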
OK, that makes sense. However, it gives another error:
```
Encountered an exception, AIlice is exiting: 'formatter'
  File "/home/user/AIlice/ailice/AIliceMain.py", line 126, in main
    mainLoop(**kwargs)
  File "/home/user/AIlice/ailice/AIliceMain.py", line 91, in mainLoop
    llmPool.Init([modelID])
  File "/home/user/AIlice/ailice/core/llm/ALLMPool.py", line 21, in Init
    self.pool[id] = MODEL_WRAPPER_MAP[config.models[modelType]["modelWrapper"]](modelType=modelType, modelName=modelName)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/AIlice/ailice/core/llm/AModelChatGPT.py", line 17, in __init__
    self.formatter = CreateFormatter(modelCfg["formatter"], tokenizer = self.tokenizer, systemAsUser = modelCfg['systemAsUser'])
```
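The bare `'formatter'` in the exit message looks like the string form of a `KeyError`: line 17 of `AModelChatGPT.py` reads `modelCfg["formatter"]`, and when the model's config entry has no such key, Python raises `KeyError('formatter')`, whose `str()` is just the quoted key name. A minimal reproduction, using an illustrative dict rather than the real config object:

```python
model_cfg = {"contextWindow": 8192}  # model entry without a "formatter" key

try:
    formatter_name = model_cfg["formatter"]
except KeyError as e:
    # str(e) is "'formatter'" -- the same terse message as in the log
    print(f"Encountered an exception, AIlice is exiting: {e}")
```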
It seems there was a misleading error in my documentation (strange that such an error would occur). The correct configuration requires adding an entry:

```json
"ollama/llama2:latest": {
  "formatter": "AFormatterGPT",
  "contextWindow": 8192,
  "systemAsUser": false
}
```
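In the context of the whole file, and assuming the nesting implied by `config.models[modelType]` in the traceback, the Ollama section might look roughly like the sketch below; the `"modelWrapper"` value and the `"modelList"` key are inferences from the traceback, not confirmed by this thread:

```json
{
  "models": {
    "ollama": {
      "modelWrapper": "AModelChatGPT",
      "modelList": {
        "ollama/llama2:latest": {
          "formatter": "AFormatterGPT",
          "contextWindow": 8192,
          "systemAsUser": false
        }
      }
    }
  }
}
```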
I tried to include an Ollama model in /home/user/.config/ailice/config.json as explained in the README. The config.json looks like this:
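The config contents from the original post aren't shown above. For anyone hitting the same `KeyError`, a small pre-flight check along these lines can report model entries that lack the keys from the fix before AIlice starts. The `"modelList"` nesting is the same assumption as in the sketch above, and `check_model_entries` is a hypothetical helper, not part of AIlice:

```python
import json

# Keys from the corrected entry above; only "formatter" and "systemAsUser"
# are provably read in the traceback, "contextWindow" is included because
# the maintainer's fix sets it.
REQUIRED_KEYS = {"formatter", "contextWindow", "systemAsUser"}

def check_model_entries(config_path: str) -> None:
    """Report model entries whose missing keys would reproduce the
    KeyError('formatter') seen in the traceback above."""
    with open(config_path) as f:
        config = json.load(f)
    for model_type, type_cfg in config.get("models", {}).items():
        for name, entry in type_cfg.get("modelList", {}).items():
            missing = REQUIRED_KEYS - entry.keys()
            if missing:
                print(f"{model_type}:{name} is missing: {sorted(missing)}")

check_model_entries("/home/user/.config/ailice/config.json")
```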