myshell-ai / AIlice

AIlice is a fully autonomous, general-purpose AI agent.
MIT License

Small features / corrections #36

Closed benda95280 closed 1 month ago

benda95280 commented 1 month ago

Dear,

Would it be possible to check:

Thanks!

stevenlu137 commented 1 month ago

I've fixed the issue in the Dockerfile and added the share option.

Regarding your first suggestion, I need more details. Were you encountering a lack of detailed error messages when loading something?

For the second suggestion, regardless of which model you run, AIlice will automatically create the config.json file if it doesn't exist. However, when using models that require an API key (and no key is found in config.json), AIlice will prompt you for the API key and update it in config.json. This is an on-demand mechanism for updating config.json. If this mechanism does not meet your needs in certain scenarios, please let me know (I currently feel that this approach works quite well).
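For reference, here is a minimal sketch of what such an on-demand mechanism could look like. The file layout and field names ("models", "apikey") are assumptions for illustration, not AIlice's actual code:

```python
import json
import os

CONFIG_PATH = os.path.expanduser("~/.config/ailice/config.json")

def ensure_api_key(model_type: str) -> dict:
    """Load config.json (creating a skeleton on first run) and prompt for an
    API key only when the selected model type declares one and none is stored."""
    if os.path.exists(CONFIG_PATH):
        with open(CONFIG_PATH) as f:
            config = json.load(f)
    else:
        # First run: write a default skeleton so config.json exists afterwards.
        os.makedirs(os.path.dirname(CONFIG_PATH), exist_ok=True)
        config = {"models": {"oai": {"apikey": None}}}

    model = config["models"].get(model_type, {})
    if "apikey" in model and model["apikey"] is None:
        # On-demand update: only ask for the key at the moment it is needed.
        key = input(f"Your {model_type} api-key (press Enter if not): ").strip()
        model["apikey"] = key or None
        config["models"][model_type] = model

    with open(CONFIG_PATH, "w") as f:
        json.dump(config, f, indent=2)
    return config
```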

benda95280 commented 1 month ago

On the first try, the command below does not create the config file:

root@ailice:/home/benda/AIlice/AIlice/ailice# python3 AIliceWeb.py --modelID=lm-studio:Nexesenex/MIstral-QUantized-70b_Miqu-1-70b-iMat.GGUF --contextWindowRatio=0.4
config.json is located at /root/.config/ailice
Encountered an exception, AIlice is exiting: 'lm-studio'
  File "/home/benda/AIlice/AIlice/ailice/AIliceWeb.py", line 224, in main
    mainLoop(**kwargs)
  File "/home/benda/AIlice/AIlice/ailice/AIliceWeb.py", line 32, in mainLoop
    config.Initialize(modelID = modelID)
  File "/home/benda/AIlice/AIlice/ailice/common/AConfig.py", line 176, in Initialize
    needAPIKey = ("apikey" in self.models[modelType] and (self.models[modelType]["apikey"] is None))
                              ~~~~~~~~~~~^^^^^^^^^^^

While the command below does:

root@ailice:/home/benda/AIlice/AIlice/ailice# python3 AIliceWeb.py --modelID=oai:gpt-4o
config.json is located at /root/.config/ailice
config.json need to be updated.
********************** Initialize *****************************
Your oai api-key (press Enter if not):
********************** End of Initialization *****************************
stevenlu137 commented 1 month ago

The use case lm-studio:Nexesenex/MIstral-QUantized-70b_Miqu-1-70b-iMat.GGUF requires prior configuration of LMStudio and config.json. Although this is noted in the README, it may not be prominently displayed and could potentially mislead users. You can refer to Example 2 in "How to Add LLM Support" for configuring the model, after which it can be used. All other use cases listed can be used directly.
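For anyone hitting the same error, a rough sketch of adding such an entry might look like the following. The field names ("baseurl", "apikey") and the endpoint URL are illustrative assumptions; copy the real structure from Example 2 in "How to Add LLM Support":

```python
import json
import os

config_path = os.path.expanduser("~/.config/ailice/config.json")
with open(config_path) as f:
    config = json.load(f)

# Hypothetical entry for an OpenAI-compatible endpoint served locally by LMStudio.
# These field names are not AIlice's schema; see the README for the real one.
config.setdefault("models", {})["lm-studio"] = {
    "baseurl": "http://localhost:1234/v1",
    "apikey": None,
}

with open(config_path, "w") as f:
    json.dump(config, f, indent=2)
```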

As for creating the configuration file, you only need to run AIlice once normally, and it will generate the config.json file.

benda95280 commented 1 month ago

> As for creating the configuration file, you only need to run AIlice once normally, and it will generate the config.json file.

Maybe my bad, but I was not able to find the "run normally" way for AIlice?

stevenlu137 commented 1 month ago

You can run any test case in the list directly except this one. The model lm-studio:Nexesenex/MIstral-QUantized-70b_Miqu-1-70b-iMat.GGUF does not exist until you complete the configuration of LMStudio and config.json.

stevenlu137 commented 1 month ago

In other words, the model ID is defined in config.json. All other test cases in the list use predefined model IDs, except for this one, which requires the user to configure and define it themselves.

stevenlu137 commented 1 month ago

I have just made some modifications. In the latest commit, config.json is guaranteed to exist after the first run of AIlice, and an error message is provided when the specified model ID does not exist.
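The kind of guard that replaces the bare KeyError could look roughly like this (illustrative only; the actual fix lives in AConfig.py):

```python
def check_model_type(models: dict, model_type: str) -> None:
    """Fail with a readable message instead of a bare KeyError when the
    model ID prefix (e.g. 'lm-studio') is not defined in config.json."""
    if model_type not in models:
        known = ", ".join(sorted(models)) or "<none>"
        raise ValueError(
            f"Model type '{model_type}' is not defined in config.json. "
            f"Known model types: {known}. "
            "See 'How to Add LLM Support' in the README to add it."
        )
```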

benda95280 commented 1 month ago

Awesome, thanks!