brunoalbertopeixoto opened 2 months ago
Hi @brunoalbertopeixoto to clarify, there is no monthly fee to use this plugin. However, you do have to pay OpenAI for the GPT API usage on a pay-per-use basis.
Having said that, I do agree it would be nice to support other LLMs especially open-sourced ones.
I am not a programmer, but I have been able to tinker with some projects and use open-source LLMs. Most open-source LLM frameworks are compatible with OpenAI API calls, so I believe that if you allowed the base URL to be modified, people could easily use their own models (through Ollama, LM Studio, or other cloud providers).
Hi @Ehesh, you can change the API Base URL under Zotero Preferences > Aria > Model Configuration. If an open-source LLM uses the same API signature as OpenAI's, it might work. Good luck and let me know!
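To make the "same API signature" point concrete, here is a minimal sketch of the kind of request the plugin would send once the base URL is changed. It assumes a local Ollama server on its default port (`http://localhost:11434/v1`) and a hypothetical model name `llama3`; adjust both for LM Studio or any other OpenAI-compatible server. If a request like this works against your endpoint, the base-URL swap has a good chance of working too.

```python
# Sketch of an OpenAI-style chat completion request against a local
# OpenAI-compatible server, using only the Python standard library.
# BASE_URL and the model name are assumptions -- change them to match
# your own setup (Ollama, LM Studio, etc.).
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # assumed: Ollama's default endpoint

payload = {
    "model": "llama3",  # assumed: a model you have already pulled locally
    "messages": [
        {"role": "user", "content": "Summarize this paper in one sentence."}
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Local servers typically ignore the key, but the OpenAI client
        # format still expects the header to be present.
        "Authorization": "Bearer not-needed",
    },
)

# Uncomment to actually send the request to a running server:
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())["choices"][0]["message"]["content"]
#     print(reply)
```

Note this only demonstrates the request shape; whether Aria itself works depends on the server implementing the same response schema as OpenAI's `/v1/chat/completions`.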
Please create a tutorial on how to do this. I bought the OpenAI API, but I would like to work with an open-source alternative without limits. I would also like to work in Brazilian Portuguese.
The system is currently optimized for the OpenAI GPT models. I'd like to see broader model support, especially for open-source ones. However, I really don't have the time to work on it. Community effort is welcome 🤗
I have an OpenAI-compatible API endpoint, and I get this error:
{
  "name": "Error",
  "message": "input values have 2 keys, you must specify an input key or pass only 1 key as input",
  "stack": "getInputValue@resource://gre/modules/addons/XPIProvider.jsm -> jar:file://extensions/aria@apex974.com.xpi!/bootstrap.js -> jar:file://extensions/aria@apex974.com.xpi!/chrome/content/scripts/index.js:53279:11
  saveContext@resource://gre/modules/addons/XPIProvider.jsm -> jar:file://extensions/aria@apex974.com.xpi!/bootstrap.js -> jar:file://extensions/aria@apex974.com.xpi!/chrome/content/scripts/index.js:53398:43
  invoke@resource://gre/modules/addons/XPIProvider.jsm -> jar:file://extensions/aria@apex974.com.xpi!/bootstrap.js -> jar:file://extensions/aria@apex974.com.xpi!/chrome/content/scripts/index.js:20784:17"
}
I don't need an API key, so I simply pass a random one. What does this message mean? "input values have 2 keys, you must specify an input key or pass only 1 key as input". My API doesn't show any error:

INFO: 127.0.0.1:51295 - "POST /v1/chat/completions HTTP/1.1" 200 OK
Consumed 0.0$
INFO: 127.0.0.1:51295 - "POST /v1/chat/completions HTTP/1.1" 200 OK
Hi @dcmumby, I suspect this error is not related to your API endpoint. Do you always get the same error, even with different questions?
Same error regardless of the question.
As a suggestion, I'd like to see support for open-source and free LLM APIs, such as Gemini, as alternatives. Paying 20 dollars a month is unfeasible.