Open sean89503 opened 1 year ago
First of all, you could set up the development environment with logging so you can see which contents are sent to OpenAI;
Second, the project currently only sends the title of the article to OpenAI, as shown here: https://github.com/reply2future/xExtension-NewsAssistant/blob/b1e2d2f4626db7c8047cb507c177a4363c2d2956/Controllers/assistantController.php#L100-L110
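For reference, here is a minimal sketch (in Python rather than the extension's PHP, purely for illustration) of what the linked code amounts to. The model name and prompt wording below are assumptions, not the plugin's exact values:

```python
import json

OPENAI_CHAT_URL = "https://api.openai.com/v1/chat/completions"  # where the payload is POSTed

def build_summary_request(title: str) -> dict:
    """Build an OpenAI chat-completions payload containing only the article title."""
    return {
        "model": "gpt-3.5-turbo",  # assumed default; the plugin's model may differ
        "messages": [
            # Only the title reaches the model; the article body is never sent.
            {"role": "user", "content": f"Summarize this article: {title}"},
        ],
    }

payload = build_summary_request("FreshRSS 1.21 released")
print(json.dumps(payload, indent=2))
```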
I would like to add a new configuration option to select which content is sent.
BTW, would you like a screenshot of the extension's configuration UI?
I have added a field configuration option so you can select whether the title or the content is sent to OpenAI;
I have also made the error response from the OpenAI API call friendlier, and the error message is now displayed on the HTML page.
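A hedged sketch of that error handling, assuming OpenAI's documented `{"error": {"message": ...}}` response envelope; the fallback wording is my own, not the plugin's:

```python
import json

def friendly_error(response_body: str) -> str:
    """Turn a raw API error body into a human-readable message for the HTML page."""
    try:
        data = json.loads(response_body)
        # OpenAI errors arrive as {"error": {"message": "...", ...}}
        message = data.get("error", {}).get("message")
        if message:
            return f"OpenAI API error: {message}"
    except (ValueError, AttributeError):
        # Not JSON, or not the expected shape: fall through to the generic text.
        pass
    return "OpenAI API error: unexpected response from the server."
```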
Thank you! That helps.
Do you think we can config it to point to a local install of LLaMA?
Okay, we would need to standardize LLaMA's invocation API interface. In other words, LLaMA would need to be callable over HTTP, with the request address configurable in this plugin. The plugin would only send the requests; it would not handle wrapping the model behind an interface itself.
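To sketch the idea: if the plugin only needs a configurable base URL, any server exposing an OpenAI-compatible HTTP interface (a local LLaMA server, for instance) can be dropped in. The URLs and model name below are illustrative assumptions:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completions request against a configurable endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Point at OpenAI or at a local server by changing only the configured base URL:
remote_req = build_chat_request("https://api.openai.com", "gpt-3.5-turbo", "Summarize: ...")
local_req = build_chat_request("http://localhost:8080", "llama-2-7b", "Summarize: ...")
```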
There are some projects already doing this:
> Do you think we can config it to point to a local install of LLaMA?
This is a good idea!
Related to this topic, so posting here. A summary is generated, but it is incomplete:
I have everything set up correctly, and I can see the API is being used via OpenAI's portal, but nothing is summarized.