ai-shifu / ChatALL

Concurrently chat with ChatGPT, Bing Chat, Bard, Alpaca, Vicuna, Claude, ChatGLM, MOSS, 讯飞星火, 文心一言 and more, discover the best answers
https://chatall.ai
Apache License 2.0

[FEAT] Background API option in settings #115

Closed Explosion-Scratch closed 1 year ago

Explosion-Scratch commented 1 year ago

Is your feature request related to a problem? Please describe. Currently it is hard to use the APIs of many AI services (e.g. Bard, Bing, Claude) in your own projects, especially their unofficial APIs. Having a background service in this app that hosts a local server would be a good solution.

Describe the solution you'd like An optional, opt-in background service that you can enable in settings to query the AIs.

Describe alternatives you've considered Making my own service for this, but you guys have already implemented standardized request frameworks for each of the AIs.
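
For illustration only, here is a minimal sketch of the kind of opt-in local server being requested. It assumes Express and some already-configured bot object with an `askAndWait(prompt)` method; none of these names come from ChatALL itself.

```js
// Hypothetical sketch, not ChatALL code. "bot" stands in for whatever adapter
// actually talks to a model; askAndWait(prompt) is an assumed method name.
const express = require("express");

function startLocalApi(bot, port = 3210) {
  const app = express();
  app.use(express.json());

  // POST { "prompt": "..." } -> { "answer": "..." }
  app.post("/api/ask", async (req, res) => {
    try {
      const answer = await bot.askAndWait(req.body.prompt);
      res.json({ answer });
    } catch (err) {
      res.status(500).json({ error: String(err) });
    }
  });

  return app.listen(port, () => console.log(`local API listening on :${port}`));
}

module.exports = { startLocalApi };
```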

sunner commented 1 year ago

This feature is not intended to be exposed to end users. However, I have an approach similar to yours in mind.

I want to create abstract APIs that can handle all the LLMs' official or unofficial APIs. This would be a separate library project.
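
As a rough illustration of what such an abstraction could look like (a sketch only; the class and method names are made up, not taken from any existing library):

```js
// Hypothetical sketch of a unified bot abstraction; not an existing library.
class AbstractBot {
  // Return true if the underlying service is logged in / reachable.
  async checkAvailability() {
    throw new Error("not implemented");
  }

  // Send a prompt; call onUpdate(partialText, done) as the answer arrives.
  async sendPrompt(prompt, onUpdate) {
    throw new Error("not implemented");
  }
}

// Each official or unofficial API would get its own subclass, e.g.:
class ExampleEchoBot extends AbstractBot {
  async checkAvailability() {
    return true; // e.g. verify a session cookie or API key here
  }
  async sendPrompt(prompt, onUpdate) {
    onUpdate(`echo: ${prompt}`, true); // placeholder instead of a real API call
  }
}

module.exports = { AbstractBot, ExampleEchoBot };
```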

Are you working on this? Where can I find the repo?

Explosion-Scratch commented 1 year ago

> Are you working on this? Where can I find the repo?

Not yet. I would be, but I've been pretty busy lately. I'd be happy to try to start a prototype today, though.

Explosion-Scratch commented 1 year ago

I'm getting this error while installing with pnpm i:

 ERROR  Failed to compile with 1 error                                                                                                                                8:26:02 AM

 error  in ./src/components/Messages/ChatMessage.vue?vue&type=script&setup=true&lang=js

Module not found: Error: Can't resolve 'highlight.js/styles/github.css' in '/Users/tjs/Documents/.coding/repos/ChatALL/src/components/Messages'

 ERROR  Error: Build failed with errors.
Error: Build failed with errors.

So I switched to yarn; it took way longer to install, but it worked after I switched my Node version. Now I'm getting this error:

Module not found: Error: Default condition should be last one

It seems this was an issue with transitive-bullshit/chatgpt-api, but I don't see that package used anywhere in the app, nor a way to upgrade it.

The full error is this:

➜  ChatALL git:(main) ✗ yarn serve  
yarn run v1.22.19
$ vue-cli-service serve
 INFO  Starting development server...

 ERROR  Failed to compile with 1 error                              8:33:12 AM

 error  in ./src/main.js

Module not found: Error: Default condition should be last one

ERROR in ./src/main.js 12:0-40
Module not found: Error: Default condition should be last one

webpack compiled with 1 error
sunner commented 1 year ago

I have never tried yarn or pnpm on this project. However, I ran into the same Module not found error today. It is caused by vuetify 3.3.x. Have you upgraded it? ChatALL is using 3.2.x.

Explosion-Scratch commented 1 year ago

> I have never tried yarn or pnpm on this project. However, I ran into the same Module not found error today. It is caused by vuetify 3.3.x. Have you upgraded it? ChatALL is using 3.2.x.

Ah, that's why. A fresh install pulls in the latest version below 4. The package.json currently says "vuetify": "^3.2.3", meaning that any 3.x.x version at or above 3.2.3 satisfies it. To limit it to 3.2.x you'd want ~3.2.3 (3.2.3 → 3.2.x).
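
To make the caret-versus-tilde difference concrete, a quick check with the semver package (used here only for illustration; this is not a statement about ChatALL's dependencies):

```js
// Illustration of npm version ranges with the semver package.
const semver = require("semver");

console.log(semver.satisfies("3.3.0", "^3.2.3")); // true:  caret allows any 3.x at or above 3.2.3
console.log(semver.satisfies("3.3.0", "~3.2.3")); // false: tilde stays within 3.2.x
console.log(semver.satisfies("3.2.9", "~3.2.3")); // true
```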

Explosion-Scratch commented 1 year ago

I tried creating a server.js file, but it won't let me import any of the bots. It gives this error:

 error  in ./src/bots/openai/ChatGPT35Bot.js

Module parse failed: Unexpected token (4:20)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
| 
| export default class ChatGPT35Bot extends ChatGPTBot {
>   static _className = "ChatGPT35Bot"; // Class name of the bot
|   static _logoFilename = "chatgpt-35-logo.png"; // Place it in assets/bots/
|   static _model = "text-davinci-002-render-sha";

 @ ./src/server.js 1:0-54
 @ ./src/background.js
 @ multi ./src/background.js

This appears to be related to Babel, but adding babel-preset-env alongside the current Vue Babel preset breaks other things.
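
One possible direction, sketched under the assumption that server.js gets pulled into the Electron main-process bundle (background.js) and that this bundle is not going through babel-loader, which would explain webpack's "no loaders are configured" message. chainWebpackMainProcess is the hook documented by vue-cli-plugin-electron-builder for adjusting that config; the snippet below is untested and the loader/preset would need to be installed:

```js
// vue.config.js: hypothetical addition, not the project's actual configuration.
module.exports = {
  pluginOptions: {
    electronBuilder: {
      // Run main-process sources through babel-loader so syntax such as
      // static class fields is transpiled before webpack parses it.
      chainWebpackMainProcess: (config) => {
        config.module
          .rule("babel")
          .test(/\.js$/)
          .exclude.add(/node_modules/)
          .end()
          .use("babel-loader")
          .loader("babel-loader")
          .options({ presets: ["@babel/preset-env"] });
      },
    },
  },
};
```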

sunner commented 1 year ago

Are you trying to do something like LangChain? If so, maybe contributing to it would be better. Actually, I'm planning to call all the APIs through it.

transcendr commented 1 year ago

I'm interested in this as well. My use case is to rent a low-cost Windows server machine on Paperspace with a public IP (around $10/mo) and let the application run there, then expose an API route that starts a request through the application to whatever model and streams the bot's response back as the HTTP response. With the EU on the verge of banning API access to generative models, creating my own API by routing requests through a web interface may be my only way to keep some sort of personal API access. It seems ChatALL has most of the required functionality. Does ChatALL already handle streaming, or do you have to wait for the full response before the text appears?
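
For what it's worth, the streaming part of that setup could look roughly like the sketch below (Express plus Server-Sent Events; the route, port, and stub bot are all hypothetical and not ChatALL code):

```js
// Hypothetical streaming bridge; the stub bot just echoes the prompt word by word
// to stand in for an adapter that streams partial answers from a real model.
const express = require("express");

const stubBot = {
  async sendPrompt(prompt, onUpdate) {
    const words = `echo: ${prompt}`.split(" ");
    let text = "";
    for (let i = 0; i < words.length; i++) {
      text += (i ? " " : "") + words[i];
      await new Promise((resolve) => setTimeout(resolve, 100));
      onUpdate(text, i === words.length - 1);
    }
  },
};

const app = express();
app.use(express.json());

// POST { "prompt": "..." } -> Server-Sent Events carrying partial answers
app.post("/api/stream", async (req, res) => {
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  await stubBot.sendPrompt(req.body.prompt, (text, done) => {
    res.write(`data: ${JSON.stringify({ text, done })}\n\n`);
    if (done) res.end();
  });
});

app.listen(3211, () => console.log("streaming bridge listening on :3211"));
```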

sunner commented 1 year ago

ChatALL does not provide an API. It is not good to access these web services through automated scripts instead of manual input.

transcendr commented 1 year ago

@sunner It's a bit silly to say it's not "good" to expose an API route to these services. I think these services probably already consider what ChatAll is doing not "good", because of course they want you to use the service through their own interface, serve you ads, and so on. Really, what you mean is that there's a risk of being banned from the service if you were detected, for example by making too many requests. Obviously a throttle would be ideal to avoid that, but other than that there is literally no difference between manually entering a prompt and having it loaded as the result of an API request.

sunner commented 1 year ago

They can't ban ChatALL, but they can ban a user's account, since the behavior is abuse.

transcendr commented 1 year ago

@sunner I don't believe anyone said they could ban chatall. chatall is simply a client for making requests to the service. BTW, I'm sure that chatall is already considered by all of these services an abuse of the service and there's a risk of the user's account being banned just by using chatall. For example, Bing requires the Edge browser. You are clearly bypassing that, probably by spoofing the browser agent, which I'm sure is ALREADY against their TOS.

Regardless, what I was saying is that it's rather silly to make a distinction between a user entering a query manually into chatall and the query being transmitted in some API post body. In both cases, you are literally forwarding the form values to the respective services. As long as there is some protection against the rapid submission of requests (e.g., a new request cannot be submitted while the bot is already "busy"), there's technically no difference; a minimal sketch of such a guard follows below. If someone wants to remove that limit, they do so at their own risk. Please consider reopening this, as your reasons for closing it are inadequate.
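
A minimal sketch of the kind of "one request at a time" guard described above (hypothetical; bot.askAndWait is an assumed method name, not a real ChatALL API):

```js
// Simple busy-flag guard: rejects a new prompt while one is still in flight.
let busy = false;

async function guardedAsk(bot, prompt) {
  if (busy) {
    throw new Error("A request is already in progress; try again when it finishes.");
  }
  busy = true;
  try {
    return await bot.askAndWait(prompt);
  } finally {
    busy = false;
  }
}

module.exports = { guardedAsk };
```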

sunner commented 1 year ago

> @sunner I don't believe anyone said they could ban chatall. chatall is simply a client for making requests to the service. BTW, I'm sure that chatall is already considered by all of these services an abuse of the service and there's a risk of the user's account being banned just by using chatall.

Yes, using ChatALL is abuse. But if ChatALL provided an API that scripts could call, the risk of accounts being banned would increase.