danny-avila / LibreChat

Enhanced ChatGPT Clone: Features Anthropic, AWS, OpenAI, Assistants API, Azure, Groq, o1, GPT-4o, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, langchain, DALL-E-3, ChatGPT Plugins, OpenAI Functions, Secure Multi-User System, Presets, completely open-source for self-hosting. Actively in public development.
https://librechat.ai/
MIT License

Enhancement: AutoLogin & cookie extraction via Puppeteer script on optional Browserless docker container. #1492

Closed: deoxykev closed this issue 7 months ago

deoxykev commented 8 months ago

What features would you like to see added?

Using the Browserless docker image as an option, LibreChat could be configured with a username and password for endpoints such as ChatGPT and Bing. It would cache the cookies and reuse them while they are valid. When a fresh cookie is needed, it could be generated by simulating a login: this would be as simple as sending a POST to the /function endpoint of the Browserless container, with the Puppeteer login code in the code field and the username and password in the context field; the script waits until a cookie appears, then returns it.

Then save and load the cookie into LibreChat.

More details

I can write and maintain the Puppeteer script / Browserless integration if you would be interested in implementing the other parts.
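
For illustration, here is a rough sketch of what that script might look like, using the Browserless v1 /function convention where the submitted code exports an async function receiving { page, context }. The login URL, form selectors, and cookie name below are placeholders, not a verified ChatGPT login flow:

// Runs inside the Browserless container via POST /function.
// Placeholder login URL and selectors; the real ChatGPT flow (Auth0
// redirects, bot checks) will need more steps than shown here.
module.exports = async ({ page, context }) => {
  const { username, password } = context;

  await page.goto('https://chat.openai.com/auth/login', { waitUntil: 'networkidle2' });
  await page.type('input[name="username"]', username);
  await page.type('input[name="password"]', password);
  await Promise.all([
    page.waitForNavigation({ waitUntil: 'networkidle2' }),
    page.click('button[type="submit"]'),
  ]);

  // Poll for the session cookie (cookie name assumed, not verified).
  for (let i = 0; i < 30; i++) {
    const cookies = await page.cookies();
    const session = cookies.find((c) => c.name === '__Secure-next-auth.session-token');
    if (session) {
      return { data: session.value, type: 'string' };
    }
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
  return { data: 'ERROR_INVALID_CREDS', type: 'string' };
};

The real flow will involve more steps than this, which is exactly the part the maintained script would cover.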

Which components are impacted by your request?

Bing, ChatGPT providers

Pictures

Diagram (image attachment)


danny-avila commented 8 months ago

Thanks for your write-up!

Seems tangentially related to #1133

@fuegovic can comment on whether it's worthwhile for ChatGPT, since I forget whether the access token gets sent on every request or the reverse proxy needs it loaded within its own service

I'm open to this; the tricky part is determining when the cookie is expired. I don't use Bing enough to know if there's a specific error for that, or if we would have to treat all errors as an expired cookie

Using the Browserless docker image as an option

This would definitely need to be optional in the same way that mongo express is configured in the override compose file: https://docs.librechat.ai/features/manage_your_database.html
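
For reference, the opt-in service could look roughly like this in the override file (service name, image tag, ports, and env are a sketch, not a decided configuration):

# docker-compose.override.yml (opt-in only, mirroring the mongo-express example;
# service name, image tag, ports, and env here are assumptions, not final config)
services:
  browserless:
    image: browserless/chrome:latest
    restart: always
    environment:
      - MAX_CONCURRENT_SESSIONS=1
    ports:
      - "8000:3000"   # Browserless listens on 3000 inside the container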

if you would be interested in implementing the other parts.

Can you give me more clarity on these parts?

deoxykev commented 8 months ago

Can you give me more clarity on these parts?

  1. Implement a function that retrieves the ChatGPT cookie either from an in-memory variable (I would recommend not persisting it on disk at all, seeing that cookies are ephemeral anyway) or from a POST request to browserlessdocker:8000/function that sends the ChatGPT username and password from config or an env variable and receives a cookie back.

You can check whether a cached cookie is still valid by sending a GET to https://chat.openai.com/api/auth/session (see the sketch after the payload examples below).

This function would likely be injected here: https://github.com/danny-avila/LibreChat/blob/9864fc8700026b7c215c99d040131bfa3c6f72f9/api/app/chatgpt-browser.js#L16C15-L16C15

The POST request to the /function endpoint should contain the following payload:

{
  "code": "<Puppeteer code I will write>",
  "context": {
    "username": "<load from env or config>",
    "password": "<load from env or config>"
  }
}

Expect it to return:

{
  "data": "<fresh_chatgpt_cookie>",
  "type": "string"
}

OR

{
  "data": "ERROR_INVALID_CREDS",
  "type": "string"
}
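
Putting it together, a rough sketch of the LibreChat-side helper; the env var names, file path, cookie name, and response handling are assumptions for illustration, not a final API:

// Sketch of the LibreChat-side helper. Env var names, the script file path,
// the cookie name, and the response handling are assumptions.
const fs = require('fs');
const path = require('path');
const axios = require('axios');

// Hypothetical file holding the Puppeteer script described above.
const PUPPETEER_LOGIN_SCRIPT = fs.readFileSync(
  path.join(__dirname, 'chatgpt-login.puppeteer.js'),
  'utf8',
);

let cachedCookie = null;

// Check whether a cached ChatGPT cookie is still valid (cookie name assumed).
async function isCookieValid(cookie) {
  try {
    const res = await axios.get('https://chat.openai.com/api/auth/session', {
      headers: { Cookie: `__Secure-next-auth.session-token=${cookie}` },
    });
    return res.status === 200 && !!res.data?.accessToken;
  } catch {
    return false;
  }
}

// Get a fresh cookie by running the Puppeteer script on Browserless.
async function fetchFreshCookie() {
  const res = await axios.post(`${process.env.BROWSERLESS_URL}/function`, {
    code: PUPPETEER_LOGIN_SCRIPT,
    context: {
      username: process.env.CHATGPT_USERNAME,
      password: process.env.CHATGPT_PASSWORD,
    },
  });
  // Depending on how Browserless wraps the result, it may come back as a
  // raw string or as a { data, type } object.
  const value = typeof res.data === 'string' ? res.data : res.data?.data;
  if (!value || value === 'ERROR_INVALID_CREDS') {
    throw new Error('ChatGPT auto-login failed: invalid credentials');
  }
  return value;
}

// Main entry point: return a valid cookie, refreshing it only when needed.
async function getChatGPTCookie() {
  if (cachedCookie && (await isCookieValid(cachedCookie))) {
    return cachedCookie;
  }
  cachedCookie = await fetchFreshCookie();
  return cachedCookie;
}

module.exports = { getChatGPTCookie };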

  2. Something similar for the Bing endpoint.

I think you may be able to check for a valid _U cookie like this: https://github.com/maximerenou/php-bing-ai/blob/main/src/BingAI.php#L37
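
As a sketch in Node: the probe URL and the success heuristic here are assumptions borrowed from how other Bing clients start a conversation, so treat this as illustrative only:

const axios = require('axios');

// Rough validity probe for a Bing _U cookie. The endpoint and the success
// heuristic are assumptions, not a confirmed API.
async function isBingCookieValid(uCookie) {
  try {
    const res = await axios.get('https://www.bing.com/turing/conversation/create', {
      headers: { Cookie: `_U=${uCookie}` },
      maxRedirects: 0,
      validateStatus: () => true, // inspect the status ourselves
    });
    // A usable cookie should yield a 200 with a conversation payload;
    // a 401/403 or a redirect to login suggests it has expired.
    return res.status === 200 && !!res.data?.conversationId;
  } catch {
    return false;
  }
}

module.exports = { isBingCookieValid };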


  3. Some checks to see whether the Browserless endpoint is reachable and a username/password is available from config/env. If not configured, ask the user to input a cookie manually instead.
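
Roughly something like this (the env var names are placeholders for whatever LibreChat settles on):

// Startup check: can automatic cookie extraction be used at all?
// Env var names are placeholders, not a decided naming scheme.
const axios = require('axios');

async function browserlessAvailable() {
  if (!process.env.BROWSERLESS_URL) return false;
  try {
    // Browserless (v1) exposes a /pressure endpoint reporting its load.
    const res = await axios.get(`${process.env.BROWSERLESS_URL}/pressure`, { timeout: 2000 });
    return res.status === 200;
  } catch {
    return false;
  }
}

async function canAutoLogin() {
  const hasCreds = !!(process.env.CHATGPT_USERNAME && process.env.CHATGPT_PASSWORD);
  return hasCreds && (await browserlessAvailable());
}

// If canAutoLogin() resolves to false, fall back to asking the user to
// paste a cookie manually, as LibreChat does today.
module.exports = { canAutoLogin };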