InftyAI / llmlite

🌵 A library that helps you communicate with all kinds of LLMs consistently.
MIT License

Support ChatGPT #26

Closed · kerthcet closed this 1 year ago

kerthcet commented 1 year ago

fix https://github.com/InftyAI/llmlite/issues/6

kerthcet commented 1 year ago

@CodiumAI-Agent /review

CodiumAI-Agent commented 1 year ago

PR Analysis

How to use

Tag me in a comment '@CodiumAI-Agent' and add one of the following commands:

- /review [-i]: Request a review of your Pull Request. For an incremental review, which only considers changes since the last review, include the '-i' option.
- /describe: Modify the PR title and description based on the contents of the PR.
- /improve [--extended]: Suggest improvements to the code in the PR. Extended mode employs several calls and provides more thorough feedback.
- /ask \<QUESTION>: Pose a question about the PR.
- /update_changelog: Update the changelog based on the PR's contents.

To edit any configuration parameter from configuration.toml, add --config_path=new_value. For example: /review --pr_reviewer.extra_instructions="focus on the file: ...". To list the possible configuration parameters, use the /config command.

kerthcet commented 1 year ago

@CodiumAI-Agent /improve

kerthcet commented 1 year ago

@CodiumAI-Agent /ask Can you help to confirm whether this project supports ChatGPT or not?

CodiumAI-Agent commented 1 year ago

Question: Can you help to confirm whether this project supports ChatGPT or not?

Answer: Yes, this project does support ChatGPT. The support for ChatGPT is implemented in the ChatGPTChat class in the llmlite/llms/chatgpt.py file. The class uses the OpenAI API to interact with the ChatGPT model. The OPENAI_API_KEY and OPENAI_ENDPOINT are fetched from the environment variables to authenticate the API requests. The completion method in this class is used to generate responses from the ChatGPT model.
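
For illustration, here is a minimal sketch of what such a wrapper could look like, assuming the openai Python client (v1.x) and the environment variables named above. The class and method names follow the description, but the actual implementation in llmlite/llms/chatgpt.py may differ.

```python
# Hypothetical sketch of a ChatGPT wrapper; not the actual llmlite code.
import os

from openai import OpenAI  # assumes the openai>=1.0 client


class ChatGPTChat:
    """Thin wrapper around the OpenAI chat API (illustrative only)."""

    def __init__(self, model: str = "gpt-3.5-turbo"):
        self.model = model
        # Credentials and endpoint are read from the environment,
        # as the answer above describes.
        self.client = OpenAI(
            api_key=os.environ["OPENAI_API_KEY"],
            base_url=os.environ.get("OPENAI_ENDPOINT"),  # None -> default endpoint
        )

    def completion(self, prompt: str, system_prompt: str | None = None) -> str:
        messages = []
        if system_prompt:
            messages.append({"role": "system", "content": system_prompt})
        messages.append({"role": "user", "content": prompt})
        response = self.client.chat.completions.create(
            model=self.model,
            messages=messages,
        )
        return response.choices[0].message.content
```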

The ChatLLM class in llmlite/apis/chatllm.py is a high-level interface that uses the ChatGPTChat class when the model_name_or_path parameter is set to "gpt-3.5-turbo". This allows users to interact with the ChatGPT model in a consistent way with other models.
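
As a rough usage sketch of that high-level interface: only model_name_or_path and the file path come from the description above; the import path and the completion call signature are assumptions.

```python
# Hypothetical usage of the high-level interface described above.
from llmlite.apis.chatllm import ChatLLM  # import path assumed from the file layout

# model_name_or_path="gpt-3.5-turbo" routes the request to the ChatGPT backend.
chat = ChatLLM(model_name_or_path="gpt-3.5-turbo")

# The exact method signature is an assumption; the point is that the same call
# works regardless of which backend model is selected.
reply = chat.completion(prompt="What is the capital of France?")
print(reply)
```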

In the tests/integration_tests/test_chat.py file, there are integration tests for the ChatGPT model, which further confirms the support for ChatGPT in this project.
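
A hedged sketch of what such an integration test could look like, assuming pytest and the hypothetical interface above; the actual tests in tests/integration_tests/test_chat.py may be structured differently.

```python
# Illustrative integration test; not copied from the repository.
import os

import pytest

from llmlite.apis.chatllm import ChatLLM  # import path assumed


@pytest.mark.skipif(
    "OPENAI_API_KEY" not in os.environ,
    reason="requires an OpenAI API key to reach the real service",
)
def test_chatgpt_completion_returns_text():
    chat = ChatLLM(model_name_or_path="gpt-3.5-turbo")
    reply = chat.completion(prompt="Say hello in one word.")
    assert isinstance(reply, str)
    assert reply.strip()
```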

kerthcet commented 1 year ago

@CodiumAI-Agent /describe

kerthcet commented 1 year ago

@CodiumAI-Agent /update_changelog