polyrabbit / hacker-news-digest

:newspaper: Let ChatGPT Summarize Hacker News for You
http://hackernews.betacat.io/
GNU Lesser General Public License v3.0
668 stars 87 forks

adding support for anthropic, azure, cohere, llama2 #26

Open krrishdholakia opened 11 months ago

krrishdholakia commented 11 months ago

Hi @polyrabbit ,

Noticed you're only calling OpenAI. I'm working on litellm (simple library to standardize LLM API Calls - https://github.com/BerriAI/litellm) and was wondering if we could be helpful.

Added support for Claude, Cohere, Azure and Llama2 (via Replicate) by replacing the ChatOpenAI completion call with a litellm completion call. The code is pretty similar to the OpenAI class - as litellm follows the same pattern as the openai-python sdk.
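A minimal sketch of what that swap looks like, assuming litellm's `completion` mirrors the openai-python call signature as described above (the model names and the summarization prompt here are illustrative, not taken from this repo):

```python
# Hypothetical sketch: replacing an OpenAI chat call with litellm.
# Assumes `litellm` is installed and the provider's API key is set in env.

def build_messages(article_text):
    """Build OpenAI-style chat messages for a summarization request."""
    return [
        {"role": "system", "content": "Summarize the article in one paragraph."},
        {"role": "user", "content": article_text},
    ]

def summarize(article_text, model="gpt-3.5-turbo"):
    # litellm.completion follows the openai-python pattern, so the same
    # call is claimed to work for "claude-2", "command-nightly", a
    # Replicate-hosted Llama 2, etc. (per the PR description above).
    from litellm import completion
    response = completion(model=model, messages=build_messages(article_text))
    return response["choices"][0]["message"]["content"]
```

Only the `model` string changes per provider; the message format and response shape stay OpenAI-compatible.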

Would love to know if this helps.

Happy to add additional tests / update documentation, if the initial PR looks good to you.

polyrabbit commented 11 months ago

Hi, thanks for this wonderful library.

Just one quick question - does it support function calling for other models, or even just for OpenAI models? This app relies on JSON responses.

krrishdholakia commented 11 months ago

yes, it supports function calling - exactly like how OpenAI calls it - https://litellm.readthedocs.io/en/latest/input/

polyrabbit commented 11 months ago

Nice! I'll try it later, thanks

polyrabbit commented 11 months ago

One difference I found is in the way the timeout is set - OpenAI uses a timeout parameter whereas litellm uses force_timeout. Is that intended?

Could you please also add litellm as a dependency to the requirements.txt file?

krrishdholakia commented 10 months ago

Hey @polyrabbit i updated the requirements.txt.

re:timeout - i thought that was for the completions endpoint - i don't recall seeing a timeout parameter for ChatCompletions - if you could share any relevant documentation, happy to check it out.

Let me know if there are any remaining blockers for this PR

polyrabbit commented 10 months ago

I see it here: https://github.com/openai/openai-python/blob/b82a3f7e4c462a8a10fa445193301a3cefef9a4a/openai/api_resources/chat_completion.py#L21-L28

def create(cls, *args, **kwargs):
    """
    Creates a new chat completion for the provided messages and parameters.

    See https://platform.openai.com/docs/api-reference/chat-completions/create
    for a list of valid parameters.
    """
    start = time.time()
    timeout = kwargs.pop("timeout", None)

So timeout is used in my code; after switching to litellm, the code throws an exception: unexpected keyword argument 'timeout'
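Until the parameter names converge, a small shim could bridge the two conventions. This is a hypothetical adapter, not code from either library; it only assumes the parameter names reported in this thread (`timeout` on the openai-python side, `force_timeout` on the litellm side):

```python
def adapt_timeout_kwargs(kwargs):
    """Translate openai-python's `timeout` kwarg to litellm's `force_timeout`.

    Parameter names are taken from this discussion and should be treated
    as an assumption; returns a new dict and leaves the input untouched.
    """
    kwargs = dict(kwargs)
    if "timeout" in kwargs:
        kwargs["force_timeout"] = kwargs.pop("timeout")
    return kwargs

# Usage sketch (network call elided):
# from litellm import completion
# response = completion(model=model, messages=messages,
#                       **adapt_timeout_kwargs({"timeout": 30}))
```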

krrishdholakia commented 10 months ago

got it - will make a fix for it and update the PR