All-Hands-AI / OpenHands

🙌 OpenHands: Code Less, Make More
https://all-hands.dev
MIT License

fix: remove repeated completion assignment in llm.py #5167

Closed · WannaTen closed this 3 days ago

WannaTen commented 4 days ago

End-user friendly description of the problem this fixes or functionality that this introduces


Give a summary of what the PR does, explaining any non-trivial design decisions

This duplicate code was introduced here: https://github.com/All-Hands-AI/OpenHands/pull/4722/files#diff-b46ec9dd4b319d75809520e34b2559fd2ce46ca4895709eae9312aa04a8fa9a4L151. I don't know what its purpose is, so I'm sending a PR to delete it.
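
For context, a minimal sketch of this kind of duplication; the model name and arguments are illustrative assumptions, not the exact llm.py code:

```python
from functools import partial
import litellm

# Two back-to-back assignments of the same partial: the first is
# immediately overwritten by the second, so deleting one of them
# changes no behavior.
completion = partial(litellm.completion, model='gpt-4o', max_tokens=4096)
completion = partial(litellm.completion, model='gpt-4o', max_tokens=4096)
```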


Link of any specific issues this addresses

WannaTen commented 4 days ago

@rbren Hi rb, can you take a look at this PR?

WannaTen commented 4 days ago

@enyst I reordered init_model_info and the partial call, so self.max_output_token is assigned before it is used in the partial, roughly as in the sketch below.
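
A hypothetical sketch of that ordering (class and attribute names are simplified assumptions, not the exact llm.py code):

```python
from functools import partial
import litellm

class LLM:
    def __init__(self, config):
        self.config = config
        self.max_output_tokens = config.max_output_tokens

        # Resolve model metadata first; this may fill in
        # self.max_output_tokens from litellm's model info.
        self.init_model_info()

        # Only then build the completion partial, so it captures the
        # resolved value rather than a stale None.
        self._completion = partial(
            litellm.completion,
            model=self.config.model,
            max_tokens=self.max_output_tokens,
        )

    def init_model_info(self):
        # Stub: the real method consults litellm for model metadata.
        if self.max_output_tokens is None:
            info = litellm.get_model_info(self.config.model)
            self.max_output_tokens = info.get('max_output_tokens')
```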

Besides, I'm wondering whether we need to assign a default number (e.g. 4096) to max_output_token in init_model_info() if it is None. In litellm's completion, if max_tokens is None, it means unlimited output tokens. But in OpenHands, if the user does not set a token number, and litellm does not provide a default in model_info either, OpenHands just sets it to 4096. This breaks the implicit rule: the user does not care, litellm does not care, but OpenHands does. If we really care, why do we set it to None in core/llm_config.py? It doesn't make sense. (See the sketch of the fallback below.)
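
A sketch of the fallback behavior in question; hedged, since the real init_model_info may differ in detail:

```python
class LLM:
    def __init__(self, config, model_info=None):
        self.config = config          # config.max_output_tokens: int | None
        self.model_info = model_info  # litellm model metadata dict, or None

    def init_model_info(self):
        # If the user left max_output_tokens unset, try litellm's
        # model metadata first...
        if self.config.max_output_tokens is None:
            if self.model_info and 'max_output_tokens' in self.model_info:
                self.config.max_output_tokens = self.model_info['max_output_tokens']
            else:
                # ...and only here does the disputed default apply:
                # neither the user nor litellm set a limit, yet
                # OpenHands pins it to 4096 instead of leaving it None
                # (which litellm would treat as "no limit").
                self.config.max_output_tokens = 4096
```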

enyst commented 4 days ago

> Besides, I'm wondering whether we need to assign a default number (e.g. 4096) to max_output_token in init_model_info() if it is None. In litellm's completion, if max_tokens is None, it means unlimited output tokens. But in OpenHands, if the user does not set a token number, and litellm does not provide a default in model_info either, OpenHands just sets it to 4096. This breaks the implicit rule: the user does not care, litellm does not care, but OpenHands does. If we really care, why do we set it to None in core/llm_config.py? It doesn't make sense.

That's a good question. At the time of setting the value in the dataclass (None) we don't know if liteLLM will give us something... but I do agree it's weird.

I'd suggest it doesn't need to be part of this PR, but if you want to propose a PR changing that, we can look into it.

WannaTen commented 3 days ago

> That's a good question. At the time of setting the value in the dataclass (None) we don't know if liteLLM will give us something... but I do agree it's weird.
>
> I'd suggest it doesn't need to be part of this PR, but if you want to propose a PR changing that, we can look into it.

Thanks a lot. I will send another PR to discuss the parameter settings in the LLM configuration.