sotopia-lab / sotopia

Sotopia: an Open-ended Social Learning Environment (ICLR 2024 spotlight)
https://docs.sotopia.world
MIT License

Fixed litellm context length #53

Closed · ProKil closed this 5 months ago

ProKil commented 5 months ago

📑 Description

#32 introduces a problem: `ChatLiteLLM` requires a non-optional `max_tokens` argument, which is hard to calculate for different models, especially models outside OpenAI. I believe this design decision comes from LangChain. In this PR, I added a `PatchedLiteLLM` class that inherits from `ChatLiteLLM` but makes `max_tokens` optional. A minimal sketch of the idea is shown below.
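The sketch below is illustrative rather than the exact code in the diff: the import path depends on your LangChain version, and the class name simply follows this description.

```python
from typing import Optional

from langchain_community.chat_models import ChatLiteLLM  # older LangChain versions: langchain.chat_models


class PatchedLiteLLM(ChatLiteLLM):
    """ChatLiteLLM variant that relaxes max_tokens to be optional.

    Callers no longer have to compute a per-model token limit; when
    max_tokens is left as None, no explicit limit is supplied by us.
    """

    # Re-declare the field on the subclass so it is optional.
    max_tokens: Optional[int] = None


# Usage: no max_tokens needed when constructing the chat model.
llm = PatchedLiteLLM(model="gpt-4", temperature=0.7)
```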

Note: This has not been extensively tested against all models supported by LiteLLM, so errors may appear when using newer models. Please open an issue if you encounter one.

✅ Checks

ℹ Additional Information