Closed adrm closed 1 year ago
Hey @adrm gpt-migrate should already have support for this through using litellm
@krrishdholakia Can you give any details on how to actually do this? (that is, what command-line parameters and environment variables need to be set, whether any code needs to be modified, etc.)
Ok, found my problem. It turns out the version of litellm pinned in requirements.txt is too old to support OpenRouter.
Running `pip install --upgrade litellm` fixed it, and litellm was then able to recognize the OpenRouter model passed in (`openrouter/openai/gpt-4-32k`).
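For anyone else hitting this, here is a rough sketch of the setup that worked for me. The API key is a placeholder, and the exact gpt-migrate invocation flags are an assumption on my part — check the project's README for the actual CLI:

```shell
# Upgrade litellm so the openrouter/ model prefix is recognized
pip install --upgrade litellm

# OpenRouter credentials (placeholder key — substitute your own)
export OPENROUTER_API_KEY="sk-or-..."

# Pass the OpenRouter model name through to gpt-migrate
# (--model flag assumed; see gpt-migrate's README for the exact CLI)
python main.py --model openrouter/openai/gpt-4-32k
```

litellm routes the request to OpenRouter based on the `openrouter/` prefix in the model name, so no code changes should be needed once the package is new enough.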
Added OpenRouter
It would be great to support other GPT-like API endpoints such as OpenRouter. For example, this would let anyone use gpt-4-32k even without access on their OpenAI account, since OpenAI is no longer granting access to gpt-4-32k for the time being, and this model is basically a prerequisite for using gpt-migrate.