Open Reva-h opened 3 months ago
Thanks for raising this issue! Yes, I agree that this is not very convenient and that it would be great to allow users to specify their preferred models (beyond the set listed in our codebase). I'll add a to-do for this and hope to have this fixed soon!
Thanks for such a great tool!
Our OpenAI organization only has access to gpt-3.5-turbo-1106 and gpt-3.5-turbo-0125, and I ran into some issues getting LLooM to run. I had to modify the dicts declared in llm.py (RATE_LIMITS, CONTEXT_WINDOW, etc.) to add our preferred models as keys. Either the gpt-3.5-turbo alias should resolve to whichever version is available with the API key, or the user should be allowed to specify the GPT-3.5 version upon lloom object initialization via the constructor.
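For reference, the workaround I used looks roughly like the following. This is a minimal sketch, not LLooM's actual code: the dict names RATE_LIMITS and CONTEXT_WINDOW come from llm.py, but the entry shapes and the numeric values here are illustrative placeholders, and `register_model` is a hypothetical helper I wrote to show the pattern.

```python
# Hypothetical sketch of the workaround: extend the per-model config dicts
# from llm.py with the versions our org can access. RATE_LIMITS and
# CONTEXT_WINDOW are the dict names from llm.py; the value shapes and
# numbers below are illustrative, not the library's actual settings.
RATE_LIMITS = {
    "gpt-3.5-turbo": (300, 10_000),  # (requests/min, tokens/min) — example values
}
CONTEXT_WINDOW = {
    "gpt-3.5-turbo": 16_385,  # example context length in tokens
}

def register_model(name, rate_limit, context_window):
    """Add a model key so LLooM's lookups don't fail for it (hypothetical helper)."""
    RATE_LIMITS[name] = rate_limit
    CONTEXT_WINDOW[name] = context_window

# Register the only versions available to our OpenAI organization
register_model("gpt-3.5-turbo-1106", rate_limit=(300, 10_000), context_window=16_385)
register_model("gpt-3.5-turbo-0125", rate_limit=(300, 10_000), context_window=16_385)
```

Exposing something like this via the constructor (e.g. accepting a model name plus its limits) would avoid users having to patch llm.py directly.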