Closed aclifford91 closed 1 year ago
Actually, it's a lot faster today, I think it was latency at OpenAI
Will keep an eye on this, but it hasn't been as bad as when I opened it.
After yesterday's testing, where hopii was very performant, I think this is almost entirely on OpenAI and their response times. Closing for now, but we'll keep considering alternatives for quicker response times.
I think it's probably the new prompt length we're sending to GPT-4. I've noticed the same with GPT-4 through ChatGPT as well.