Hi!
First off: I'm loving this project. I can definitely see it making a team a lot more efficient, while also ensuring the README doesn't end up as dead documentation that's never up to date.
My feature request is to make the configuration more extensible. I'm using Azure OpenAI, which has a different endpoint for chat completions. It also looks like this runs on the gpt-3.5-turbo model; I have gpt-4 available and would love to be able to switch models and see whether it does a better or worse job.
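To make the request concrete, here is a minimal sketch of what such an extensible configuration could look like, assuming environment variables with fallback defaults. The variable names, defaults, and the `build_request_config` helper are all hypothetical, not part of this project:

```python
import os

# Hypothetical knobs; names and defaults are illustrative only.
# Overriding OPENAI_API_BASE would let the tool target Azure OpenAI's
# endpoint, and OPENAI_MODEL would allow switching to gpt-4.
OPENAI_API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
OPENAI_MODEL = os.environ.get("OPENAI_MODEL", "gpt-3.5-turbo")


def build_request_config():
    """Collect endpoint and model settings in one place so the rest of
    the code doesn't hard-code either value."""
    return {"api_base": OPENAI_API_BASE, "model": OPENAI_MODEL}
```

Something along these lines would cover both the custom endpoint and the model switch without adding new required settings for existing users.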