jhasenmyer closed this issue 6 months ago
Hi @jhasenmyer. Thank you for your kind words.
You are correct: when deploying Deep Chat, we recommend handling the connection in a proxy server to protect your key. While this is usually fairly trivial, the OpenAI Assistants API is a little more complicated because it requires multiple calls to different endpoints to manage threads.
I recently helped another user set this up on a Next.js server, where I described most of the required code. Here is the issue. Hopefully this helps you set it up on your backend.
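For anyone landing here later, a minimal sketch of what such a proxy might look like behind a Next.js API route. The route path, helper names, and the beta header version are assumptions for illustration, not code from the linked issue:

```typescript
// Hypothetical proxy helpers: the browser talks to /api/assistant/*,
// and the server forwards to OpenAI with the secret key attached.
const OPENAI_BASE = "https://api.openai.com/v1";

// Build outgoing headers server-side so the key never reaches the browser.
// (The Assistants API also expects an "OpenAI-Beta" header; the exact
// version string depends on which API revision you target.)
function buildProxyHeaders(apiKey: string): Record<string, string> {
  return {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`,
    "OpenAI-Beta": "assistants=v2", // assumption: adjust to your API version
  };
}

// Map the path the browser called (e.g. /api/assistant/threads) to the
// real OpenAI endpoint (e.g. https://api.openai.com/v1/threads).
function toOpenAIUrl(clientPath: string): string {
  return OPENAI_BASE + clientPath.replace(/^\/api\/assistant/, "");
}

// In a Next.js pages-router API route, the forwarding itself might look like:
//
// export default async function handler(req, res) {
//   const response = await fetch(toOpenAIUrl(req.url ?? ""), {
//     method: req.method,
//     headers: buildProxyHeaders(process.env.OPENAI_API_KEY ?? ""),
//     body: req.method === "POST" ? JSON.stringify(req.body) : undefined,
//   });
//   res.status(response.status).json(await response.json());
// }
```

The key point is that the Authorization header is assembled from a server environment variable, so the UI only ever sees your own endpoint.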
Let me know if you need any further assistance. Thanks!
I agree and also love this feature! @OvidijusParsiunas I am using Next.js as my framework, so will I be able to fix the security issue with the keys via the link you provided here?
Oh and also thank you so much for this. It's amazing.
Yep, it should have everything you need. If you have any questions let me know (or perhaps comment them on that thread to keep it all in one place).
To note, I am currently away from home so I apologise if some of my responses are slow.
I believe that will be just what I need, thank you for pointing me to it!
Hi @jhasenmyer. I will be closing this issue since it is already being covered in the issue linked above. Please feel free to comment any further queries there, or create a new issue for anything else. Thank you!
I absolutely LOVE the way the OpenAI assistant functionality works in Deep Chat! Unfortunately, while it's an amazing tool to demo, we can't deploy the app with the API key in the UI like that. I was hoping to handle that through server middleware that could add the key on the server and shuttle the requests and responses for the UI. I can't find a way to do that in the docs, so I am currently experimenting with the "request" attribute and handling all the functionality server-side. It would sure be handy to just give a custom URL that could add the appropriate headers...
Is that supported, or possible?
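For reference, the "request"-attribute approach described above can be sketched roughly like this. The proxy path is a hypothetical example, not a Deep Chat default:

```typescript
// Hedged sketch: a Deep Chat "request" config that points the component at
// your own proxy route instead of api.openai.com. The /api/assistant path
// is an assumption; any server route you control works.
const requestConfig = {
  url: "/api/assistant/chat", // your server route, which adds the key itself
  method: "POST",
  headers: { "Content-Type": "application/json" }, // note: no API key here
};

// In a React component this would be passed along the lines of:
// <DeepChat request={requestConfig} />
```

The point of the shape above is that the browser-side config carries no Authorization header at all; the custom headers are attached by the server when it forwards the call.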