Initially this was built on top of gpt4free, using some reverse-engineered APIs to keep it free. So why are you, @xtekky, using an OpenAI key in the backend? That means requests are being routed through your own API key.
Because a lot of the API sites are down at the moment, this was the only solution I could find for now. You are free to just take a model from gpt4free and integrate it ; )
```python
from requests import post

# Forward the chat request to the OpenAI completions endpoint,
# authenticated with the backend's own API key, with streaming enabled.
gpt_resp = post(
    'https://api.openai.com/v1/chat/completions',
    headers={'Authorization': 'Bearer %s' % self.openai_key},
    json={
        'model': request.json['model'],
        'messages': conversation,
        'stream': True,
    },
    stream=True,
)
```
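If someone wants to swap the OpenAI endpoint for a gpt4free provider as suggested, note that the JSON request body can stay the same, so only the transport and URL change. A minimal sketch of separating the payload construction from the endpoint (the function name and example values here are illustrative, not part of the gpt4free codebase):

```python
# Sketch: build the same chat-completion payload the backend currently
# sends to OpenAI, so the upstream URL/provider can be swapped without
# touching the rest of the request-handling code.
def build_chat_payload(model, conversation, stream=True):
    # Mirrors the JSON body from the snippet above.
    return {
        'model': model,
        'messages': conversation,
        'stream': stream,
    }

# Hypothetical usage: the same payload could then be POSTed to any
# provider endpoint that accepts the OpenAI chat-completions schema.
payload = build_chat_payload(
    'gpt-3.5-turbo',
    [{'role': 'user', 'content': 'hello'}],
)
```

This keeps the routing decision (own key vs. free provider) in one place instead of hard-coding the OpenAI URL and bearer token into the request call.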