Engine code for an OpenAI assistant that likes to talk about AI, written in TypeScript and Node.js on the Azure stack. Backed by GPT-3 with a store of AI Canon documents.
Change the architecture so the user asks a question. One version of the question goes straight to the LLM; another is wrapped in the prompt: "Write an X word summary of an article that would be the best answer to the following question. Consider enriching the question with additional topics that the question asker might want to understand. Write the summary in the present tense, as though the article exists. If the question is not related to building AI applications, deflect it with a humorous 10 word answer."
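A minimal sketch of the prompt wrapping described above. The helper name `wrapPrompt` and the `wordCount` parameter are illustrative assumptions, not identifiers from the engine code:

```typescript
// Hypothetical helper: wraps the raw user question in the summary prompt.
// `wrapPrompt` and `wordCount` are assumed names for illustration only.
function wrapPrompt(question: string, wordCount: number): string {
  return [
    `Write a ${wordCount} word summary of an article that would be the best answer to the following question.`,
    `Consider enriching the question with additional topics that the question asker might want to understand.`,
    `Write the summary in the present tense, as though the article exists.`,
    `If the question is not related to building AI applications, deflect it with a humorous 10 word answer.`,
    ``,
    `Question: ${question}`,
  ].join("\n");
}
```

The unwrapped question is sent to the model as-is; only the second copy goes through `wrapPrompt`.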
Use Promise.all to run both queries in parallel. https://stackoverflow.com/questions/35612428/call-async-await-functions-in-parallel
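A sketch of the parallel dispatch under stated assumptions: `callLLM` is a placeholder for the engine's completion call (e.g. against the Azure OpenAI endpoint), and `askBoth` is an assumed name:

```typescript
// Placeholder for the engine's GPT-3 completion call; in the real engine
// this would hit the Azure OpenAI completion endpoint.
async function callLLM(prompt: string): Promise<string> {
  return `response to: ${prompt}`;
}

// Run the direct query and the summary-prompt query in parallel.
// Promise.all starts both requests before awaiting either, so total
// latency is roughly max(a, b) rather than a + b.
async function askBoth(question: string, wrappedPrompt: string) {
  const [direct, summary] = await Promise.all([
    callLLM(question),
    callLLM(wrappedPrompt),
  ]);
  return { direct, summary };
}
```

If either request rejects, `Promise.all` rejects as a whole; use `Promise.allSettled` instead if a partial result (say, the direct answer alone) is acceptable.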