@achab the answer takes some time. I think, regardless of OpenAI's response time, it is due to the /anonymize endpoint: on the client side we talk to our server twice, first sending the PDF file and waiting for the anonymized text to be returned, then sending that text to the /ask endpoint. Maybe we can remove /anonymize and merge its functionality into /ask?
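Something like this on the server side could work: a single merged handler that anonymizes and then queries the model, so the client uploads the file only once. This is just a sketch with hypothetical helper names (`anonymize`, `ask_llm`), not our actual code:

```python
# Sketch only: framework-agnostic handler; helpers below are stand-ins
# for whatever logic currently sits behind /anonymize and /ask.

def anonymize(text: str) -> str:
    # Stand-in for the anonymization currently exposed at /anonymize.
    return text.replace("John Doe", "[NAME]")

def ask_llm(text: str, question: str) -> str:
    # Stand-in for the OpenAI call currently made by /ask.
    return f"answer to {question!r} based on {len(text)} chars of context"

def ask_endpoint(pdf_text: str, question: str) -> str:
    """Merged /ask handler: anonymize server-side, then query the LLM.

    The client sends the extracted PDF text and the question in one
    request, instead of round-tripping the anonymized text.
    """
    return ask_llm(anonymize(pdf_text), question)
```

That would cut one full client-server round trip per question, though it also means /ask can no longer be used on text the client anonymized elsewhere.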