vivekuppal / transcribe

Transcribe is a real-time transcription, conversation, and language-learning platform. It provides live transcripts from the microphone and speaker, generates a suggested conversation response using OpenAI's GPT API, and reads out the responses, simulating a real live conversation in English or another language.
https://abhinavuppal1.github.io/
MIT License

LLM output error #259

Open ha3ketr0x opened 2 months ago

ha3ketr0x commented 2 months ago

Hi, sometimes Transcribe throws a "no LLM output or result" error.

Along with that, sometimes the output gets stuck and does not change. For example, if someone asks something like "what is PC" and then suddenly asks "what is Kubernetes keys", it does not move forward to answer that new question. I have checked multiple times, and I am not clicking on the question. I am on the latest Transcribe.

mang0sw33t commented 2 months ago

Please share a screenshot of the application and some description of expected behavior and observed behavior with respect to the screenshot. A screenshot of the command line used to start transcribe might be useful as well.

ha3ketr0x commented 2 months ago

Hi,

I am not facing this right now, but it happens sometimes even with API credit available. It sometimes shows "no LLM found", and along with that, sometimes after it listens to a question the answer just stops and never moves forward, even though I am not clicking on any question to stop the answer. Even after asking questions, we can see the questions but not the answers, and I am also not clicking "do not suggest" at that time.

Along with that, the occasional "No LLM found" might be because of an API outage.

Also, is there any way to stop the answer after every question, so that we have to click to listen again?

Also, I have tried to copy the text below into override.yaml, but it throws an error in the command prompt. And could you guide me on how to use the Perplexity API in Transcribe? Sorry for bombarding you with so many questions at once.

default_prompt_preamble: "You are a casual pal, genuinely interested in the conversation at hand. A poor transcription of conversation is given below. "
default_prompt_epilogue: "Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

# The combination of system_prompt, initial_convo is used to create a multi-turn prompt message for the LLM.
# system_prompt_1, system_prompt_2 are here as samples of other possible prompts.
# Only the content of the system_prompt parameter will be used.

system_prompt: "You are a casual pal, genuinely interested in the conversation at hand. Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

system_prompt: "You are an expert at Basketball and helping others learn about basketball. Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

system_prompt: "You are an expert at Fantasy Football and helping others learn about Fantasy football. Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

system_prompt: "You are an expert Agile Coach and are interviewing for a position. Respond in detail to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

summary_prompt: 'Create a summary of the following text'
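
For reference, the comments in the pasted YAML say that system_prompt and initial_convo are combined into a multi-turn prompt message for the LLM. A minimal sketch of what that combination could look like, assuming a standard chat-completions message list (the variable contents below are hypothetical illustrations, not Transcribe's actual code):

```python
# Hypothetical sketch: combine a system prompt, a seeded initial conversation,
# and the latest transcript into one multi-turn chat request.
# Values are illustrative only; they are not taken from Transcribe's source.
system_prompt = "You are a casual pal, genuinely interested in the conversation at hand."
initial_convo = [
    {"role": "user", "content": "Hey, how is the weather today?"},
    {"role": "assistant", "content": "[It looks sunny and warm outside.]"},
]
latest_transcript = "Speaker: What is Kubernetes?"

messages = [
    {"role": "system", "content": system_prompt},    # the active system_prompt from the YAML
    *initial_convo,                                  # seeded example turns
    {"role": "user", "content": latest_transcript},  # the newest transcribed text
]
# `messages` would then be sent to the chat completions API as a single request.
```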
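
On the Perplexity question: Perplexity exposes an OpenAI-compatible chat completions endpoint, so the standard openai client can be pointed at it. Whether Transcribe's override.yaml supports switching the LLM provider or base URL is not confirmed here, so treat the following as a general sketch with example values (the model name and key placeholder are assumptions):

```python
from openai import OpenAI

# General sketch of calling Perplexity's OpenAI-compatible endpoint directly.
# This is not a documented Transcribe configuration; check the project's docs
# for whether the LLM provider can be overridden.
client = OpenAI(
    api_key="YOUR_PERPLEXITY_API_KEY",     # your Perplexity API key
    base_url="https://api.perplexity.ai",  # Perplexity's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="sonar",  # example model name; check Perplexity's current model list
    messages=[
        {"role": "system", "content": "You are a casual pal, genuinely interested in the conversation at hand."},
        {"role": "user", "content": "What is Kubernetes?"},
    ],
)
print(response.choices[0].message.content)
```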