Closed timjaggs closed 5 months ago
Hi @timjaggs,
First, this is a completely free workflow that I actively support and have poured hundreds of hours into. As you may be able to understand, I'm unwilling to invest time into supporting issues that are not written in respectful language. Please consider this moving forward.
Looking at the logs, this looks to be instability on OpenAI's side. If you have not already tried adding billing info, funding your account, and replacing your API key with a new one after adding that billing info, please try that.
Additionally, I do not believe your workflow is fully updated. See point 1 in the screenshot: https://github.com/TomFrankly/pipedream-notion-voice-notes/assets/684961/ed4f4814-a6eb-4bd7-8086-078e7836f372
The latest version of the workflow should report a successful chunk operation in the logs before transcribing. Your logs don't have that part.
I'd recommend going into the editor, refreshing the page, and ensuring all steps are updated: https://thomasjfrank.com/how-to-transcribe-audio-to-text-with-chatgpt-and-notion/#update
If that doesn't work, then this is unfortunately just instability on OpenAI's side right now. As you can see in point 2, OpenAI returns a success response as well as rate-limit information that will show in the logs.
Your logs don't show that, and unfortunately OpenAI's API has a bad habit of returning generic connection errors that don't have a status code.
Another reason I believe your workflow isn't updated: the latest version should retry multiple times even when it encounters the generic error.
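As a rough illustration of that retry behavior (this is a hypothetical sketch, not the workflow's actual code; `retryOnGenericError` is an illustrative name, and the real logic in Notion-Voice-Notes.mjs may differ), a retry loop that only retries OpenAI's status-less "Connection error." responses might look like:

```javascript
// Hypothetical sketch: retry only on generic connection errors.
// Errors that carry an HTTP status code mean OpenAI actually responded,
// so those are rethrown immediately; only status-less errors are retried.
function retryOnGenericError(fn, maxAttempts = 3) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return fn(attempt);
    } catch (err) {
      lastError = err;
      // A status code indicates a real API error (e.g. 401, 429) — don't retry.
      if (err.status !== undefined) throw err;
    }
  }
  // All attempts hit the generic error; surface the last one.
  throw lastError;
}
```

The key design point is distinguishing a real API response (which has a status code and rate-limit headers) from the generic connection failure, which has neither.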
Thanks for the quick response.
As you will have seen in another email from me today, I have had to ask you for a refund of Ultimate Brain - not because of anything you have done or not done (your product looks fantastic) - but because Notion won't let me add Notion AI to my subscription, since I subscribed through the App Store.
I don't have time for organizations (I am talking about Notion, not you!) that won't address simple but key customer issues that ought to be an easy fix.
I wish you all the best.
Describe the bug
I'm losing the will to live - this is so not a ten-minute job.
Error
An error occured while attempting to split the file into chunks, or while sending the chunks to OpenAI. If the full error below says "Unidentified connection error", please double-check that you have entered valid billing info in your OpenAI account. Afterward, generate a new API key and enter it in the OpenAI app here in Pipedream. Then, try running the workflow again. If that does not work, please open an issue at this workflow's Github repo: https://github.com/TomFrankly/pipedream-notion-voice-notes/issues

Full error from OpenAI: Connection error.

DETAILS
at Object.chunkFileAndTranscribe (file:///tmp/pdg/dist/code/4cf355a52ab0f9c275ba953eea42492276c4b796f961fdffefa87942b1ced4df/code/Notion-Voice-Notes.mjs:391:11)
at Object.run (file:///tmp/pdg/dist/code/4cf355a52ab0f9c275ba953eea42492276c4b796f961fdffefa87942b1ced4df/code/Notion-Voice-Notes.mjs:1901:22)
at null.executeComponent (/var/task/launch_worker.js:267:22)
at MessagePort.messageHandler (/var/task/launch_worker.js:764:28)
Which cloud storage app are you using? (Google Drive, Dropbox, or OneDrive)
Dropbox
Have you tried updating your workflow? Please follow the steps here, and ensure you've tested the latest version of the workflow: https://thomasjfrank.com/how-to-transcribe-audio-to-text-with-chatgpt-and-notion/#update
Yes
Does the issue only happen while testing the workflow, or does it happen during normal, automated runs?
No idea - I haven't gotten beyond failed tests.
Please paste the contents of your Logs tab from the notion_voice_notes action step.
Checking that file is under 300mb...
File size is approximately 0.1mb. File is under the size limit. Continuing...
Checking if the user set languages...
No language set. Whisper will attempt to detect the language.
Downloaded file to tmp storage: { path: '/tmp/2035 ricardo dr 3.m4a', mime: '.m4a' }
Successfully got duration: 9 seconds
Chunking file: /tmp/2035 ricardo dr 3.m4a
Transcribing file: chunk-000.m4a
Attempting to clean up the /tmp/ directory...
Cleaning up /tmp/chunks-2ZmSulzMjkBp5EOlJ54cXOX1oJX...
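For context on the first log line, the pre-flight size check can be sketched roughly like this (a minimal illustration only; `checkFileSize` and `MAX_FILE_SIZE_MB` are assumed names, not the workflow's actual identifiers):

```javascript
// Illustrative sketch of the pre-flight size check the logs describe:
// files over 300mb are rejected before chunking and transcription begin.
const MAX_FILE_SIZE_MB = 300;

function checkFileSize(sizeBytes) {
  const sizeMB = sizeBytes / (1024 * 1024);
  if (sizeMB > MAX_FILE_SIZE_MB) {
    throw new Error(
      `File is approximately ${sizeMB.toFixed(1)}mb, over the ${MAX_FILE_SIZE_MB}mb limit.`
    );
  }
  console.log(
    `File size is approximately ${sizeMB.toFixed(1)}mb. File is under the size limit. Continuing...`
  );
  return sizeMB;
}
```

Note that the logs above stop after "Transcribing file: chunk-000.m4a" and jump straight to cleanup, with no success confirmation from OpenAI in between, which is consistent with the connection error described earlier in the thread.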