Closed: Mikilio closed this issue 1 month ago
The i18nStrings variable is used to pass the i18n strings for the errors to the worker; they are always set that way at these lines: https://github.com/micz/ThunderAI/blob/9366fc581bbf21e098adc5d99d0e029a14c0f5fb/api_webchat/controller.js#L85-L87
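A minimal sketch of that pattern, with assumed string keys and message shape (the real assignment is at the lines linked above):

```js
// Sketch only: the controller collects the localized error strings once and
// hands them to the worker, so the worker can report errors in the UI language.
const i18nStrings = {
  error_connection_interrupted: browser.i18n.getMessage("error_connection_interrupted"),
  error_connection_failed: browser.i18n.getMessage("error_connection_failed"),
};
worker.postMessage({ command: "init", i18nStrings });
```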
When the prompt is not inserted, is there an error in the log?
No error in the log. Only unrelated warnings.
If I post a version here with more logging, can you try it?
Yeah, sure, I'll just have to replace the source link. I'm using your stuff for free, it's the least I could do.
You can find version 2.2.0_i143_v1 here.
Please follow these steps:
I've added some console.log statements to check what's happening. Since the chat is an HTML page, and Hyprland had problems with the new HTML menu, could you try to replicate the bug in another window manager, like KDE?
If you need the source code, you can find it in the branch: https://github.com/micz/ThunderAI/tree/issue_143 (full diff)
Thank you.
I'm sorry, I can't even test it because with that version I seem to have issue #137.
There seems to be an exception at:
close (resource://gre/modules/ConduitsChild.sys.mjs#143)
unload (resource://gre/modules/ExtensionCommon.sys.mjs#1019)
unload (resource://gre/modules/ExtensionPageChild.sys.mjs#281)
unload (resource://gre/modules/ExtensionPageChild.sys.mjs#324)
destroyExtensionContext (resource://gre/modules/ExtensionPageChild.sys.mjs#496)
observe (resource://gre/modules/ExtensionPageChild.sys.mjs#397)
So before that is solved, I can't test for the actual error of this issue. I'd like to test on KDE, but that is quite a lot more effort for me right now, so it will have to wait.
EDIT: For some reason the window is responsive and just fails to render. I was able to reproduce the error of this issue, and the log is:
(>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17
I haven't made any changes to this version regarding the dynamic menu.
Even with a blank menu, you can trigger a prompt using CTRL+ALT+A to open the menu. Then, press a number and hit Enter. Are you able to launch a prompt this way and replicate the error?
> EDIT: For some reason the window is responsive and just fails to render. I was able to reproduce the error of this issue, and the log is:
Do you see this message:
Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"...
on the chat? But the prompt is not added afterwards and no error is displayed?
yesss That is what is happening. I see this message posted by role: "Information"
Also regarding #137 it happens about 25% of the time.
> yesss That is what is happening. I see this message posted by role: "Information"
Please keep the debug option active.
Do you see this message "[Ollama API] Connection succeded!" in the log?
> Also regarding #137 it happens about 25% of the time.
I think I can't really do anything about it. It seems to be something between Hyprland and Thunderbird.
I have all options active. That particular message did not happen even once. Not even on seemingly successful runs.
> Do you see this message "[Ollama API] Connection succeded!" in the log?
In fact I do not even see any messages from Ollama API
I get that message every time.
Please try this one, I added more logs: thunderai-v2.2.0_i143_v2.zip
Unfortunately, exactly the same. Because of the hash, I can verify I am indeed using a different version, but it's the exact same behavior.
You don't even see a message starting with ">>>>>> [ThunderAI] Ollama API about to send message to createdTab3.id"? Or "Ollama API window opening..."?
(>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17
This is the exact log, with all options enabled, that I receive after choosing a prompt; nothing else. I can then close the window and no new logs get added.
Would it help to give you my Thunderbird settings?
In the last version I fixed the line
(>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
by removing the first parenthesis. It seems that you're not using the new version.
In the Thunderbird add-on manager, in the ThunderAI details, could you check the version number you're using?
The parenthesis got removed; I just quoted my previous answer. Sorry for that. I am really using the new version. On Nix, I have to specify a hash for each installation, so I know that this one is a different version. Obviously, I can't verify if it's the one you intended to send me.
>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17
This is the actual, freshly copied log.
Are those the only lines in the log for ThunderAI? With the debug option enabled, there should be many more lines. The new lines I added are direct console.log statements, so I don't understand how you're seeing those lines you posted, but not the others.
There is nothing before "Ollama init done"?
At least during the bug. There are other unrelated lines:
1727646358636 addons.xpi WARN Checking /nix/store/m5bcq19qmci0py4fzm3j1az7y5s6bax8-thunderbird-128.1.1esr/lib/thunderbird/distribution/extensions for addons
1727646359732 addons.xpi WARN Addon with ID thunderbird-compact-light@mozilla.org already installed, older version will be disabled
1727646359733 addons.xpi WARN Addon with ID thunderbird-compact-dark@mozilla.org already installed, older version will be disabled
(intermediate value).getAttribute is not a function ExtensionParent.sys.mjs:331:38
1727646360189 addons.webextension.thunderai@micz.it WARN Loading extension 'thunderai@micz.it': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646360204 addons.webextension.thunderai@micz.it WARN Loading extension 'thunderai@micz.it': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
sendRemoveListener on closed conduit languagetool-mailextension@languagetool.org.274877906944 3 ConduitsChild.sys.mjs:122:13
Layout was forced before the page was fully loaded. If stylesheets are not yet loaded this may cause a flash of unstyled content. msgHdrView.js:4214:7
Key event not available on some keyboard layouts: key=“a” modifiers=“accel,alt” id=“” messenger.xhtml
>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17
This is the log I get from loading the addon to the Ollama response, with debug enabled.
1727646722672 addons.webextension.thunderai@micz.it WARN Loading extension 'thunderai@micz.it': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646722688 addons.xpi WARN Addon with ID thunderai@micz.it already installed, older version will be disabled
sendRemoveListener on closed conduit thunderai@micz.it.274877906944 [ConduitsChild.sys.mjs:122:13](resource://gre/modules/ConduitsChild.sys.mjs)
1727646722725 addons.webextension.thunderai@micz.it WARN Loading extension 'thunderai@micz.it': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646722747 addons.webextension.thunderai@micz.it WARN Loading extension 'thunderai@micz.it': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
1727646723026 addons.webextension.thunderai@micz.it WARN Loading extension 'thunderai@micz.it': Reading manifest: Warning processing version: version must be a version string consisting of at most 4 integers of at most 9 digits without leading zeros, and separated with dots
[ThunderAI Logger | mzta-options] Options restoring chatgpt_win_height = 800 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring chatgpt_win_width = 700 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring default_sign_name = undefined [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring default_chatgpt_lang = [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring reply_type = undefined [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring dynamic_menu_order_alphabet = true [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring dynamic_menu_force_enter = false [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring connection_type = ollama_api [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring ollama_host = http://127.0.0.1:11434/ [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring ollama_model = tinyllama:latest [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring openai_comp_host = [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring openai_comp_model = [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring openai_comp_chat_name = OpenAI Comp [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-options] Options restoring do_debug = true [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Shortcut [Ctrl+Alt+A] registered successfully! [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_classify [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_reply [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_rewrite_formal [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_rewrite_polite [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_summarize_this [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_this [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta_Menus] addMenu: prompt_translate_this [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] Preparing data to load the popup menu: true [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] _prompts_data: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_rewrite_formal","label":"Rewrite formal","type":"2"},{"id":"prompt_rewrite_polite","label":"Rewrite polite","type":"2"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_this","label":"Prompt this","type":"2"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] active_prompts: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] tabType: mail [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-popup] filteredData: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Executing shortcut, promptId: prompt_classify [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] [ThunderAI] Prompt length: 300 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Ollama API window opening... [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] [ollama_api] prefs.chatgpt_win_width: 700, prefs.chatgpt_win_height: 800 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] Ollama API window ready. [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] message.window_id: 60 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] >>>>>> createdTab3.id: 4 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama init done. [controller.js:96:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434/" and model "tinyllama:latest"... [messagesArea.js:146:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. [controller.js:102:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/controller.js)
[ThunderAI Logger | mzta-background] >>>>>> mailMessageId3: 1 [mzta-logger.js:35:44](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama API about to send message to createdTab3.id: 4 [mzta-background.js:389:29](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/mzta-background.js)
>>>>>>>>>>>>> controller.js onMessage: {"command":"api_send","prompt":"Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. \"This is the body of test email number 4. It was sent.\" ","action":"0","tabId":1,"mailMessageId":1,"do_custom_text":"0"} [controller.js:145:13](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. "This is the body of test email number 4. It was sent." [messagesArea.js:146:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://c61d64d0-d32f-4256-9ed9-0facaef24c14/api_webchat/messagesArea.js)
[ThunderAI Logger | mzta-background] [Ollama API] Connection succeded!
I tried setting a breakpoint at one of the log points and can confirm that it is never hit.
I'm sorry, I don't know how this can happen.
Leave it open, I'll solve this eventually.
So basically this line sometimes fails with: TypeError: can't access dead object
After following it more, the stack trace is:
get principal (resource://gre/modules/ExtensionPageChild.sys.mjs#254)
jsonStringify (resource://gre/modules/ExtensionCommon.sys.mjs#786)
sanitize (resource://gre/modules/ExtensionStorage.sys.mjs#167)
sanitize (chrome://extensions/content/child/ext-storage.js#186)
get (chrome://extensions/content/child/ext-storage.js#319)
callAsyncFunction (resource://gre/modules/ExtensionCommon.sys.mjs#1196)
callAsyncFunction (resource://gre/modules/ExtensionChild.sys.mjs#726)
callAndLog (resource://gre/modules/ExtensionChild.sys.mjs#706)
callAsyncFunction (resource://gre/modules/ExtensionChild.sys.mjs#725)
stub (resource://gre/modules/Schemas.sys.mjs#2954)
<anonymous> (moz-extension://2e1bd1c1-6a49-4880-aca9-98165ed7c968/popup/mzta-popup.js#26)
<anonymous> (moz-extension://2e1bd1c1-6a49-4880-aca9-98165ed7c968/popup/mzta-popup.js#25)
This happened because contentWindow is null.
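A hedged sketch of the kind of guard that would make this visible, assuming the failing code dereferences the chat page's contentWindow (variable names here are hypothetical):

```js
// Sketch only: reading a property of a dead object throws, and a torn-down page
// can also leave contentWindow as null, so both cases are handled explicitly.
let win = null;
try {
  win = chatPage.contentWindow; // hypothetical reference to the webchat page
} catch (e) {
  console.error("[ThunderAI] contentWindow is a dead object:", e);
}
if (!win) {
  console.error("[ThunderAI] contentWindow unavailable, skipping prompt insert");
} else {
  win.postMessage(payload, "*"); // payload is hypothetical
}
```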
Can you teach me how to build the release? I've never made an add-on before. I want to fork this code to actually fix this.
It seems that I had to turn on the debug option in the preferences of the extension. I'm sorry for the misunderstanding, if that is what you meant to tell me. Just turning on debugging made the error from my last message not occur. The logs you asked for are:
[ThunderAI Logger | mzta-background] Shortcut triggered! 2 mzta-logger.js:35:44
[ThunderAI Logger | mzta-popup] Preparing data to load the popup menu: true mzta-logger.js:35:44
[ThunderAI Logger | mzta-popup] _prompts_data: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_rewrite_formal","label":"Rewrite formal","type":"2"},{"id":"prompt_rewrite_polite","label":"Rewrite polite","type":"2"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_this","label":"Prompt this","type":"2"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] mzta-logger.js:35:44
[ThunderAI Logger | mzta-popup] active_prompts: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] mzta-logger.js:35:44
[ThunderAI Logger | mzta-popup] tabType: mail mzta-logger.js:35:44
[ThunderAI Logger | mzta-popup] filteredData: [{"id":"prompt_classify","label":"Classify","type":"0"},{"id":"prompt_reply","label":"Reply to this","type":"1"},{"id":"prompt_summarize_this","label":"Summarize this","type":"0"},{"id":"prompt_translate_this","label":"Translate this","type":"0"}] mzta-logger.js:35:44
[ThunderAI Logger | mzta-background] Executing shortcut, promptId: prompt_classify mzta-logger.js:35:44
[ThunderAI Logger | mzta-background] [ThunderAI] Prompt length: 887 mzta-logger.js:35:44
[ThunderAI Logger | mzta-background] Ollama API window opening... mzta-logger.js:35:44
[ThunderAI Logger | mzta-background] [ollama_api] prefs.chatgpt_win_width: 0, prefs.chatgpt_win_height: 0 mzta-logger.js:35:44
>>>>>> [ThunderAI] Ollama init done. controller.js:96:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:latest"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:102:17
This is all very weird, to be honest. I don't get messages from mzta-options or from addMenu.
Here is a detailed stack trace of the error happening in this issue:
reportException resource://devtools/shared/ThreadSafeDevToolsUtils.js:82
evaluateJSAsync resource://devtools/server/actors/webconsole.js:865
makeInfallible resource://devtools/shared/ThreadSafeDevToolsUtils.js:103
enter resource://devtools/server/actors/utils/event-loop.js:82
_pauseAndRespond resource://devtools/server/actors/thread.js:981
pauseAndRespond resource://devtools/server/actors/thread.js:1174
_makeOnStep resource://devtools/server/actors/thread.js:1109
createStepForReactionTracking resource://devtools/server/actors/thread.js:97
addStub resource://gre/modules/Schemas.sys.mjs:3063
<anonymous> moz-extension://2e1bd1c1-6a49-4880-aca9-98165ed7c968/api_webchat/controller.js:138
_doSend resource://gre/modules/ConduitsChild.sys.mjs:64
_send resource://gre/modules/ConduitsParent.sys.mjs:303
reply resource://gre/modules/ExtensionParent.sys.mjs:1195
recvAPICall resource://gre/modules/ExtensionParent.sys.mjs:1239
(Async: promise callback)
recvAPICall resource://gre/modules/ExtensionParent.sys.mjs:1229
_recv resource://gre/modules/ConduitsChild.sys.mjs:90
receiveMessage resource://gre/modules/ConduitsParent.sys.mjs:470
(Async: JSActor query)
_doSend resource://gre/modules/ConduitsChild.sys.mjs:64
_send resource://gre/modules/ConduitsChild.sys.mjs:125
callParentAsyncFunction resource://gre/modules/ExtensionChild.sys.mjs:900
get chrome://extensions/content/child/ext-storage.js:320
callAsyncFunction resource://gre/modules/ExtensionCommon.sys.mjs:1196
callAsyncFunction resource://gre/modules/ExtensionChild.sys.mjs:726
callAndLog resource://gre/modules/ExtensionChild.sys.mjs:706
callAsyncFunction resource://gre/modules/ExtensionChild.sys.mjs:725
stub resource://gre/modules/Schemas.sys.mjs:2954
<anonymous> moz-extension://2e1bd1c1-6a49-4880-aca9-98165ed7c968/api_webchat/controller.js:88
_doSend resource://gre/modules/ConduitsChild.sys.mjs:64
_send resource://gre/modules/ConduitsParent.sys.mjs:303
reply resource://gre/modules/ExtensionParent.sys.mjs:1195
recvAPICall resource://gre/modules/ExtensionParent.sys.mjs:1239
(Async: promise callback)
recvAPICall resource://gre/modules/ExtensionParent.sys.mjs:1229
_recv resource://gre/modules/ConduitsChild.sys.mjs:90
receiveMessage resource://gre/modules/ConduitsParent.sys.mjs:470
(Async: JSActor query)
_doSend resource://gre/modules/ConduitsChild.sys.mjs:64
_send resource://gre/modules/ConduitsChild.sys.mjs:125
callParentAsyncFunction resource://gre/modules/ExtensionChild.sys.mjs:900
callAsyncFunction resource://gre/modules/ExtensionChild.sys.mjs:628
stub resource://gre/modules/Schemas.sys.mjs:2954
<anonymous> moz-extension://2e1bd1c1-6a49-4880-aca9-98165ed7c968/api_webchat/controller.js:41
> Can you teach me how to build the release? I've never made an add-on before. I want to fork this code to actually fix this.
There is nothing to build, just follow these steps:
Now that the add-on has been loaded, when you change a file just click the "Reload" button on the page where you loaded the add-on the first time.
This evening, I'll add a few try-catch blocks to the lines that caused errors and release a new version. It's really strange; it seems there's an issue with loading the preferences.
I've added a couple of try-catch blocks and console.error statements (full diff). thunderai-v2.2.0_i143_v3.zip
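As a rough illustration of the guard described above (not the actual diff), assuming the preferences are read with browser.storage.local.get:

```js
// Sketch only: wrap the preference load so a failure ends up in the console
// instead of silently stopping the init sequence.
async function loadPrefs() {
  try {
    return await browser.storage.local.get({
      do_debug: false,
      ollama_host: "http://127.0.0.1:11434",
      ollama_model: "",
    });
  } catch (e) {
    console.error("[ThunderAI] Failed to load preferences:", e);
    return {};
  }
}
```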
I am having the same issue.
Ubuntu 24.04.1 LTS, TB 115.15.0 (installed from the mozillateam PPA - https://launchpad.net/~mozillateam/+archive/ubuntu/ppa)
I loaded the above referenced thunderai-v2.2.0_i143_v3.zip. Here are the full logs from TB start to a "classify" (CTRL-ALT-A, then 1) operation.
console-export-2024-9-30_22-17-2.txt
Note the same last 4 log lines as discussed above.
>>>>>> [ThunderAI] Ollama init done. controller.js:107:17
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:1b"... messagesArea.js:146:17
>>>>>> [ThunderAI] appendUserMessage done. messagesArea.js:164:17
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. controller.js:113:17
The 'Attempting to connect to the Ollama Local Server using the host "http://127.0.0.1:11434" and model "llama3.2:1b"...' message is displayed. However, the 'done' message is not, and no prompt is ever displayed.
However, I can chat to the AI model - I say "Hello" and it says "Hello" back.
@mattcaron Does it happen all the time? Do you experience this issue with ThunderAI 2.1.5 as well? Which window manager are you using in Ubuntu? Thank you.
Yes, it is 100% repeatable. Yes, it happens with 2.1.5. I came here to file a bug, found this, and installed the one from this thread. I'm using XFwm; it's standard Xubuntu.
Thanks.
I installed Xubuntu on a virtual machine, but I wasn't able to reproduce the error. I tried a new approach; could you try this version? thunderai-v2.2.0_i143_v4.zip
@mattcaron do you have other add-ons installed?
@micz I do.
https://github.com/jobisoft/DAV-4-TbSync/ https://github.com/jobisoft/TbSync
Just to avoid any rabbit trails - you say you weren't able to reproduce this on a Xubuntu VM, but I'd like to note some data points:
Mozilla PPA instructions:
sudo add-apt-repository ppa:mozillateam/ppa
echo '
Package: thunderbird*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1000
' | sudo tee /etc/apt/preferences.d/thunderbird
echo '
Package: firefox*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1000
' | sudo tee /etc/apt/preferences.d/firefox
sudo apt install firefox thunderbird
I have loaded your new version and the behavior is the same. I have included the log below.
Just for giggles, I talked to it a bit and included those logs.
console-export-2024-10-2_12-57-22.txt
And here is the screenshot:
I've installed Thunderbird the way you did and everything works. This is the "correct" log:
>>>>>>>> [ThunderAI] ollama_api sending I'm ready message... [controller.js:50:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] ollama_api I'm ready message sent. [controller.js:56:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] ollama_api worker initialized. [controller.js:58:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
[ThunderAI Logger | mzta-background] Ollama API window ready. [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] message.window_id: 177 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] >>>>>> createdTab3.id: 14 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama init done. [controller.js:116:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Attempting to connect to the Ollama Local Server using the host "http://localhost:11434/" and model "tinyllama:latest"... [messagesArea.js:146:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] messagesArea.appendUserMessage done. [controller.js:122:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
[ThunderAI Logger | mzta-background] >>>>>> mailMessageId3: 2 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
>>>>>> [ThunderAI] Ollama API about to send message to createdTab3.id: 14 [mzta-background.js:389:29](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/mzta-background.js)
>>>>>>>>>>>>> controller.js onMessage: {"command":"api_send","prompt":"Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. \"To spice up your inbox with colors and themes, check out the Themes tab under Settings. Customize Gmail » Enjoy! - The Gmail Team Please note that Themes are not available if you're using Internet Explorer 6.0. To take advantage of the latest Gmail features, please upgrade to a fully supported browser.\" ","action":"0","tabId":1,"mailMessageId":2,"do_custom_text":"0"} [controller.js:167:13](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/controller.js)
>>>>>> [ThunderAI] appendUserMessage: Classify the following text in terms of Politeness, Warmth, Formality, Assertiveness, Offensiveness giving a percentage for each category. Reply with only the category and score with no extra comments or other text. Reply in the same language. "To spice up your inbox with colors and themes, check out the Themes tab under Settings. Customize Gmail » Enjoy! - The Gmail Team Please note that Themes are not available if you're using Internet Explorer 6.0. To take advantage of the latest Gmail features, please upgrade to a fully supported browser." [messagesArea.js:146:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
>>>>>> [ThunderAI] appendUserMessage done. [messagesArea.js:164:17](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/api_webchat/messagesArea.js)
[ThunderAI Logger | mzta-background] [Ollama API] Connection succeded! [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
After the "[ThunderAI] ollama_api I'm ready message sent." message, the background script is responding:
[ThunderAI Logger | mzta-background] Ollama API window ready. [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] message.window_id: 177 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
[ThunderAI Logger | mzta-background] >>>>>> createdTab3.id: 14 [mzta-logger.js:35:44](moz-extension://6553cb7f-1f22-4d1e-8acc-f040ff41c7cb/js/mzta-logger.js)
Here is where the add-on is sending the prompt. Now I'm going to install the add-ons you pointed out and try again. It seems the Thunderbird internal messaging system is failing without error.
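A hedged sketch of how that silent failure could be surfaced, assuming the background script reaches the chat tab with browser.tabs.sendMessage (variable names are illustrative):

```js
// Sketch only: attach an explicit rejection handler so a message that never
// reaches the chat tab is logged instead of disappearing without a trace.
browser.tabs.sendMessage(createdTab3.id, {
  command: "api_send",
  prompt: fullPrompt,          // illustrative: the prompt built in the background
  action: "0",
  tabId: currentTabId,         // illustrative
  mailMessageId: mailMessageId3,
}).then(
  () => console.log("[ThunderAI] api_send delivered to tab " + createdTab3.id),
  (err) => console.error("[ThunderAI] api_send failed:", err)
);
```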
Well, that is indeed interesting.
Here's a test on my end doing the opposite thing - I've disabled TbSync and the CalDAV/CardDAV provider.
console-export-2024-10-2_16-20-19.txt
No change.
Further, in researching this, it turns out that those plugins are no longer necessary, since TB now supports CalDAV and CardDAV natively, so I've removed them.
Still doesn't work.
But I have had another idea. This is a really, really old profile. Like, "for as long as TB has existed" levels of old. So I started TB with -P, created a new profile, and installed just ThunderAI from file - and it works.
I fired up meld and there are a lot of differences.
This makes me wonder if it's some old, stale config parameter.
I'm going to take the old profile and the new profile and apply settings from old->new and see if it breaks. Hmmm...
Ok, thank you for the feedback. In my setup it worked even with those add-ons. So it's consistent.
@Mikilio, could it be the same for you as well?
Even better! I had 2 copies of TB running - one with each profile - so that I could manually copy over all the settings.
- Old profile: doesn't work
- New profile: works
- BOTH profiles running (2 windows): doesn't work in EITHER profile
100% reproducible.
Now, this is not a complaint - it is merely data. Since I plan to have the two windows side by side and copy things over setting by setting and will then delete the old profile, this will likely all work out just fine. Or, if I break something, I'll have a clean profile from which we can try and determine which setting causes the breakage.
I'll let you know once this task is complete, but this will likely be tomorrow.
Thank you for your help Matt. I'll report your findings to the Thunderbird Team.
I'm not sure if it is related, though I suspect it is, but notifications on new messages haven't worked for me for about a decade.
My optimism may have been... premature. I moved my ~/.thunderbird directory aside and let it create a new one. Still broken. At this point, I have no idea how I managed to make it work for the 5 minutes that it did...
I would scrap the idea that it is a deterministic error, because under the right conditions I can make it happen completely randomly as well. Maybe some weird async stuff is happening in the background. I will continue testing @micz's updated code on the weekend.
I think you are correct.
When I have some time, I'll pull this repo and start poking. But I have no idea when that will be.
If there is something specific people want me to test, I am happy to make time - I just don't have a lot of spare time to go spelunking right now.
The problem
Sometimes, without warning, the chat window fails to insert the prompt. The chat window itself remains functional, however. The error is not easily reproducible, as it seems to happen randomly.
Which Operating System are you using?
NixOS
Which version of Thunderbird are you using?
Thunderbird 128.1.1esr
Which version of ThunderAI has the issue?
v2.2.0pre4
Which integration are you using?
Ollama API
Anything in the Thunderbird console logs that might be useful?
Additional information
I have managed to catch the error in debug by setting a breakpoint at this line; the variable i18nStrings was set to: