JarbasAl opened 10 months ago
I am testing this on the latest raspOVOS headless image.
Stopped the original messagebus and started up this one. It has been running for about 10 minutes, asking things with voice, ovos-say-to, and ovos-simple-cli.
Seems to be working for now.
Does not work for me.
Works when starting it from the command line. Need to make a systemd service for it; replacing the command in the existing one does not work.
> Does not work for me.

Maybe a docker difference? It is working fine if I start it from the command line.
Strange. Changed the systemd service from `notify` to `simple`. It said it failed to start, but it seems to be running when I `ps` it. And it is working for me also.
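For reference, a unit along these lines is a plausible starting point — a sketch only: the description, install path, and restart policy below are assumptions, not the actual service file. `Type=simple` matches the change described above; since the server does not appear to send an sd_notify readiness signal, `Type=notify` would time out waiting for it.

```ini
# Hypothetical unit file; the Description and ExecStart path are assumptions.
[Unit]
Description=OVOS C++ message bus server
After=network.target

[Service]
Type=simple
# Type=notify requires the process to call sd_notify(READY=1); this server
# does not appear to do that, which would explain the "failed to start".
ExecStart=/usr/local/bin/ovos-bus-server
Restart=on-failure

[Install]
WantedBy=multi-user.target
```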
> Does not work for me.
>
> Maybe a docker difference? It is working fine if I start it from the command line.
Bus is not running in a container during my tests.
My systemd service is showing failed, but it is running; all skills work and everything.
Even with only one connection on the bus it crashes: the first one works (not always), but the second one crashes the server.
```
[2024-01-03 10:35:45] [connect] WebSocket Connection [::ffff:127.0.0.1]:59484 v13 "" /core 101
Connection opened.
There are now 1 open connections.
message handler on the main thread
Message payload:
[2024-01-03 10:35:45] [control] Control frame received with opcode 8
{"context":{"session":{"active_skills":[],"context":{"frame_stack":[],"timeout":120},"lang":"en-us","pipeline":["converse","padatious_high","adapt","common_qa","fallback_high","padatious_medium","fallback_medium","padatious_low","fallback_low"],"session_id":"default","site_id":"unknown","stt":{"config":{},"plugin_id":"ovos-stt-plugin-server"},"tts":{"config":{"voice":"ryan-high"},"plugin_id":"ovos-tts-plugin-server"},"utterance_states":{}}},"data":{},"type":"ovos.session.sync"}
[2024-01-03 10:35:45] [frame_header] Dispatching write containing 1 message(s) containing 2 header bytes and 2 payload bytes
[2024-01-03 10:35:45] [frame_header] Header Bytes:
[0] (2) 88 02
[2024-01-03 10:35:45] [frame_payload] Payload Bytes:
[0] (2) [8] 03 E8
[2024-01-03 10:35:45] [disconnect] Disconnect close local:[1000] remote:[1000]
message handler on the main thread
Message payload:
{"context":{"session":{"active_skills":[],"context":{"frame_stack":[],"timeout":120},"lang":"en-us","pipeline":["converse","padatious_high","adapt","common_qa","fallback_high","padatious_medium","fallback_medium","padatious_low","fallback_low"],"session_id":"default","site_id":"unknown","stt":{"config":{},"plugin_id":"ovos-stt-plugin-server"},"tts":{"config":{"voice":"ryan-high"},"plugin_id":"ovos-tts-plugin-server"},"utterance_states":{}}},"data":{"lang":"en-us","utterances":["how are you"]},"type":"recognizer_loop:utterance"}
Connection closed.
There are now 0 open connections.
[2024-01-03 10:35:48] [connect] WebSocket Connection [::ffff:127.0.0.1]:59496 v13 "" /core 101
Connection opened.
There are now 1 open connections.
[2024-01-03 10:35:48] [control] Control frame received with opcode 8
message handler on the main thread[2024-01-03 10:35:48] [frame_header] Dispatching write containing 1 message(s) containing 2 header bytes and 2 payload bytes
[2024-01-03 10:35:48] [frame_header] Header Bytes:
[0] (2) 88 02
[2024-01-03 10:35:48] [frame_payload] Payload Bytes:
[0] (2) [8] 03 E8
Message payload:
{"context":{"session":{"active_skills":[],"context":{"frame_stack":[],"timeout":120},"lang":"en-us","pipeline":["converse","padatious_high","adapt","common_qa","fallback_high","padatious_medium","fallback_medium","padatious_low","fallback_low"],"session_id":"default","site_id":"unknown","stt":{"config":{},"plugin_id":"ovos-stt-plugin-server"},"tts":{"config":{"voice":"ryan-high"},"plugin_id":"ovos-tts-plugin-server"},"utterance_states":{}}},"data":{"lang":"en-us","utterances":["how are you"]},"type":"recognizer_loop:utterance"}
terminate called after throwing an instance of 'websocketpp::exception'
what(): invalid state
fish: Job 1, './ovos-bus-server' terminated by signal SIGABRT (Abort)
```
Maybe the solution will be one thread per connection, detached, so that the main thread cannot be crashed.
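A minimal, self-contained sketch of that idea — everything here (the `MessageHandler` type, the `dispatchDetached` name, the string-only payload) is hypothetical, not the actual ovos-bus-server code:

```cpp
#include <chrono>
#include <exception>
#include <functional>
#include <iostream>
#include <string>
#include <thread>
#include <vector>

// Hypothetical handler type; the real server presumably also passes a
// connection handle, omitted here for brevity.
using MessageHandler = std::function<void(const std::string&)>;

void dispatchDetached(std::vector<MessageHandler> handlers, std::string payload) {
    // Copy handlers and payload into the thread so nothing dangles, then
    // detach: a slow or throwing handler no longer runs on the event loop.
    std::thread([handlers = std::move(handlers), payload = std::move(payload)]() {
        try {
            for (const auto& h : handlers) h(payload);
        } catch (const std::exception& e) {
            // An uncaught exception in a detached thread still calls
            // std::terminate, so it has to be caught (or logged) here.
            std::cerr << "handler error: " << e.what() << '\n';
        }
    }).detach();
}

int main() {
    std::vector<MessageHandler> handlers{
        [](const std::string& p) { std::cout << "got: " << p << '\n'; }};
    dispatchDetached(handlers, R"({"type":"recognizer_loop:utterance"})");
    // Give the detached thread a moment to run before the process exits.
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
}
```

Note that detaching by itself would not have prevented the SIGABRT above: an exception escaping a detached thread still terminates the process, so the try/catch does the real protective work in this sketch.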
@coderabbitai review
This update introduces a mutex lock using `std::lock_guard` within the `WebsocketServer::onMessage` method to ensure thread safety when handling messages. By locking `connectionListMutex` before iterating over `messageHandlers`, the change safeguards concurrent access, preventing potential data races or inconsistencies.
| Files | Change Summary |
|---|---|
| `server/WebsocketServer.cpp` | Added `std::lock_guard` to lock `connectionListMutex` in `WebsocketServer::onMessage` for thread-safe handling of `messageHandlers`. |
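In outline, the change amounts to something like the following — a simplified, self-contained sketch rather than the actual `server/WebsocketServer.cpp` (the real method works with websocketpp connection and message types, which are reduced to plain strings here):

```cpp
#include <functional>
#include <iostream>
#include <mutex>
#include <string>
#include <vector>

// Stripped-down stand-in for the real WebsocketServer: only the members
// relevant to the fix are kept, and handler arguments are simplified.
class WebsocketServer {
public:
    void addMessageHandler(std::function<void(const std::string&)> handler) {
        std::lock_guard<std::mutex> lock(connectionListMutex);
        messageHandlers.push_back(std::move(handler));
    }

    void onMessage(const std::string& payload) {
        // Hold connectionListMutex while iterating, so concurrent
        // connect/disconnect activity mutating shared state under the
        // same mutex cannot race with message handling.
        std::lock_guard<std::mutex> lock(connectionListMutex);
        for (const auto& handler : messageHandlers) {
            handler(payload);
        }
    }

private:
    std::mutex connectionListMutex;
    std::vector<std::function<void(const std::string&)>> messageHandlers;
};

int main() {
    WebsocketServer server;
    server.addMessageHandler([](const std::string& p) {
        std::cout << "handled: " << p << '\n';
    });
    server.onMessage(R"({"type":"ovos.session.sync"})");
}
```

The design point is simply that every reader and writer of the shared handler list agrees on the one mutex.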
In webs and locks, our code does dance,
A mutex now takes its stance.
Threads in harmony, safe and bright,
No race conditions in sight.
With `lock_guard`'s gentle embrace,
Our server runs, a smoother pace.
🌐🔒✨
Maybe fixes #2 / #3