jitsi-contrib / jitsi-helm

A helm chart to deploy Jitsi to Kubernetes
MIT License

Request for Help: WebSocket Connection Issue with Jitsi on Azure Kubernetes #112

Closed: Bananenbrot1 closed this issue 5 months ago

Bananenbrot1 commented 5 months ago

Hi everyone,

Firstly, I want to commend the excellent work on the Chart! It's been incredibly helpful.

I'm currently facing an issue with my Jitsi setup on an Azure Kubernetes Cluster where WebSocket support, intended for higher video quality, consistently fails. Here's the error I'm encountering:

BridgeChannel.js:92 WebSocket connection to 'wss://<url>/colibri-ws/10.244.8.10/a0f267fa8b539273/35fb3c8a?pwd=xxx' failed:

To provide some context, the Jitsi Videobridge (JVB) is exposed publicly via hostPort. Additionally, here’s the ingress configuration for web access:

ingress:
    enabled: true
    ingressClassName: azure-application-gateway
    annotations:
      #kubernetes.io/ingress.class: azure/application-gateway
      cert-manager.io/cluster-issuer: letsencrypt-prod
      kubernetes.io/tls-acme: "true"
    hosts:
      - host: host
        paths: ["/"]
    tls:
      - secretName: jitsi-web-tls
        hosts:
          - host

Could anyone suggest any modifications to the ingress settings or other configuration changes that might help resolve the WebSocket connectivity issue? I'm wondering if specific adjustments are necessary for the ingress to support WebSocket connections.

Thank you for your time and any assistance you can provide!

emrahcom commented 5 months ago

The Colibri WebSocket is a long-lived connection between the client and the server. If the ingress drops idle connections after a while, this breaks the signaling.

You may try increasing the proxy timeouts:

ingress:
  annotations:
    nginx.ingress.kubernetes.io/proxy-read-timeout: "7200"
    nginx.ingress.kubernetes.io/proxy-send-timeout: "7200"

Bananenbrot1 commented 5 months ago

Hi emrahcom,

Thanks for the suggestion! I tried updating my configuration as you recommended, but unfortunately, it didn't resolve the issue. The error persists, and it seems like the WebSocket connection isn't being established at all. Any other ideas or adjustments I could try?

Appreciate your help!

spijet commented 5 months ago

Hello @Bananenbrot1!

You can try using the current main branch snapshot with these values:

# jitsi-values.yaml
websockets:
  colibri:
    enabled: false
  xmpp:
    enabled: false

The current main snapshot sets all the environment variables required for the legacy data channels to function properly; these work well in setups where WebSockets fail.

If it works for you, you can keep using this version or revert to the latest release and add these options to your values:

# jitsi-values.yaml
extraCommonEnvs:
  ENABLE_SCTP: 'true'
  ENABLE_COLIBRI_WEBSOCKET: 'false'
  JVB_PREFER_SCTP: 'true'

emrahcom commented 5 months ago

It may also be related to COLIBRI_WEBSOCKET_REGEX, depending on the Jitsi version. https://github.com/jitsi/docker-jitsi-meet/blob/master/web/rootfs/defaults/meet.conf#L3C5-L3C28

It should be the following for this case:

COLIBRI_WEBSOCKET_REGEX=[0-9.]+
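
With this chart, environment variables like this are passed via `extraCommonEnvs` in the values file. A sketch of that (the regex value is the one suggested above; whether it is needed depends on your Jitsi image version):

```yaml
# jitsi-values.yaml
extraCommonEnvs:
  # Allow the pod-IP-based hostnames that appear in the colibri-ws URLs,
  # e.g. wss://<host>/colibri-ws/10.244.8.10/...
  COLIBRI_WEBSOCKET_REGEX: "[0-9.]+"
```

The web pods need a restart afterwards so the nginx config inside the container is re-rendered.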

Bananenbrot1 commented 5 months ago

Hi @spijet,

Thanks for the quick response! I implemented the extraCommonEnvs as you suggested, and it indeed resolved the error. Now, I can successfully receive and stream videos at 360p, which is a small but appreciated improvement 👍.

However, I'm hesitant about using a legacy version since I'm aiming to build a clean, modern stack. Do you have any insights on how I might enable the websockets to function properly in this setup?

Do you think upgrading the chart to the head of main would solve the issue?

Hi @emrahcom,

Thank you as well for your prompt reply. I tried setting the environment variable as you advised, but unfortunately, there's been no change—I'm still seeing the same error message. Any other suggestions?

Thanks again for all the help!

spijet commented 5 months ago

It's "legacy" in the sense that it predates WebSocket support. IIRC, meet.jit.si currently uses both WSS and REST/SCTP to maximize support across different client browsers and apps.
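
For reference, replicating that dual setup with this chart would look roughly like this (a sketch, assuming the chart release in use supports both the `websockets` block and the `ENABLE_SCTP` variable):

```yaml
# jitsi-values.yaml
websockets:
  colibri:
    enabled: true   # WSS bridge channel
  xmpp:
    enabled: true   # WSS XMPP signalling

extraCommonEnvs:
  ENABLE_SCTP: 'true'   # keep SCTP data channels available as well
```

The idea being that clients which can't establish the WSS connection can still fall back to the SCTP data channel.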

As for the 360p video resolution — make sure that you're not looking at the video feed from mobile clients, as they're known to limit their video upload to 360p max, in order to conserve traffic/bandwidth and go easier on the battery (see https://github.com/jitsi/jitsi-meet/issues/5808#issuecomment-700532256 for details).

Can you please share the full error text from the browser console and complete contents of your chart values, with sensitive info (like usernames/passwords/urls/ips) masked?

Bananenbrot1 commented 5 months ago

Hey @spijet

sure! Here is our values.yaml file.

# Default values for jitsi-meet.
# This is a YAML-formatted file.
# Declare variables to be passed into your templates.

global:
  # Set your cluster's DNS domain here.
  # "cluster.local" should work for most environments.
  # Set to "" to disable the use of FQDNs (default in older chart versions).
  clusterDomain: cluster.local
  podLabels: {}
  podAnnotations: {}
  releaseSecretsOverride:
    enabled: false
    #Support environment variables from pre-created secrets, such as 1Password operator
    #extraEnvFrom:
    #  - secretRef:
    #      name: '{{ include "prosody.fullname" . }}-overrides'
    #      optional: true

imagePullSecrets: []
nameOverride: ""
fullnameOverride: ""

enableAuth: false
enableGuests: true
# Where Jitsi Web UI is made available
# such as jitsi.example.com
publicURL: "pubUrl"

tz: Europe/Amsterdam

image:
  pullPolicy: IfNotPresent

## WebSocket configuration:
#
#  Both Colibri and XMPP WebSockets are disabled by default,
#  since some LoadBalancer / Reverse Proxy setups can't pass
#  WebSocket connections properly, which might result in breakage
#  for some clients.
#
#  Enable both Colibri and XMPP WebSockets to replicate the current
#  upstream `meet.jit.si` setup. Keep both disabled to replicate
#  older setups which might be more compatible in some cases.
websockets:
  ## Colibri (JVB signalling):
  colibri:
    enabled: true
  ## XMPP (Prosody signalling):
  xmpp:
    enabled: true

web:
  replicaCount: 1
  image:
    repository: jitsi/web

  custom:
    contInit:
      _10_config: ""
    defaults:
      _default: ""
      _ffdhe2048_txt: ""
      _interface_config_js: ""
      _meet_conf: ""
      _nginx_conf: ""
      _settings_config_js: ""
      _ssl_conf: ""
      _system_config_js: ""

  extraEnvs: {}
  service:
    type: ClusterIP
    port: 80
    ## If you want to expose the Jitsi Web service directly
    #  (bypassing the Ingress Controller), use this:
    #
    # type: NodePort
    # nodePort: 30580
    # port: 80
    externalIPs: []

  ingress:
    enabled: true
    ingressClassName: azure-application-gateway
    annotations:
      #kubernetes.io/ingress.class: azure/application-gateway
      cert-manager.io/cluster-issuer: letsencrypt-prod
      kubernetes.io/tls-acme: "true"
      nginx.ingress.kubernetes.io/proxy-read-timeout: "7200"
      nginx.ingress.kubernetes.io/proxy-send-timeout: "7200"
    hosts:
      - host: host
        paths: ["/"]
    tls:
      - secretName: jitsi-web-tls
        hosts:
          - host

  # Useful for ingresses that don't support http-to-https redirect by themselves (namely: GKE).
  httpRedirect: false

  # When TLS termination by the ingress is not wanted, enable this and set web.service.type=LoadBalancer
  httpsEnabled: false

  ## Resolver IP for nginx.
  #
  #  Starting with version `stable-8044`, the web container can
  #  auto-detect the nameserver from /etc/resolv.conf.
  #  Use this option if you want to override the nameserver IP.
  #
  # resolverIP: 10.43.0.10

  livenessProbe:
    httpGet:
      path: /
      port: 80
  readinessProbe:
    httpGet:
      path: /
      port: 80

  podLabels: {}
  podAnnotations: {}
  podSecurityContext:
    {}
    # fsGroup: 2000

  securityContext:
    {}
    # capabilities:
    #   drop:
    #   - ALL
    # readOnlyRootFilesystem: true
    # runAsNonRoot: true
    # runAsUser: 1000

  resources:
    {}
    # We usually recommend not to specify default resources and to leave this as a conscious
    # choice for the user. This also increases chances charts run on environments with little
    # resources, such as Minikube. If you do want to specify resources, uncomment the following
    # lines, adjust them as necessary, and remove the curly braces after 'resources:'.
    # limits:
    #   cpu: 100m
    #   memory: 128Mi
    # requests:
    #   cpu: 100m
    #   memory: 128Mi

  nodeSelector: {}

  tolerations: []

  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
          - matchExpressions:
              - key: agentpool
                operator: In
                values:
                  - nodepool

jicofo:
  replicaCount: 1
  image:
    repository: jitsi/jicofo

  custom:
    contInit:
      _10_config: ""
    defaults:
      _jicofo_conf: ""
      _logging_properties: ""

  xmpp:
    password: "xxx"
    componentSecret:

  livenessProbe:
    tcpSocket:
      port: 8888

  readinessProbe:
    tcpSocket:
      port: 8888

  podLabels: {}
  podAnnotations: {}
  podSecurityContext: {}
  securityContext: {}
  resources: {}
  nodeSelector: {}
  tolerations: []
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
          - matchExpressions:
              - key: agentpool
                operator: In
                values:
                  - nodepool
  extraEnvs: {}

jvb:
  useHostPort: true
  UDPPort: 32768

  publicIPs:
    publicIPs
  replicaCount: 1
  image:
    repository: jitsi/jvb

  xmpp:
    user: jvb
    password: "xxx"

  tolerations:
    - key: "type"
      operator: "Equal"
      value: "jvb"
      effect: "NoSchedule"

  affinity:
    podAntiAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        - labelSelector:
            matchExpressions:
              - key: app.kubernetes.io/component
                operator: In
                values:
                  - jvb
          topologyKey: "kubernetes.io/hostname"
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
          - matchExpressions:
              - key: type
                operator: In
                values:
                  - jvb

  ## Use a STUN server to help some users punch through some
  #  especially nasty NAT setups. Usually makes sense for P2P calls.
  stunServers: "meet-jit-si-turnrelay.jitsi.net:443"

  metrics:
    enabled: true
    image:
      repository: docker.io/systemli/prometheus-jitsi-meet-exporter
      tag: 1.2.3
      pullPolicy: IfNotPresent

    prometheusAnnotations: false
    serviceMonitor:
      enabled: true
      selector:
        release: monitoring
      interval: 10s
      # honorLabels: false

    grafanaDashboards:
      enabled: true
      labels:
        grafana_dashboard: "1"
      annotations: {}
octo:
  enabled: true

jigasi:
  ## Enabling Jigasi will allow regular SIP clients to join Jitsi meetings
  ## or nearly real-time transcription.
  enabled: false

  ## Use external Jigasi installation.
  ## This setting skips the creation of Jigasi Deployment altogether,
  ## instead creating just the config secret and enabling services.
  ## Defaults to disabled (use bundled Jigasi).
  useExternalJigasi: false

  replicaCount: 1
  image:
    repository: jitsi/jigasi

  breweryMuc: jigasibrewery

  ## jigasi XMPP user credentials:
  xmpp:
    user: jigasi
    password:

  livenessProbe:
    tcpSocket:
      port: 8788
  readinessProbe:
    tcpSocket:
      port: 8788

  podLabels: {}
  podAnnotations: {}
  podSecurityContext: {}
  securityContext: {}
  resources: {}
  nodeSelector: {}
  tolerations: []
  affinity: {}
  extraEnvs: {}

jibri:
  ## Enabling Jibri will allow users to record
  ## and/or stream their meetings (e.g. to YouTube).
  enabled: false

  ## Use external Jibri installation.
  ## This setting skips the creation of Jibri Deployment altogether,
  ## instead creating just the config secret
  ## and enabling recording/streaming services.
  ## Defaults to disabled (use bundled Jibri).
  useExternalJibri: false

  ## Enable single-use mode for Jibri.
  ## With this setting enabled, every Jibri instance
  ## will become "expired" after being used once (successfully or not)
  ## and cleaned up (restarted) by Kubernetes.
  ##
  ## Note that detecting expired Jibri, restarting and registering it
  ## takes some time, so you'll have to make sure you have enough
  ## instances at your disposal.
  ## You might also want to make LivenessProbe fail faster.
  singleUseMode: false

  ## Enable recording service.
  ## Set this to true/false to enable/disable local recordings.
  ## Defaults to enabled (allow local recordings).
  recording: true

  ## Enable livestreaming service.
  ## Set this to true/false to enable/disable live streams.
  ## Defaults to disabled (livestreaming is forbidden).
  livestreaming: false

  ## Enable multiple Jibri instances.
  ## If enabled (i.e. set to 2 or more), each Jibri instance
  ## will get an ID assigned to it, based on pod name.
  ## Multiple replicas are recommended for single-use mode.
  replicaCount: 1

  ## Enable persistent storage for local recordings.
  ## If disabled, jibri pod will use a transient
  ## emptyDir-backed storage instead.
  persistence:
    enabled: false
    size: 4Gi
    ## Set this to existing PVC name if you have one.
    existingClaim:
    storageClassName:

  shm:
    ## Set to true to enable "/dev/shm" mount.
    ## May be required by built-in Chromium.
    enabled: false
    ## If "true", will use host's shared memory dir,
    ## and if "false" — an emptyDir mount.
    # useHost: false
    # size: 256Mi

  ## Configure the update strategy for Jibri deployment.
  ## This may be useful depending on your persistence settings,
  ## e.g. when you use ReadWriteOnce PVCs.
  ## Default strategy is "RollingUpdate", which keeps
  ## the old instances up until the new ones are ready.
  # strategy:
  #   type: RollingUpdate

  image:
    repository: jitsi/jibri

  podLabels: {}
  podAnnotations: {}

  breweryMuc: jibribrewery
  timeout: 90

  ## jibri XMPP user credentials:
  xmpp:
    user: jibri
    password:

  ## recorder XMPP user credentials:
  recorder:
    user: recorder
    password:

  livenessProbe:
    initialDelaySeconds: 5
    periodSeconds: 5
    failureThreshold: 2
    exec:
      command:
        - /bin/bash
        - "-c"
        - >-
          curl -sq localhost:2222/jibri/api/v1.0/health
          | jq '"\(.status.health.healthStatus) \(.status.busyStatus)"'
          | grep -qP 'HEALTHY (IDLE|BUSY)'

  readinessProbe:
    initialDelaySeconds: 5
    periodSeconds: 5
    failureThreshold: 2
    exec:
      command:
        - /bin/bash
        - "-c"
        - >-
          curl -sq localhost:2222/jibri/api/v1.0/health
          | jq '"\(.status.health.healthStatus) \(.status.busyStatus)"'
          | grep -qP 'HEALTHY (IDLE|BUSY)'

  extraEnvs: {}
  custom:
    contInit:
      _10_config: ""
    defaults:
      _autoscaler_sidecar_config: ""
      _jibri_conf: ""
      _logging_properties: ""
      _xorg_video_dummy_conf: ""

serviceAccount:
  # Specifies whether a service account should be created
  create: true
  # Annotations to add to the service account
  annotations: {}
  # The name of the service account to use.
  # If not set and create is true, a name is generated using the fullname template
  name:

xmpp:
  domain: meet.jitsi
  authDomain:
  mucDomain:
  internalMucDomain:
  guestDomain:

extraCommonEnvs:
  START_VIDEO_MUTED: "50"
  # COLIBRI_WEBSOCKET_REGEX: "[0-9.]+"
  # ENABLE_SCTP: "true"
  # ENABLE_COLIBRI_WEBSOCKET: "false"
  # JVB_PREFER_SCTP: "true"

prosody:
  enabled: true
  useExternalProsody: false
  server:
  extraEnvFrom:
    # - secretRef:
    #     name: '{{ include "prosody.fullname" . }}-jibri'
    - secretRef:
        name: '{{ include "prosody.fullname" . }}-jicofo'
    # - secretRef:
    #     name: '{{ include "prosody.fullname" . }}-jigasi'
    - secretRef:
        name: '{{ include "prosody.fullname" . }}-jvb'
    - configMapRef:
        name: '{{ include "prosody.fullname" . }}-common'
  image:
    repository: jitsi/prosody
  # service:
  #   ports:
  # If Prosody c2s is needed on a private net outside the cluster
  #     xmppc2snodePort: 30522
  custom:
    contInit:
      _10_config: ""
    defaults:
      _prosody_cfg_lua: ""
      _saslauthd_conf: ""
      _jitsi_meet_cfg_lua: ""
    tag: "stable-9111"
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
          - matchExpressions:
              - key: agentpool
                operator: In
                values:
                  - nodepool

And here is the browser console log:

LocalStatsCollector.js:169 The AudioContext was not allowed to start. It must be resumed (or created) after a user gesture on the page. https://goo.gl/7K7WLu
Bi.connectAudioContext @ LocalStatsCollector.js:169
Logger.js:155 2024-04-17T13:29:02.484Z [features/base/redux] <Object.persistState>:  redux state persisted. 9f2be0c6182ebba582d855acb22e5dd1 -> 7163dd433209cfe005396baf50e2ddd1
Logger.js:155 2024-04-17T13:29:02.521Z [JitsiMeetJS.ts] <Object.init>:  This appears to be chrome, ver: 123.0.0.0
Logger.js:155 2024-04-17T13:29:02.521Z [modules/RTC/RTCUtils.js] <Nr.init>:  Disable AP: false
Logger.js:155 2024-04-17T13:29:02.522Z [modules/RTC/RTCUtils.js] <Nr.init>:  Stereo: false
Logger.js:155 2024-04-17T13:29:02.523Z [features/base/lib-jitsi-meet] lib-jitsi-meet version:456e45ee
Logger.js:155 2024-04-17T13:29:02.524Z [features/base/media] Start muted: audio, 
Logger.js:155 2024-04-17T13:29:02.525Z [features/base/media] Start audio only set to false
Logger.js:155 2024-04-17T13:29:02.525Z [features/analytics] Initialized 0 analytics handlers
Logger.js:155 2024-04-17T13:29:02.623Z [index.web] <HTMLDocument.<anonymous>>:  (TIME) document ready:   875.1000000238419
lang/countries.json:1 Failed to load resource: the server responded with a status of 404 (Not Found)
Logger.js:155 2024-04-17T13:29:02.871Z [modules/RTC/RTCUtils.js] Audio output device set to default
test:55 Service worker registered. ServiceWorkerRegistration
Logger.js:155 2024-04-17T13:29:02.924Z [modules/RTC/RTCUtils.js] <ud>:  list of media devices has changed: Array(8)
Logger.js:155 2024-04-17T13:29:02.936Z [modules/RTC/RTCUtils.js] <Nr.<anonymous>>:  Got media constraints:  {"video":{"height":{"ideal":720},"width":{"ideal":1280},"facingMode":"user"},"audio":{"autoGainControl":true,"echoCancellation":true,"noiseSuppression":true}}
injected.js:4 Retrieving "b5x-stateful-inline-icon" flag errored: timed out - falling back
qy @ injected.js:4
Logger.js:155 2024-04-17T13:29:04.479Z [modules/RTC/RTCUtils.js] onUserMediaSuccess
Logger.js:155 2024-04-17T13:29:04.489Z [features/base/redux] <Object.persistState>:  redux state persisted. 7163dd433209cfe005396baf50e2ddd1 -> 19c15861d98895c0aa9e2d7d8c9746e4
Logger.js:155 2024-04-17T13:29:04.505Z [features/base/media] Sync audio track muted state to muted
Logger.js:155 2024-04-17T13:29:04.508Z [modules/RTC/JitsiLocalTrack.js] Mute LocalTrack[1,audio]: true
Logger.js:155 2024-04-17T13:29:04.544Z [features/base/connection] Using service URL https://url/http-bind
Logger.js:155 2024-04-17T13:29:04.546Z [modules/xmpp/moderator.js] <new is>:  Using xmpp for conference requests.
Logger.js:155 2024-04-17T13:29:04.547Z [modules/xmpp/xmpp.js] <ja._initStrophePlugins>:  P2P STUN servers:  Array(1)
Logger.js:155 2024-04-17T13:29:04.554Z [modules/xmpp/xmpp.js] <ja.connectionHandler>:  (TIME) Strophe connecting:    2804.600000023842
Logger.js:155 2024-04-17T13:29:05.154Z [modules/xmpp/XmppConnection.js] <Jo._maybeEnableStreamResume>:  Stream resume enabled, but WebSockets are not enabled
r @ Logger.js:155
Logger.js:155 2024-04-17T13:29:05.154Z [modules/xmpp/strophe.ping.js] <$o.startInterval>:  XMPP pings will be sent every 10000 ms
Logger.js:155 2024-04-17T13:29:05.154Z [modules/xmpp/xmpp.js] <ja.connectionHandler>:  (TIME) Strophe connected:     3405.899999976158
Logger.js:155 2024-04-17T13:29:05.154Z [modules/xmpp/xmpp.js] <ja.connectionHandler>:  My Jabber ID: g2ulntb-ta9r6-gikbncjtar@meet.jitsi/wvUoCJIvf2xN
Logger.js:155 2024-04-17T13:29:05.174Z [modules/RTC/CodecSelection.js] <new Ja>:  Codec preference order for jvb connection is vp9,vp8,h264,av1
Logger.js:155 2024-04-17T13:29:05.175Z [modules/RTC/CodecSelection.js] <new Ja>:  Codec preference order for p2p connection is vp9,vp8,h264,av1
Logger.js:155 2024-04-17T13:29:05.175Z [modules/xmpp/xmpp.js] <ja.createRoom>:  JID g2ulntb-ta9r6-gikbncjtar@meet.jitsi/wvUoCJIvf2xN using MUC nickname 0c594c68
Logger.js:155 2024-04-17T13:29:05.175Z [modules/xmpp/ChatRoom.js] <new xs>:  Joining MUC as test@muc.meet.jitsi/0c594c68
Logger.js:155 2024-04-17T13:29:05.176Z [modules/statistics/AvgRTPStatsReporter.js] <new Oh>:  Avg RTP stats will be calculated every 15 samples
Logger.js:155 2024-04-17T13:29:05.176Z [JitsiConference.js] <new du>:  backToP2PDelay: 5
Logger.js:155 2024-04-17T13:29:05.176Z [JitsiConference.js] <new du>:  End-to-End Encryption is supported
Logger.js:155 2024-04-17T13:29:05.178Z [JitsiConference.js] <du._doReplaceTrack>:  _doReplaceTrack - no JVB JingleSession
Logger.js:155 2024-04-17T13:29:05.178Z [JitsiConference.js] <du._doReplaceTrack>:  _doReplaceTrack - no P2P JingleSession
Logger.js:155 2024-04-17T13:29:05.185Z [modules/xmpp/moderator.js] <conferenceRequestSent.Promise.then.conferenceRequestSent>:  Sending conference request over XMPP to focus.meet.jitsi
Logger.js:155 2024-04-17T13:29:05.188Z [conference.js] Initialized with 1 local tracks
Logger.js:155 2024-04-17T13:29:05.228Z [modules/xmpp/strophe.jingle.js] getting turn credentials with extdisco:2 failed, trying extdisco:1
r @ Logger.js:155
Logger.js:155 2024-04-17T13:29:05.234Z [modules/xmpp/xmpp.js] <ja._maybeSendDeploymentInfoStat>:  {"region":"all","userRegion":"all","id":"deployment_info"}
Logger.js:155 2024-04-17T13:29:05.368Z [modules/xmpp/moderator.js] <is._handleSuccess>:  Adding focus JID: focus@auth.meet.jitsi
Logger.js:155 2024-04-17T13:29:05.368Z [modules/xmpp/moderator.js] <is._handleSuccess>:  Authentication enabled: false
Logger.js:155 2024-04-17T13:29:05.369Z [modules/xmpp/moderator.js] <is._handleSuccess>:  Sip gateway enabled: undefined
Logger.js:155 2024-04-17T13:29:05.369Z [modules/xmpp/moderator.js] <is._handleSuccess>:  Conference-request successful, ready to join the MUC.
Logger.js:155 2024-04-17T13:29:05.369Z [modules/xmpp/strophe.jingle.js] getting turn credentials failed
r @ Logger.js:155
Logger.js:155 2024-04-17T13:29:05.369Z [modules/xmpp/strophe.jingle.js] is mod_turncredentials or similar installed and configured?
r @ Logger.js:155
Logger.js:155 2024-04-17T13:29:05.404Z [modules/xmpp/ChatRoom.js] <xs.onPresence>:  (TIME) MUC join started:     3656
Logger.js:155 2024-04-17T13:29:05.437Z [modules/xmpp/ChatRoom.js] <xs.onPresence>:  entered test@muc.meet.jitsi/focus Object
Logger.js:155 2024-04-17T13:29:05.438Z [modules/version/ComponentsVersions.js] Got focus version: 1.0.1057
Logger.js:155 2024-04-17T13:29:05.440Z [JitsiConference.js] <du._updateProperties>:  Audio unmute permissions set by Jicofo to false
Logger.js:155 2024-04-17T13:29:05.440Z [JitsiConference.js] <du._updateProperties>:  Video unmute permissions set by Jicofo to false
Logger.js:155 2024-04-17T13:29:05.441Z [modules/xmpp/ChatRoom.js] <xs.onPresence>:  Jicofo supports restart by terminate: true
Logger.js:155 2024-04-17T13:29:05.441Z [conference.js] <o.<anonymous>>:  My role changed, new role: none
Logger.js:155 2024-04-17T13:29:05.442Z [modules/xmpp/ChatRoom.js] <xs.onPresence>:  (TIME) MUC joined:   3693.699999988079
Logger.js:155 2024-04-17T13:29:05.455Z [modules/xmpp/ChatRoom.js] <xs.onMessage>:  Subject is changed to 
Logger.js:155 2024-04-17T13:29:05.604Z [conference.js] <o.<anonymous>>:  My role changed, new role: moderator
Logger.js:155 2024-04-17T13:29:12.849Z [modules/xmpp/ChatRoom.js] <xs.onPresence>:  entered test@muc.meet.jitsi/0face48b Object
Logger.js:155 2024-04-17T13:29:12.864Z [conference.js] <o.<anonymous>>:  USER 0face48b connected: _r
Logger.js:155 2024-04-17T13:29:12.864Z [JitsiConference.js] <du._maybeStartOrStopP2P>:  Will start P2P with: test@muc.meet.jitsi/0face48b
Logger.js:155 2024-04-17T13:29:12.865Z [JitsiConference.js] <du._startP2PSession>:  Created new P2P JingleSession test@muc.meet.jitsi/0c594c68 test@muc.meet.jitsi/0face48b
Logger.js:155 2024-04-17T13:29:12.877Z [modules/RTC/TraceablePeerConnection.js] <new Tl>:  Using RTCRtpTransceiver#setCodecPreferences for codec selection
Logger.js:155 2024-04-17T13:29:12.877Z [modules/RTC/TraceablePeerConnection.js] <new Tl>:  Create new TPC[id=1,type=P2P]
Logger.js:155 2024-04-17T13:29:12.877Z [JitsiConference.js] <du._startP2PSession>:  Starting CallStats for P2P connection...
Logger.js:155 2024-04-17T13:29:12.877Z [modules/RTC/TraceablePeerConnection.js] <Tl.addTrack>:  TPC[id=1,type=P2P] adding LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:12.971Z [modules/xmpp/strophe.jingle.js] <Ta.onJingle>:  Found a JSON-encoded element in session-initiate, translating to standard Jingle.
Logger.js:155 2024-04-17T13:29:12.971Z [modules/xmpp/strophe.jingle.js] <Ta.onJingle>:  (TIME) received session-initiate:    11222.300000011921
Logger.js:155 2024-04-17T13:29:12.973Z [modules/RTC/TraceablePeerConnection.js] <new Tl>:  Using RTCRtpTransceiver#setCodecPreferences for codec selection
Logger.js:155 2024-04-17T13:29:12.973Z [modules/RTC/TraceablePeerConnection.js] <new Tl>:  Create new TPC[id=2,type=JVB]
Logger.js:155 2024-04-17T13:29:12.973Z [JitsiConference.js] <du._setBridgeChannel>:  SCTP: offered=false, prefered=false
Logger.js:155 2024-04-17T13:29:12.973Z [JitsiConference.js] Using colibri-ws url wss://url/colibri-ws/10.244.8.12/e615b440c76e4752/0c594c68?pwd=5oo87muk0ppjcmqpbm1gkruca4
Logger.js:155 2024-04-17T13:29:12.975Z [modules/RTC/TraceablePeerConnection.js] <Tl.addTrack>:  TPC[id=2,type=JVB] adding LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:12.978Z [JitsiConference.js] <du._acceptJvbIncomingCall>:  Starting CallStats for JVB connection...
Logger.js:155 2024-04-17T13:29:13.008Z [modules/xmpp/JingleSessionPC.js] <ya.sendSessionAccept>:  JingleSessionPC[session=JVB,initiator=false,sid=cadesqmvo7jq3] Sending session-accept
Logger.js:155 2024-04-17T13:29:13.012Z [modules/xmpp/JingleSessionPC.js] <peerconnection.oniceconnectionstatechange>:  (TIME) ICE checking JVB:  11263.300000011921
BridgeChannel.js:92 WebSocket connection to 'wss://url/colibri-ws/10.244.8.12/e615b440c76e4752/0c594c68?pwd=5oo87muk0ppjcmqpbm1gkruca4' failed: 
_initWebSocket @ BridgeChannel.js:92
Logger.js:155 2024-04-17T13:29:13.078Z [modules/RTC/BridgeChannel.js] <e.onclose>:  Channel closed: 1006 
r @ Logger.js:155
Logger.js:155 2024-04-17T13:29:13.089Z [modules/xmpp/JingleSessionPC.js] <ya.sendIceCandidates>:  JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] sendIceCandidates [{"candidate":"candidate:2349445023 1 udp 2122194687 192.168.64.1 60101 typ host generation 0 ufrag iEQ4 network-id 1","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:745614553 1 udp 2122063615 192.168.178.99 53495 typ host generation 0 ufrag iEQ4 network-id 3 network-cost 10","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:744239317 1 udp 2122265343 fdea:e860:e46e:6743:1ce8:336a:106d:393c 54919 typ host generation 0 ufrag iEQ4 network-id 2","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:3170537391 1 udp 2122131711 2a02:8109:9db3:8500:828:e3c7:edbf:902 61816 typ host generation 0 ufrag iEQ4 network-id 4 network-cost 10","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:4269648214 1 udp 1685855999 77.20.140.148 53495 typ srflx raddr 192.168.178.99 rport 53495 generation 0 ufrag iEQ4 network-id 3 network-cost 10","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:4073082119 1 tcp 1518214911 192.168.64.1 9 typ host tcptype active generation 0 ufrag iEQ4 network-id 1","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:1388236353 1 tcp 1518083839 192.168.178.99 9 typ host tcptype active generation 0 ufrag iEQ4 network-id 3 network-cost 10","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:1385419341 1 tcp 1518285567 fdea:e860:e46e:6743:1ce8:336a:106d:393c 9 typ host tcptype active generation 0 ufrag iEQ4 network-id 2","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"},{"candidate":"candidate:3258281271 1 tcp 1518151935 2a02:8109:9db3:8500:828:e3c7:edbf:902 9 typ host tcptype active generation 0 ufrag iEQ4 network-id 4 network-cost 
10","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"iEQ4"}]
Logger.js:155 2024-04-17T13:29:13.222Z [modules/xmpp/JingleSessionPC.js] <peerconnection.oniceconnectionstatechange>:  (TIME) ICE connected JVB:     11473.5
Logger.js:155 2024-04-17T13:29:13.265Z [modules/xmpp/strophe.jingle.js] <Ta.onJingle>:  Found a JSON-encoded element in source-add, translating to standard Jingle.
Logger.js:155 2024-04-17T13:29:13.399Z [modules/xmpp/JingleSessionPC.js] <ya.setVideoCodecs>:  JingleSessionPC[session=JVB,initiator=false,sid=cadesqmvo7jq3] setVideoCodecs: vp9,vp8,h264,av1
Logger.js:155 2024-04-17T13:29:13.399Z [modules/xmpp/JingleSessionPC.js] <ya.setSenderVideoConstraint>:  JingleSessionPC[session=JVB,initiator=false,sid=cadesqmvo7jq3] setSenderVideoConstraint: 2160, sourceName: 0c594c68-v0
Logger.js:155 2024-04-17T13:29:13.399Z [modules/xmpp/JingleSessionPC.js] <ya.setSenderVideoConstraint>:  JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] setSenderVideoConstraint: 2160, sourceName: 0c594c68-v0
Logger.js:155 2024-04-17T13:29:13.403Z [modules/xmpp/JingleSessionPC.js] JingleSessionPC[session=JVB,initiator=false,sid=cadesqmvo7jq3] Processing addRemoteStream
Logger.js:155 2024-04-17T13:29:13.410Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=2,type=JVB] Setting degradation preference [preference=maintain-framerate,track=LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:13.410Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=2,type=JVB] setting max height=2160,encodings=[{"active":true,"adaptivePtime":false,"networkPriority":"low","priority":"low","maxBitrate":1200000,"scaleResolutionDownBy":1,"scalabilityMode":"L3T3_KEY"},{"active":false,"adaptivePtime":false,"networkPriority":"low","priority":"low","maxBitrate":0},{"active":false,"adaptivePtime":false,"networkPriority":"low","priority":"low","maxBitrate":0}]
Logger.js:155 2024-04-17T13:29:13.414Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=1,type=P2P] Setting degradation preference [preference=maintain-framerate,track=LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:13.415Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=1,type=P2P] setting max height=2160,encodings=[{"active":true,"adaptivePtime":false,"maxBitrate":1200000,"networkPriority":"low","priority":"low","scaleResolutionDownBy":1}]
Logger.js:155 2024-04-17T13:29:13.415Z [modules/RTC/TraceablePeerConnection.js] <Tl._remoteTrackAdded>:  TPC[id=2,type=JVB] Received track event for remote stream[id=0face48b-video-0-1,type=video]
Logger.js:155 2024-04-17T13:29:13.416Z [modules/RTC/TraceablePeerConnection.js] <Tl._createRemoteTrack>:  TPC[id=2,type=JVB] creating remote track[endpoint=0face48b,ssrc=4181336143,type=video,sourceName=0face48b-v0]
Logger.js:155 2024-04-17T13:29:13.417Z [modules/xmpp/JingleSessionPC.js] <ya.setReceiverVideoConstraint>:  JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] setReceiverVideoConstraint - constraints: {}
Logger.js:155 2024-04-17T13:29:13.421Z [modules/connectivity/TrackStreamingStatus.ts] <new tl>:  RtcMuteTimeout set to: 10000
Logger.js:155 2024-04-17T13:29:13.445Z [modules/xmpp/JingleSessionPC.js] JingleSessionPC[session=JVB,initiator=false,sid=cadesqmvo7jq3] addRemoteStream - OK
Logger.js:155 2024-04-17T13:29:13.466Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=2,type=JVB] Setting degradation preference [preference=maintain-framerate,track=LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:13.466Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=2,type=JVB] setting max height=2160,encodings=[{"active":true,"adaptivePtime":false,"maxBitrate":1200000,"networkPriority":"low","priority":"low","scalabilityMode":"L3T3_KEY","scaleResolutionDownBy":1},{"active":false,"adaptivePtime":false,"maxBitrate":0,"networkPriority":"low","priority":"low"},{"active":false,"adaptivePtime":false,"maxBitrate":0,"networkPriority":"low","priority":"low"}]
Logger.js:155 2024-04-17T13:29:13.539Z [modules/xmpp/JingleSessionPC.js] JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] Got RESULT for "session-initiate"
Logger.js:155 2024-04-17T13:29:13.539Z [JitsiConference.js] <du.onCallAccepted>:  P2P setAnswer
Logger.js:155 2024-04-17T13:29:13.543Z [modules/xmpp/JingleSessionPC.js] <ya.setReceiverVideoConstraint>:  JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] setReceiverVideoConstraint - constraints: {}
Logger.js:155 2024-04-17T13:29:13.545Z [modules/RTC/TraceablePeerConnection.js] <Tl._remoteTrackAdded>:  TPC[id=1,type=P2P] Received track event for remote stream[id=0face48b-video-0-2,type=video]
Logger.js:155 2024-04-17T13:29:13.545Z [modules/RTC/TraceablePeerConnection.js] <Tl._createRemoteTrack>:  TPC[id=1,type=P2P] creating remote track[endpoint=0face48b,ssrc=1281408767,type=video,sourceName=0face48b-v0]
Logger.js:155 2024-04-17T13:29:13.545Z [JitsiConference.js] <du.onRemoteTrackAdded>:  Trying to add remote P2P track, when not in P2P - IGNORED
Logger.js:155 2024-04-17T13:29:13.594Z [modules/RTC/JitsiRemoteTrack.js] <cl._playCallback>:  (TIME) Render video:   11845.300000011921
Logger.js:155 2024-04-17T13:29:13.594Z [modules/RTC/JitsiRemoteTrack.js] <cl._playCallback>:  (TIME) TTFM video:     2771.7999999523163
Logger.js:155 2024-04-17T13:29:13.686Z [modules/RTC/JitsiRemoteTrack.js] <cl.setMute>:  Mute RemoteTrack[userID: 0face48b, type: video, ssrc: 1281408767, p2p: true, sourceName: 0face48b-v0, status: {readyState: live, muted: false, enabled: true}]: false
Logger.js:155 2024-04-17T13:29:13.686Z [modules/RTC/JitsiRemoteTrack.js] <cl.setMute>:  Mute RemoteTrack[userID: 0face48b, type: video, ssrc: 4181336143, p2p: false, sourceName: 0face48b-v0, status: {readyState: live, muted: false, enabled: true}]: false
Logger.js:155 2024-04-17T13:29:13.699Z [JitsiConference.js] <du.onTransportInfo>:  P2P addIceCandidates
Logger.js:155 2024-04-17T13:29:13.704Z [modules/xmpp/JingleSessionPC.js] <peerconnection.oniceconnectionstatechange>:  (TIME) ICE checking P2P:  11955.899999976158
Logger.js:155 2024-04-17T13:29:13.705Z [modules/xmpp/JingleSessionPC.js] <peerconnection.oniceconnectionstatechange>:  (TIME) ICE connected P2P:     11956.800000011921
Logger.js:155 2024-04-17T13:29:13.705Z [JitsiConference.js] <du._setP2PStatus>:  Peer to peer connection established!
Logger.js:155 2024-04-17T13:29:13.707Z [modules/xmpp/JingleSessionPC.js] <ya.setSenderVideoConstraint>:  JingleSessionPC[session=JVB,initiator=false,sid=cadesqmvo7jq3] setSenderVideoConstraint: 2160, sourceName: 0c594c68-v0
Logger.js:155 2024-04-17T13:29:13.708Z [modules/xmpp/JingleSessionPC.js] <ya.setSenderVideoConstraint>:  JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] setSenderVideoConstraint: 2160, sourceName: 0c594c68-v0
Logger.js:155 2024-04-17T13:29:13.710Z [JitsiConference.js] <du._removeRemoteTracks>:  Removing remote JVB track: RemoteTrack[userID: 0face48b, type: video, ssrc: 4181336143, p2p: false, sourceName: 0face48b-v0, status: {readyState: live, muted: false, enabled: true}]
Logger.js:155 2024-04-17T13:29:13.723Z [JitsiConference.js] <du._addRemoteTracks>:  Adding remote P2P track: RemoteTrack[userID: 0face48b, type: video, ssrc: 1281408767, p2p: true, sourceName: 0face48b-v0, status: {readyState: live, muted: false, enabled: true}]
Logger.js:155 2024-04-17T13:29:13.732Z [modules/connectivity/TrackStreamingStatus.ts] <new tl>:  RtcMuteTimeout set to: 10000
Logger.js:155 2024-04-17T13:29:13.745Z [JitsiConference.js] <du._suspendMediaTransferForJvbConnection>:  Suspending media transfer over the JVB connection...
Logger.js:155 2024-04-17T13:29:13.745Z [modules/RTC/TPCUtils.js] <yl.setMediaTransferActive>:  TPC[id=2,type=JVB] Suspending media transfer.
Logger.js:155 2024-04-17T13:29:13.745Z [JitsiConference.js] <du._onIceConnectionEstablished>:  Starting remote stats with p2p connection
Logger.js:155 2024-04-17T13:29:13.745Z [modules/statistics/statistics.js] <cr.sendAnalyticsAndLog>:  {"type":"operational","action":"established","source":"p2p","attributes":{"initiator":true}}
Logger.js:155 2024-04-17T13:29:13.763Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=2,type=JVB] Setting degradation preference [preference=maintain-framerate,track=LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:13.764Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=2,type=JVB] setting max height=2160,encodings=[{"active":true,"adaptivePtime":false,"maxBitrate":1200000,"networkPriority":"low","priority":"low","scalabilityMode":"L3T3_KEY","scaleResolutionDownBy":1},{"active":false,"adaptivePtime":false,"maxBitrate":0,"networkPriority":"low","priority":"low"},{"active":false,"adaptivePtime":false,"maxBitrate":0,"networkPriority":"low","priority":"low"}]
Logger.js:155 2024-04-17T13:29:13.764Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=1,type=P2P] Setting degradation preference [preference=maintain-framerate,track=LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:13.764Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=1,type=P2P] setting max height=2160,encodings=[{"active":true,"adaptivePtime":false,"maxBitrate":1200000,"networkPriority":"low","priority":"low","scaleResolutionDownBy":1}]
Logger.js:155 2024-04-17T13:29:13.766Z [modules/xmpp/JingleSessionPC.js] <ya.sendIceCandidate>:  JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] sendIceCandidate: last candidate
Logger.js:155 2024-04-17T13:29:13.772Z [modules/RTC/TPCUtils.js] <yl.setMediaTransferActive>:  TPC[id=2,type=JVB] Suspending media transfer.
Logger.js:155 2024-04-17T13:29:13.772Z [JitsiConference.js] Suspended media transfer over the JVB connection !
Logger.js:155 2024-04-17T13:29:13.835Z [modules/xmpp/JingleSessionPC.js] <ya.setSenderVideoConstraint>:  JingleSessionPC[session=JVB,initiator=false,sid=cadesqmvo7jq3] setSenderVideoConstraint: 2160, sourceName: 0c594c68-v0
Logger.js:155 2024-04-17T13:29:13.835Z [modules/xmpp/JingleSessionPC.js] <ya.setSenderVideoConstraint>:  JingleSessionPC[session=P2P,initiator=true,sid=a5499c5280dd] setSenderVideoConstraint: 2160, sourceName: 0c594c68-v0
Logger.js:155 2024-04-17T13:29:13.835Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=1,type=P2P] Setting degradation preference [preference=maintain-framerate,track=LocalTrack[2,video]
Logger.js:155 2024-04-17T13:29:13.836Z [modules/RTC/TraceablePeerConnection.js] <Tl._updateVideoSenderEncodings>:  TPC[id=1,type=P2P] setting max height=2160,encodings=[{"active":true,"adaptivePtime":false,"maxBitrate":1200000,"networkPriority":"low","priority":"low","scaleResolutionDownBy":1}]
BridgeChannel.js:92 WebSocket connection to 'wss://url/colibri-ws/10.244.8.12/e615b440c76e4752/0c594c68?pwd=5oo87muk0ppjcmqpbm1gkruca4' failed: 
_initWebSocket @ BridgeChannel.js:92
Logger.js:155 2024-04-17T13:29:14.201Z [modules/RTC/BridgeChannel.js] <e.onclose>:  Channel closed: 1006 
r @ Logger.js:155
BridgeChannel.js:92 WebSocket connection to 'wss://url/colibri-ws/10.244.8.12/e615b440c76e4752/0c594c68?pwd=5oo87muk0ppjcmqpbm1gkruca4' failed: 
_initWebSocket @ BridgeChannel.js:92
Logger.js:155 2024-04-17T13:29:16.181Z [modules/RTC/BridgeChannel.js] <e.onclose>:  Channel closed: 1006 
r @ Logger.js:155
BridgeChannel.js:92 WebSocket connection to 'wss://url/colibri-ws/10.244.8.12/e615b440c76e4752/0c594c68?pwd=5oo87muk0ppjcmqpbm1gkruca4' failed: 
_initWebSocket @ BridgeChannel.js:92
Logger.js:155 2024-04-17T13:29:20.177Z [modules/RTC/BridgeChannel.js] <e.onclose>:  Channel closed: 1006 
spijet commented 5 months ago

Your values seem to be OK, apart from `jvb.octo.enabled: true` (which should work, but is not thoroughly tested at the moment). To be on the safe side, I'd try scaling JVB back to a single replica with OCTO disabled and editing the `publicIPs` list accordingly.
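For reference, the suggested debugging setup might look roughly like this in the chart values (a sketch: `octo.enabled` and `publicIPs` are mentioned above; the `replicaCount` key and the example IP are assumptions — check your chart version's `values.yaml`):

```yaml
jvb:
  replicaCount: 1      # single JVB instance while debugging
  octo:
    enabled: false     # disable OCTO until the WebSocket issue is sorted out
  publicIPs:
    - 203.0.113.10     # placeholder: public IP of the remaining JVB node
```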

As for that WSS error, close code 1006 indicates an abnormal closure: the connection dropped without a proper close frame being exchanged. Just in case, can you check whether you can reach the URL mentioned in the error (with `wss://` replaced by `https://`) in your browser, while keeping the test room open in another tab? That should give us more information about the problem.

Another possible cause is that the Azure ingress controller needs some special configuration in order to pass WSS connections through. I haven't worked with AKS yet, so I don't know if that's the case here, but it can't hurt to check.
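One AKS-specific thing worth checking: the Application Gateway Ingress Controller proxies WebSocket traffic by default, but its backend request timeout (30 seconds by default) can tear down long-lived connections like colibri-ws. A hedged sketch of the relevant annotation (the annotation name is from the AGIC documentation; the value is just an example):

```yaml
ingress:
  annotations:
    # AGIC defaults to a 30 s backend request timeout; long-lived
    # colibri-ws connections need considerably more.
    appgw.ingress.kubernetes.io/request-timeout: "7200"
```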

Bananenbrot1 commented 5 months ago

Hey @spijet,

I wanted to update you about the Octo configuration—we conducted a load test yesterday evening with 110 users across two video bridges, and it worked flawlessly!

The HTTPS link is accessible, so no issues there. Regarding AKS, I initially thought that might be a problem area, but I haven’t found anything conclusive yet. I'll continue to investigate.

Thanks so much for all your help!

Bananenbrot1 commented 5 months ago

I conducted further testing, and it turns out the 360p issue was due to Jitsi automatically scaling down the video for conferences with more than three users, which makes sense.

@spijet, I also tested with three video bridges and three participants. Each participant was connected to a different bridge, and the conference quality was very good. I then simulated a failure by shutting down two bridges, and all users were seamlessly reassigned to the remaining active bridge. Everything worked perfectly.

I'm going to close this ticket now. Thanks for all the help! If I find a solution for the WebSocket issue, I'll update this thread.

spijet commented 5 months ago

turns out the 360p issue was due to Jitsi automatically scaling down the video for conferences with more than three users

I hadn't heard about that until today. I assume you tested it in tile view? If you switch to the default layout (one big talking-head video plus a strip of smaller ones), the "big" video should be able to scale up to 720p, while the rest will be limited to 180p.

Anyway, glad to hear that OCTO works flawlessly for you! :)