jitsi / jibri

Jitsi Broadcasting Infrastructure
Apache License 2.0

recording failed to start #116

Open fuqiangleon opened 6 years ago

fuqiangleon commented 6 years ago

@aaronkvanmeerten When I click the [start recording] button in jitsi-meet, it displays [Recording failed to start]. Could you check the logs and config of my Jibri service?

jibri logs:

2018-05-24 17:24:41.645 INFO: [1] org.jitsi.jibri.Main.main() Using config file /etc/jitsi/jibri/config.json
2018-05-24 17:24:42.210 INFO: [1] org.jitsi.jibri.Main.loadConfig() Parsed config: JibriConfig(recordingDirectory=/tmp/recordings, finalizeRecordingScriptPath=/path/to/finalize_recording.sh, xmppEnvironments=[XmppEnvironmentConfig(name=prod onestarar.cloud, xmppServerHosts=[127.0.0.1], xmppDomain=onestarar.cloud, controlLogin=XmppCredentials(domain=auth.onestarar.cloud, username=jibri, password=jibriauthpass), controlMuc=XmppMuc(domain=conference.onestarar.cloud, roomName=2017, nickname=jibri-nickname), sipControlMuc=null, callLogin=XmppCredentials(domain=recorder.onestarar.cloud, username=recorder, password=jibrirecorderpass), stripFromRoomDomain=conference., usageTimeoutMins=0, trustAllXmppCerts=true)])
2018-05-24 17:24:42.250:INFO::main: Logging initialized @811ms
2018-05-24 17:24:42.279:WARN:oejsh.ContextHandler:main: o.e.j.s.ServletContextHandler@1603cd68{/,null,null} contextPath ends with /
2018-05-24 17:24:42.279:WARN:oejsh.ContextHandler:main: Empty contextPath
2018-05-24 17:24:42.284:INFO:oejs.Server:main: jetty-9.2.z-SNAPSHOT
2018-05-24 17:24:42.573 WARNING: [1] org.glassfish.jersey.internal.inject.Providers.checkProviderRuntime() A provider org.jitsi.jibri.api.http.internal.InternalHttpApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.jitsi.jibri.api.http.internal.InternalHttpApi will be ignored.
2018-05-24 17:24:42.675:INFO:oejsh.ContextHandler:main: Started o.e.j.s.ServletContextHandler@1603cd68{/,null,AVAILABLE}
2018-05-24 17:24:42.684:INFO:oejs.ServerConnector:main: Started ServerConnector@36bef0f1{HTTP/1.1}{0.0.0.0:3333}
2018-05-24 17:24:42.684:INFO:oejs.Server:main: Started @1246ms
2018-05-24 17:24:42.791 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.start() Connecting to xmpp environment on 127.0.0.1 with config XmppEnvironmentConfig(name=prod onestarar.cloud, xmppServerHosts=[127.0.0.1], xmppDomain=onestarar.cloud, controlLogin=XmppCredentials(domain=auth.onestarar.cloud, username=jibri, password=jibriauthpass), controlMuc=XmppMuc(domain=conference.onestarar.cloud, roomName=2017, nickname=jibri-nickname), sipControlMuc=null, callLogin=XmppCredentials(domain=recorder.onestarar.cloud, username=recorder, password=jibrirecorderpass), stripFromRoomDomain=conference., usageTimeoutMins=0, trustAllXmppCerts=true)
2018-05-24 17:24:42.796 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.start() The trustAllXmppCerts config is enabled for this domain, all XMPP server provided certificates will be accepted
2018-05-24 17:24:42.964 INFO: [1] class org.jitsi.xmpp.mucclient.MucClient.connected() Xmpp connection status [auth.onestarar.cloud]: connected
2018-05-24 17:24:42.985 INFO: [1] class org.jitsi.xmpp.mucclient.MucClient.authenticated() Xmpp connection status [auth.onestarar.cloud]: authenticated
2018-05-24 17:24:42.999 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.invoke() Jibri reports its status is now idle, publishing presence to connection prod onestarar.cloud
2018-05-24 17:24:43.001:WARN:oejsh.ContextHandler:main: o.e.j.s.ServletContextHandler@53692008{/,null,null} contextPath ends with /
2018-05-24 17:24:43.001:WARN:oejsh.ContextHandler:main: Empty contextPath
2018-05-24 17:24:43.001:INFO:oejs.Server:main: jetty-9.2.z-SNAPSHOT
2018-05-24 17:24:43.046 WARNING: [1] org.glassfish.jersey.internal.inject.Providers.checkProviderRuntime() A provider org.jitsi.jibri.api.http.HttpApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.jitsi.jibri.api.http.HttpApi will be ignored.
2018-05-24 17:24:43.077:INFO:oejsh.ContextHandler:main: Started o.e.j.s.ServletContextHandler@53692008{/,null,AVAILABLE}
2018-05-24 17:24:43.078:INFO:oejs.ServerConnector:main: Started ServerConnector@5f13be1{HTTP/1.1}{0.0.0.0:2222}
2018-05-24 17:24:43.078:INFO:oejs.Server:main: Started @1640ms
2018-05-24 17:25:02.388:WARN:oejh.HttpParser:qtp466577384-35: badMessage: 400 No URI for HttpChannelOverHttp@1271d2f0{r=0,c=false,a=IDLE,uri=-}
2018-05-24 17:25:02.397:WARN:oejh.HttpParser:qtp466577384-37: badMessage: 400 No URI for HttpChannelOverHttp@73129148{r=0,c=false,a=IDLE,uri=-}
2018-05-24 17:25:02.665:WARN:oejh.HttpParser:qtp466577384-36: badMessage: 400 No URI for HttpChannelOverHttp@41c875a3{r=0,c=false,a=IDLE,uri=-}
[the same "badMessage: 400 No URI" warning repeats roughly every 100 ms through 17:25:04.274]

Jibri config below:

{
    // NOTE: this is a SAMPLE config file, it will need to be configured with
    //  values from your environment

// Where recording files should be temporarily stored
"recording_directory":"/tmp/recordings",
// The path to the script which will be run on completed recordings
"finalize_recording_script_path": "/path/to/finalize_recording.sh",
"xmpp_environments": [
    {
        // A friendly name for this environment which can be used
        //  for logging, stats, etc.
        "name": "prod environment",
        // The hosts of the XMPP servers to connect to as part of
        //  this environment
        "xmpp_server_hosts": [
            "onestarar.cloud"
        ],
        // The xmpp domain we'll connect to on the XMPP server
        "xmpp_domain": "onestarar.cloud",
        // Jibri will login to the xmpp server as a privileged user 
        "control_login": {
            // The domain to use for logging in
            "domain": "auth.onestarar.cloud",
            // The credentials for logging in
            "username": "jibri",
            "password": "jibriauthpass"
        },
        // Using the control_login information above, Jibri will join 
        //  a control muc as a means of announcing its availability 
        //  to provide services for a given environment
        "control_muc": {
            "domain": "conference.onestarar.cloud",
            "room_name": "2017",
            "nickname": "jibri-nickname"
        },
        // All participants in a call join a muc so they can exchange
        //  information.  Jibri can be instructed to join a special muc
        //  with credentials to give it special abilities (e.g. not being
        //  displayed to other users like a normal participant)
        "call_login": {
            "domain": "recorder.onestarar.cloud",
            "username": "recorder",
            "password": "jibrirecorderpass"
        },
        // When jibri gets a request to start a service for a room, the room
        //  jid will look like:
        //  roomName@optional.prefixes.subdomain.xmpp_domain
        // We'll build the url for the call by transforming that into:
        //  https://xmpp_domain/subdomain/roomName
        // So if there are any prefixes in the jid (like jitsi meet, which
        //  has its participants join a muc at conference.xmpp_domain) then
        //  list that prefix here so it can be stripped out to generate
        //  the call url correctly
        "room_jid_domain_string_to_strip_from_start": "conference.",
        // The amount of time, in minutes, a service is allowed to continue.
        //  Once a service has been running for this long, it will be
        //  stopped (cleanly).  A value of 0 means an indefinite amount
        //  of time is allowed
        "usage_timeout": "0"
    },
    {
        // Another environment config like the above
    }
]

}
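
For reference, the room_jid_domain_string_to_strip_from_start transform described in the comments above works out like this: a room JID of myroom@conference.onestarar.cloud, with "conference." stripped, yields the call URL https://onestarar.cloud/myroom. The control_login and call_login blocks also have to match real accounts on the XMPP server; a minimal sketch of creating them, assuming a Prosody-based install with the domains from this config (other servers need the equivalent admin steps):

# Hypothetical account setup on Prosody, using the domains/passwords from the config above
prosodyctl register jibri auth.onestarar.cloud jibriauthpass
prosodyctl register recorder recorder.onestarar.cloud jibrirecorderpass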

aaronkvanmeerten commented 6 years ago

Try removing the second blank configuration from the config.json:

{
    // Another environment config like the above
}
nightstryke commented 6 years ago

@aaronkvanmeerten He might also want to remove the comma before it:

, { // Another environment config like the above }

@fuqiangleon Change your "prod environment" to something like "onestarar-cloud". Change your "xmpp_server_hosts" to your server IP, e.g. "192.168.2.1". Your control MUC should be changed from "domain": "conference.onestarar.cloud" to "domain": "internal.auth.onestarar.cloud". A corrected entry is sketched below.
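
Putting those suggestions together, the environment entry would look roughly like this (the IP is a placeholder; credentials and MUC settings carried over from the config above):

{
    "name": "onestarar-cloud",
    "xmpp_server_hosts": [
        "192.168.2.1"
    ],
    "xmpp_domain": "onestarar.cloud",
    "control_login": {
        "domain": "auth.onestarar.cloud",
        "username": "jibri",
        "password": "jibriauthpass"
    },
    "control_muc": {
        "domain": "internal.auth.onestarar.cloud",
        "room_name": "2017",
        "nickname": "jibri-nickname"
    }
    // call_login and the remaining keys stay as in the original config
}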

fuqiangleon commented 6 years ago

@aaronkvanmeerten @nightstryke Thanks for your help.

I have updated the config as below:

        // The hosts of the XMPP servers to connect to as part of
        //  this environment
        "xmpp_server_hosts": [
            "onestar.cloud"
        ],
        // The xmpp domain we'll connect to on the XMPP server
        "xmpp_domain": "onestar.cloud",
        // Jibri will login to the xmpp server as a privileged user
        "control_login": {
            // The domain to use for logging in
            "domain": "auth.onestar.cloud",
            // The credentials for logging in
            "username": "jibri",
            "password": "jibriauthpass"
        },
        // Using the control_login information above, Jibri will join
        //  a control muc as a means of announcing its availability
        //  to provide services for a given environment
        "control_muc": {
            "domain": "internal.auth.onestar.cloud",
            "room_name": "2017",
            "nickname": "jibri-nickname"
        },
        // All participants in a call join a muc so they can exchange
        //  information.  Jibri can be instructed to join a special muc
        //  with credentials to give it special abilities (e.g. not being
        //  displayed to other users like a normal participant)
        "call_login": {
            "domain": "recorder.onestar.cloud",
            "username": "recorder",
            "password": "jibrirecorderpass"
        },
        // When jibri gets a request to start a service for a room, the room
        //  jid will look like:
        //  roomName@optional.prefixes.subdomain.xmpp_domain
        // We'll build the url for the call by transforming that into:
        //  https://xmpp_domain/subdomain/roomName
        // So if there are any prefixes in the jid (like jitsi meet, which
        //  has its participants join a muc at conference.xmpp_domain) then
        //  list that prefix here so it can be stripped out to generate
        //  the call url correctly
        "room_jid_domain_string_to_strip_from_start": "conference.",
        // The amount of time, in minutes, a service is allowed to continue.
        //  Once a service has been running for this long, it will be
        //  stopped (cleanly).  A value of 0 means an indefinite amount
        //  of time is allowed
        "usage_timeout": "0"
    }
]

}

But it failed again when I started it with the service command:

sudo systemctl status jibri.service

● jibri.service - Jibri Process
   Loaded: loaded (/etc/systemd/system/jibri.service; disabled; vendor preset: enabled)
   Active: inactive (dead)

May 25 13:44:18 ubuntu graceful_shutdown.sh[13762]: % Total % Received % Xferd Average Speed Time Time Time Current
May 25 13:44:18 ubuntu graceful_shutdown.sh[13762]: Dload Upload Total Spent Left Speed
May 25 13:44:18 ubuntu graceful_shutdown.sh[13762]: [149B blob data]
May 25 13:44:18 ubuntu systemd[1]: jibri.service: Control process exited, code=exited status=7
May 25 13:44:18 ubuntu systemd[1]: jibri.service: Unit entered failed state.
May 25 13:44:18 ubuntu systemd[1]: jibri.service: Failed with result 'exit-code'.
May 25 13:44:18 ubuntu systemd[1]: jibri.service: Service hold-off time over, scheduling restart.
May 25 13:44:18 ubuntu systemd[1]: Stopped Jibri Process.
May 25 13:44:18 ubuntu systemd[1]: jibri.service: Start request repeated too quickly.
May 25 13:44:18 ubuntu systemd[1]: Failed to start Jibri Process.
ubuntu@ubuntu:/etc/jitsi/jibri$
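
Note: status=7 here is most likely curl's exit code 7 ("failed to connect"), propagated from the curl call inside graceful_shutdown.sh, which would mean the script could not reach Jibri's HTTP API. A quick sanity check (ports taken from the logs above; the health endpoint path follows Jibri's HTTP API and may differ by version):

# Is anything listening on Jibri's internal (3333) and external (2222) HTTP ports?
sudo ss -tlnp | grep -E ':(2222|3333)'
# A running Jibri should answer the external API's health endpoint
curl -v http://localhost:2222/jibri/api/v1.0/health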

I have tried to start it with the launch script; the logs are below:

ubuntu@ubuntu:/etc/jitsi/jibri$ sudo /opt/jitsi/jibri/launch.sh
2018-05-25 13:45:32.982 INFO: [1] org.jitsi.jibri.Main.main() Using config file /etc/jitsi/jibri/config.json
2018-05-25 13:45:33.561 INFO: [1] org.jitsi.jibri.Main.loadConfig() Parsed config: JibriConfig(recordingDirectory=/tmp/recordings, finalizeRecordingScriptPath=/path/to/finalize_recording.sh, xmppEnvironments=[XmppEnvironmentConfig(name=prod, xmppServerHosts=[onestar.cloud], xmppDomain=onestar.cloud, controlLogin=XmppCredentials(domain=auth.onestar.cloud, username=jibri, password=jibriauthpass), controlMuc=XmppMuc(domain=internal.auth.onestar.cloud, roomName=2017, nickname=jibri-nickname), sipControlMuc=null, callLogin=XmppCredentials(domain=recorder.onestar.cloud, username=recorder, password=jibrirecorderpass), stripFromRoomDomain=conference., usageTimeoutMins=0, trustAllXmppCerts=true)])
2018-05-25 13:45:33.599:INFO::main: Logging initialized @861ms
2018-05-25 13:45:33.626:WARN:oejsh.ContextHandler:main: o.e.j.s.ServletContextHandler@1603cd68{/,null,null} contextPath ends with /
2018-05-25 13:45:33.626:WARN:oejsh.ContextHandler:main: Empty contextPath
2018-05-25 13:45:33.629:INFO:oejs.Server:main: jetty-9.2.z-SNAPSHOT
2018-05-25 13:45:33.910 WARNING: [1] org.glassfish.jersey.internal.inject.Providers.checkProviderRuntime() A provider org.jitsi.jibri.api.http.internal.InternalHttpApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.jitsi.jibri.api.http.internal.InternalHttpApi will be ignored.
2018-05-25 13:45:33.999:INFO:oejsh.ContextHandler:main: Started o.e.j.s.ServletContextHandler@1603cd68{/,null,AVAILABLE}
2018-05-25 13:45:34.007:INFO:oejs.ServerConnector:main: Started ServerConnector@438bad7c{HTTP/1.1}{0.0.0.0:3333}
2018-05-25 13:45:34.008:INFO:oejs.Server:main: Started @1271ms
2018-05-25 13:45:34.101 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.start() Connecting to xmpp environment on onestar.cloud with config XmppEnvironmentConfig(name=prod, xmppServerHosts=[onestar.cloud], xmppDomain=onestar.cloud, controlLogin=XmppCredentials(domain=auth.onestar.cloud, username=jibri, password=jibriauthpass), controlMuc=XmppMuc(domain=internal.auth.onestar.cloud, roomName=2017, nickname=jibri-nickname), sipControlMuc=null, callLogin=XmppCredentials(domain=recorder.onestar.cloud, username=recorder, password=jibrirecorderpass), stripFromRoomDomain=conference., usageTimeoutMins=0, trustAllXmppCerts=true)
2018-05-25 13:45:34.107 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.start() The trustAllXmppCerts config is enabled for this domain, all XMPP server provided certificates will be accepted
2018-05-25 13:45:34.268 INFO: [1] class org.jitsi.xmpp.mucclient.MucClient.connected() Xmpp connection status [auth.onestar.cloud]: connected
2018-05-25 13:45:34.290 INFO: [1] class org.jitsi.xmpp.mucclient.MucClient.authenticated() Xmpp connection status [auth.onestar.cloud]: authenticated
2018-05-25 13:45:34.303 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.invoke() Jibri reports its status is now idle, publishing presence to connection prod
2018-05-25 13:45:34.304:WARN:oejsh.ContextHandler:main: o.e.j.s.ServletContextHandler@53692008{/,null,null} contextPath ends with /
2018-05-25 13:45:34.305:WARN:oejsh.ContextHandler:main: Empty contextPath
2018-05-25 13:45:34.305:INFO:oejs.Server:main: jetty-9.2.z-SNAPSHOT
2018-05-25 13:45:34.332 WARNING: [1] org.glassfish.jersey.internal.inject.Providers.checkProviderRuntime() A provider org.jitsi.jibri.api.http.HttpApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.jitsi.jibri.api.http.HttpApi will be ignored.
2018-05-25 13:45:34.356:INFO:oejsh.ContextHandler:main: Started o.e.j.s.ServletContextHandler@53692008{/,null,AVAILABLE}
2018-05-25 13:45:34.356:INFO:oejs.ServerConnector:main: Started ServerConnector@3496655e{HTTP/1.1}{0.0.0.0:2222}
2018-05-25 13:45:34.357:INFO:oejs.Server:main: Started @1620ms
2018-05-25 13:46:05.251:WARN:oejh.HttpParser:qtp466577384-35: badMessage: 400 No URI for HttpChannelOverHttp@332060d5{r=0,c=false,a=IDLE,uri=-}
2018-05-25 13:46:05.468:WARN:oejh.HttpParser:qtp466577384-37: badMessage: 400 No URI for HttpChannelOverHttp@6d7efcc0{r=0,c=false,a=IDLE,uri=-}
2018-05-25 13:46:05.577:WARN:oejh.HttpParser:qtp466577384-36: badMessage: 400 No URI for HttpChannelOverHttp@5f7d1e9e{r=0,c=false,a=IDLE,uri=-}
[the same "badMessage: 400 No URI" warning repeats roughly every 100 ms through 13:46:06.938]

Could you check it again? Is there any mistake there?

bbaldino commented 6 years ago

Do you have an HTTP client sending requests to Jibri?

https://stackoverflow.com/questions/26898691/jetty-badmessage-400-no-host-for-httpchanneloverhttp

fuqiangleon commented 6 years ago

@bbaldino Sorry, I don't understand. How do I test it?

bbaldino commented 6 years ago

Nothing to test; I'm asking whether there is something sending requests to Jibri's HTTP API. The link I found with that error message talks about malformed HTTP requests causing Jetty to print that message.
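
For illustration, a hypothetical way to provoke that family of Jetty warning by hand is a request line with no URI (port 2222 is Jibri's external HTTP port from the logs above):

# Hypothetical repro: send a request line that has a method but no URI
printf 'GET\r\n\r\n' | nc localhost 2222

Monitoring probes, load-balancer health checks, or TLS clients pointed at the plain-HTTP port are typical real-world sources of this kind of noise; per the link above, the warnings come from whatever is connecting, not from Jibri itself.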

fuqiangleon commented 6 years ago

Got it, @bbaldino. On the other hand, what can I do next? ;)

fuqiangleon commented 6 years ago

@nightstryke @aaronkvanmeerten Any other ideas? There are some errors in the Chrome console when I click the [start recording] button:

Stream ID is empty or undefined
vincentwlau commented 6 years ago

@bbaldino I was using Prosody as the XMPP server and Jibri was working fine. Then I switched to Openfire, with an updated config.json in Jibri and sip-communicator.properties in Jicofo. The web client got "Recording unavailable", but the Jibri log seemed to indicate that everything was fine:

2018-07-12 01:22:16.524 INFO: [1] org.jitsi.jibri.Main.main() Using config file /etc/jitsi/jibri/config.json
2018-07-12 01:22:17.752 INFO: [1] org.jitsi.jibri.Main.loadConfig() Parsed config: JibriConfig(recordingDirectory=/opt/jibri/recordings, finalizeRecordingScriptPath=/opt/jibri/uploader.sh, internal_http_api_port=3333, external_http_api_port=2222, xmppEnvironments=[XmppEnvironmentConfig(name=POC, xmppServerHosts=[jitsi-w2], xmppDomain=jitsi2.magnet.com, controlLogin=XmppCredentials(domain=jitsi2.magnet.com, username=jibri, password=jibriauthpass), controlMuc=XmppMuc(domain=jibricontrolmuc.jitsi2.magnet.com, roomName=JibriBrewery, nickname=jibri-nickname), sipControlMuc=null, callLogin=XmppCredentials(domain=recorder.jitsi2.magnet.com, username=recorder, password=jibrirecorderpass), stripFromRoomDomain=conference., usageTimeoutMins=0, trustAllXmppCerts=true)])
2018-07-12 01:22:18.726 WARNING: [1] org.glassfish.jersey.internal.inject.Providers.checkProviderRuntime() A provider org.jitsi.jibri.api.http.internal.InternalHttpApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.jitsi.jibri.api.http.internal.InternalHttpApi will be ignored.
2018-07-12 01:22:19.399 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.start() Connecting to xmpp environment on jitsi-w2 with config XmppEnvironmentConfig(name=POC, xmppServerHosts=[jitsi-w2], xmppDomain=jitsi2.magnet.com, controlLogin=XmppCredentials(domain=jitsi2.magnet.com, username=jibri, password=jibriauthpass), controlMuc=XmppMuc(domain=jibricontrolmuc.jitsi2.magnet.com, roomName=JibriBrewery, nickname=jibri-nickname), sipControlMuc=null, callLogin=XmppCredentials(domain=recorder.jitsi2.magnet.com, username=recorder, password=jibrirecorderpass), stripFromRoomDomain=conference., usageTimeoutMins=0, trustAllXmppCerts=true)
2018-07-12 01:22:19.415 INFO: [1] org.jitsi.jibri.api.xmpp.XmppApi.start() The trustAllXmppCerts config is enabled for this domain, all XMPP server provided certificates will be accepted
2018-07-12 01:22:19.855 INFO: [1] class org.jitsi.xmpp.mucclient.MucClient.connected() Xmpp connection status [jitsi2.magnet.com]: connected
2018-07-12 01:22:20.039 INFO: [1] class org.jitsi.xmpp.mucclient.MucClient.authenticated() Xmpp connection status [jitsi2.magnet.com]: authenticated
2018-07-12 01:22:20.235 WARNING: [1] org.glassfish.jersey.internal.inject.Providers.checkProviderRuntime() A provider org.jitsi.jibri.api.http.HttpApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.jitsi.jibri.api.http.HttpApi will be ignored.

The Openfire Admin Console showed that the "jibri" user was online, a room "JibriBrewery" was created, and the room had "jibri" as a participant and "focus" as a moderator. But the web client did not have the recording state initialized, which might imply that the web client didn't receive some events. Are there any particular log messages that indicate Jibri is fully connected and Jicofo is fully aware of the Jibri instance? Also, Openfire does not support virtual hosts; is there a workaround for the hidden domain "recorder.{mydomain}" for the "recorder" user?

bbaldino commented 6 years ago

I don't have any experience with Openfire, but Jibri uses presence messages to post updates about its availability. Jicofo listens for those and then sends updates to the web client, so one thing to check would be the Jicofo logs, to see whether it's getting status updates from Jibri.
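
For example, on a stock Debian-style install where Jicofo logs to /var/log/jitsi/jicofo.log (the path is an assumption; adjust for your setup):

# Look for Jibri-related status updates in Jicofo's log
grep -i jibri /var/log/jitsi/jicofo.log | tail -n 20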

vincentwlau commented 6 years ago

@fuqiangleon The "Stream ID is empty or undefined" error means that your Jibri is not compatible with your Jicofo. If your Jicofo version is 1.0-405-1 (dated 2018-05-01), you have to use the Jibri built from commit e36a8407870b79e4bc9a44b94de19f027a94b2c1. Use "dpkg -l | grep jicofo" to get the Jicofo version. To build your Jibri:

git clone https://github.com/jitsi/jibri.git
cd jibri
git checkout e36a8407870b79e4bc9a44b94de19f027a94b2c1
mvn clean package
mv target/jibri-1.0-SNAPSHOT-jar-with-dependencies.jar jibri.jar

Replace the original /opt/jitsi/jibri/jibri.jar with your jibri.jar.
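
Then restart the service so the rebuilt jar is picked up:

sudo systemctl restart jibri
sudo systemctl status jibri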

vincentwlau commented 6 years ago

@bbaldino Thanks for the information.

vincentwlau commented 6 years ago

@bbaldino I ran into a problem with Jibri and Openfire after a user starts a conference. After Jibri posted a presence message announcing its availability to Jicofo, Jicofo sent a presence packet

<presence to='myroom@conference.mydomain/focus' from='focus@mydomain/focus1234343'...>
...<jibri-recording-status ... status='off'/>
</presence>

However, Openfire does not seem to route this presence packet anywhere. I couldn't find a definitive answer on how a MUC should handle a presence packet addressed to a full JID (i.e. myroom@conference.mydomain/focus). Should this presence packet be sent directly to the web client through the MUC, or should the "focus" user in "myroom" receive this packet first (the ChatRoomImpl class) and then update the web client? I'm trying to figure out whether it is a configuration issue in Openfire or a bug somewhere.

bbaldino commented 6 years ago

Jibri posts a presence to the control muc (which only jibri and jicofo are in), which Jicofo receives. Jicofo then parses bits out of that presence and puts them in its presence in the call muc (which jicofo and all the call participants are in).
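
Sketched as stanzas, with most attributes elided (the element and namespace names are assumptions based on the stanza quoted earlier in this thread; the exact shape varies by Jibri/Jicofo version):

<!-- hop 1 (assumed shape): Jibri announces its state in the control muc; only Jicofo sees this -->
<presence to='JibriBrewery@jibricontrolmuc.jitsi2.magnet.com/jibri-nickname'>
  <jibri-status xmlns='http://jitsi.org/protocol/jibri' status='idle'/>
</presence>

<!-- hop 2: Jicofo re-publishes that state in its own presence in the call muc -->
<presence from='focus@mydomain/focus1234343' to='myroom@conference.mydomain/focus'>
  <jibri-recording-status xmlns='http://jitsi.org/protocol/jibri' status='off'/>
</presence>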

vincentwlau commented 6 years ago

@bbaldino That's interesting. At the very beginning of a call, after Jicofo receives the presence from the control muc, Jicofo calls JibriRecorder.init(), which sends the extended presence (with the jibri-recording-status element) to the call muc with "to=myroom@conference.mydomain/focus". Should it be sent to "myroom@conference.mydomain" (bare JID) instead? If the recipient is meant to be the entire room (bare JID), Openfire may handle "myroom@conference.mydomain/focus" differently from Prosody. Can you confirm whether the recipient should be a full JID or a bare JID? Thank you for your insight.

bbaldino commented 6 years ago

Yeah, that does seem odd... I'm afraid I'm not much of an XMPP authority, though. @damencho any thoughts?

vincentwlau commented 6 years ago

@bbaldino I was wrong; the full JID in the presence packet is the correct target (the "/focus" is actually the nickname of the sender). It looks like Openfire dropped the extended presence packet and resent the last presence packet.

damencho commented 6 years ago

Maybe @guusdk can give more information about Openfire? :)

vincentwlau commented 6 years ago

I have posted this issue to the Openfire community. Thanks for the referral.

guusdk commented 6 years ago

Said issue was posted (and discussed) here: https://discourse.igniterealtime.org/t/muc-ignored-an-update-of-presence-but-resent-an-old-presence/82265/3

If I understand things correctly (I haven't tried to reproduce this), the client is sending two "join" presence stanzas. Openfire, unlike Prosody, assumes that the client somehow forgot that it already joined (this happens in the field). I'm looking into handling this better in Openfire, but I suspect that a more structural fix would be for the client to not include the MUC join element in presence stanzas that are not the initial join.
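
Concretely, per XEP-0045, only the initial join presence carries the MUC <x/> element; subsequent updates to the same occupant JID should omit it (the recording-status extension below is the one quoted earlier in this thread, with its namespace assumed):

<!-- initial join: includes the MUC join element -->
<presence to='myroom@conference.mydomain/focus'>
  <x xmlns='http://jabber.org/protocol/muc'/>
</presence>

<!-- later status update: same occupant JID, but no <x/> element -->
<presence to='myroom@conference.mydomain/focus'>
  <jibri-recording-status xmlns='http://jitsi.org/protocol/jibri' status='off'/>
</presence>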

fuqiangleon commented 6 years ago

@vincentwlau Thank you so much for your help. Have a nice day.