Closed radmanz closed 5 years ago
How did you resolve this @radmanz?
Hey @stancl — I found that this second request being created (as described above) should have been going to the websocket service (php artisan websockets:serve).
To get everything working at the tenant level, note that the connections.pusher.options.host and .port settings must point to the websockets server spawned by artisan; in my case that was 127.0.0.1:6001. If you are running a single instance of the websockets server via Kubernetes or similar, you can use one server for multiple tenants if configured correctly. connections.pusher.options.secret, .key, and .app_id
are also unique to the tenant; set these values within your broadcasting code. I stored them in the tenant's custom properties. This makes sure that all messages sent from the tenant's Laravel instance to the websocket service carry the appropriate broadcast destination (the tenant's users).

Hello, not sure if I should re-open this issue or create a new one, but I am having a bit of trouble getting broadcasting with Pusher working on tenants.
I have the central app's Pusher config set via ENV on the deployment config, with proper entries in the config file:
'connections' => [
    'pusher' => [
        'driver' => 'pusher',
        'key' => env('PUSHER_APP_KEY'),
        'secret' => env('PUSHER_APP_SECRET'),
        'app_id' => env('PUSHER_APP_ID'),
        'options' => [
            'cluster' => env('PUSHER_APP_CLUSTER'),
            'useTLS' => true,
        ],
    ],
],
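For reference, these entries are populated from the deployment's environment; the corresponding .env keys look like this (all values here are placeholders):

```
PUSHER_APP_ID=xxx
PUSHER_APP_KEY=xxx
PUSHER_APP_SECRET=xxx
PUSHER_APP_CLUSTER=mt1
```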
I am using Laravel Echo and I am sure the correct values are being set up:
Echo = new window.LaravelEcho({
    broadcaster: 'pusher',
    key: '{{ config('broadcasting.connections.pusher.key') }}',
    cluster: '{{ config('broadcasting.connections.pusher.options.cluster') }}',
    forceTLS: true
});
The first issue was being redirected to /app when the Pusher client called /broadcasting/auth. I solved that by passing the tenancy middleware to the Broadcast::routes call in my service provider:
Broadcast::routes(['middleware' => ['tenancy','web']]);
That seems to resolve the authentication issue for any tenants using the default pusher config.
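For anyone landing here later, the full provider method would look roughly like this — a sketch of a standard Laravel BroadcastServiceProvider with the tenancy middleware added as above:

```php
<?php

namespace App\Providers;

use Illuminate\Support\Facades\Broadcast;
use Illuminate\Support\ServiceProvider;

class BroadcastServiceProvider extends ServiceProvider
{
    public function boot()
    {
        // Run the tenancy middleware before the web stack so the broadcast
        // auth endpoint resolves channel authorization in the tenant context.
        Broadcast::routes(['middleware' => ['tenancy', 'web']]);

        require base_path('routes/channels.php');
    }
}
```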
However, each tenant should have their own Pusher config, so following the above I set up the storage_to_config_map like so:
'storage_to_config_map' => [ // Used by the TenantConfig feature
    'app_name' => 'app.name',
    'pusher_key' => 'broadcasting.connections.pusher.key',
    'pusher_secret' => 'broadcasting.connections.pusher.secret',
    'pusher_app_id' => 'broadcasting.connections.pusher.app_id',
    'pusher_cluster' => 'broadcasting.connections.pusher.options.cluster',
],
I have two 'tenants':
[Tenant] id: 9653e0fd-3147-465c-98c4-bfa7e12ace8d @ bar.localhost
[Tenant] id: fd20fafc-ef76-48f3-b086-21ccd7c19030 @ foo.localhost
and then set the values via (for testing):
tenancy()->findByDomain('foo.localhost')->put(['pusher_key'=>'xxx','pusher_secret'=>'xxx','pusher_app_id'=>'xxx','pusher_cluster'=>'us3']);
which (as I understand it) is set correctly:
>>> tenancy()->findByDomain('foo.localhost');
=> Stancl\Tenancy\Tenant {#3576
+data: [
"id" => "fd20fafc-ef76-48f3-b086-21ccd7c19030",
"plan" => "free",
"is_trial" => 0,
"trial_start" => null,
"owner" => 0,
"pusher_key" => "xxx",
"pusher_app_id" => "xxx",
"pusher_secret" => "xxx",
"pusher_cluster" => "us3",
],
+domains: [
"foo.localhost",
],
+persisted: true,
}
>>>
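Assuming the TenantConfig feature applies the storage_to_config_map when tenancy initializes, a quick tinker sanity check would look like this (tenancy()->init() is taken from this package's v2 API; verify against the docs):

```php
// In php artisan tinker: initialize tenancy for the tenant's domain,
// then read the mapped config value. With the map above, this should
// return the tenant's stored 'pusher_key' ('xxx' in this test data)
// rather than the central PUSHER_APP_KEY.
tenancy()->init('foo.localhost');
config('broadcasting.connections.pusher.key');
```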
First issue: I have a command to test sending the notification. When I call it via:
#send notification to user#1 on foo.localhost
php artisan tenant:run user:test-notification --tenants=fd20fafc-ef76-48f3-b086-21ccd7c19030 --argument="user_id=1" --argument="via=broadcast"
the notification is sent via the default (central) Pusher config, and the browser window on bar.localhost gets the notification.
Second issue: the Echo Pusher client connection seems to open properly to the correct cluster using the correct key, but the client fails to authenticate (for private channels). It seems /broadcasting/auth is returning the incorrect key in its response:
{"auth":"app-key-from-central-config:xxxx"}
I am not sure if this is an issue with this package or with the way broadcasting is implemented. I did some digging into the BroadcastManager code and the Pusher broadcast driver; it seems they load the config once in the service provider, not after the middleware has switched to the tenant environment.
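That diagnosis matches how Laravel manager classes work: BroadcastManager caches resolved drivers internally, so a Pusher connection built before tenancy initialized keeps the central credentials. One possible workaround (a sketch, not a tested fix) is to drop the cached singletons after tenancy bootstraps, so the next broadcast rebuilds the driver from the now-tenant-aware config:

```php
use Illuminate\Broadcasting\BroadcastManager;
use Illuminate\Contracts\Broadcasting\Broadcaster;

// e.g. in a listener for the tenancy-initialized event, or right after
// initializing tenancy manually: forget the cached instances so the
// next resolution builds a fresh Pusher driver from tenant config.
app()->forgetInstance(BroadcastManager::class);
app()->forgetInstance(Broadcaster::class);
```

Note this only covers broadcasts dispatched in the current process; queued broadcast jobs run in a worker, which would need the same treatment after it switches tenants.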
Hi,
Regarding the first issue. Could you try making it an explicitly tenant-aware command?
Regarding the second issue, I can't really help much because I've never used broadcasting, so it would take me a long dig through that code, but it does seem like the config is loaded once, before tenancy is initialized (most likely a dependency is injected while the app is still in the central environment). It might be useful to look at the BroadcastServiceProvider (https://github.com/laravel/framework/blob/6.x/src/Illuminate/Broadcasting/BroadcastServiceProvider.php#L24) and see what is persisted, like the $drivers property on BroadcastManager.
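A minimal sketch of what "explicitly tenant-aware" could mean here (the command name, signature, and the tenancy()->find()/initialize() calls reflect my reading of the v2 API and should be verified against the docs):

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class TestNotification extends Command
{
    // Hypothetical signature; adapt to the real command.
    protected $signature = 'user:test-notification {--tenant=} {--user_id=} {--via=broadcast}';

    public function handle()
    {
        // Switch into the tenant context *inside* the command, so anything
        // resolved afterwards (including broadcasting) sees tenant config.
        $tenant = tenancy()->find($this->option('tenant'));
        tenancy()->initialize($tenant);

        // ... send the notification to the given user here ...
    }
}
```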
Hi @stancl I've implemented the command as a tenant-aware command and it seems the issue still persists.
I think the issue is going to be in the way the BroadcastServiceProvider persists the config. Not sure how to move forward or how feasible it would be to make something work here.
@wammy21 did you solve this issue?
I'm still having the same problem.
Tenancy v4 has a dedicated bootstrapper for broadcasting tenancy.
I am currently blocked on how to resolve a Pusher broadcast exception I am receiving. We use laravel-websockets to push messages to users. When creating a broadcast event, a TenantCouldNotBeIdentifiedException is raised (see below). It's almost as if the broadcast event is executed and a request is generated internally, causing the tenant domain to be lost, hence reporting 127.0.0.1 as the tenant? Does anyone have any ideas as to how this can be resolved? Many thanks in advance 😎
The stack trace is as follows.