Closed Soviut closed 1 year ago
I set up my initial data in the subscription status handler. So I have a running subscription then fetch the initial data. Very simplified...
.subscribe((status) => {
if (status === "SUBSCRIBED") {
realtimeInitTable()
}
})
@GaryAustin1 I tried the following
.subscribe((status) => {
console.log(status)
})
Which immediately printed SUBSCRIBED. However, I was still able to submit my post before the subscription seemed to be picking up changes.
I even tried moving my fetchPosts() call inside the callback, but I can still race the subscription.
.subscribe((status) => {
fetchPosts()
})
I can confirm what appears to be a bug. I'll open a realtime issue with traces, but realtime-js reports subscribe success well before realtime comes back with a subscribed message on the websocket. That can take 2 to 3 seconds. The 2 to 3 seconds itself isn't my main concern, although that seems bad; the problem is that the .subscribe() handler says subscribed well before that.
@w3b6x9 In order not to miss updates I have used the .subscribe() callback message "SUBSCRIBED" to load the initial table data being monitored.
That is clearly not working now, and may never have actually reflected that the realtime data was ready to be reported, but I have no way to check now.
I'm doing something like this for the moment
mySubscription = SBclient.channel(......).on(.......).subscribe (//error handler)
mySubscription.socket.onMessage(message => {
console.log('MESSAGE', message)
if (message.event === 'system' && message.payload.message === 'subscribed to realtime') {
console.log('OK now we are really subscribed')
loadInitialTable()
}
})
Don't like testing for undocumented messages, but does this seem reasonable for now?
Hopefully the .subscribe() method will have a state that says realtime update tracking is ready as part of looking at this issue.
@GaryAustin1 awaiting the subscribe() should also suffice; it doesn't have to be a callback. However, in both cases, the async behaviour is finishing prematurely.
Hey everyone, sorry for taking so long to address this.
For the new Multi-Tenant Realtime cluster that offers Broadcast and Presence functionalities, we wanted clients to connect to the cluster immediately and have Postgres changes set up asynchronously, since that requires more time (it involves inserting into your realtime.subscription database table). Every Realtime channel can do one thing or everything (Broadcast, Presence, Postgres changes), so we didn't want Postgres changes to hold up the rest.
However, I do see the frustration of not being able to know exactly when Postgres changes is ready client-side, since it's no longer when the channel initially connects.
Let me discuss this further with my team and then we can continue the conversation here.
I'm doing something like this for the moment
mySubscription = SBclient.channel(......).on(.......).subscribe (//error handler)
mySubscription.socket.onMessage(message => {
console.log('MESSAGE', message)
if (message.event === 'system' && message.payload.message === 'subscribed to realtime') {
console.log('OK now we are really subscribed')
loadInitialTable()
}
})
Don't like testing for undocumented messages, but does this seem reasonable for now?
@GaryAustin1 That's reasonable for now but I wouldn't depend on that beyond the short-term.
@w3b6x9 Has this been fixed? Or is there a workaround for this problem? How can my front end know when the realtime subscription is ready to send updates? Is there any further documentation on this? It seems like before whatever update recently happened to realtime, subscriptions were instant... instead of taking 2-3 seconds to initialize.
@EpicDevv we're still discussing internally about how to handle this but for the time being you can do what @GaryAustin1 suggested:
mySubscription = SBclient.channel(......).on(.......).subscribe (//error handler)
mySubscription.socket.onMessage(message => {
console.log('MESSAGE', message)
if (message.event === 'system' && message.payload.message === 'subscribed to realtime') {
console.log('OK now we are really subscribed')
loadInitialTable()
}
})
If anyone is using @GaryAustin1's workaround, the message.payload.message is now Subscribed to Realtime
so the above should be
mySubscription = SBclient.channel(......).on(.......).subscribe (//error handler)
mySubscription.socket.onMessage(message => {
console.log('MESSAGE', message)
if (message.event === 'system' && message.payload.message === 'Subscribed to Realtime') {
console.log('OK now we are really subscribed')
loadInitialTable()
}
})
Looks like the message.payload.message changed again, this time to Subscribed to PostgreSQL
so the example should be
mySubscription = SBclient.channel(......).on(.......).subscribe (//error handler)
mySubscription.socket.onMessage(message => {
console.log('MESSAGE', message)
if (message.event === 'system' && message.payload.message === 'Subscribed to PostgreSQL') {
console.log('OK now we are really subscribed')
loadInitialTable()
}
})
FWIW I have implemented a 3 second delay instead of checking for a particular message as it appears to be undergoing rapid changes.
If anyone ends up reading these comments in the future, please check and verify what the correct message.payload.message is, as I will have stopped checking :)
@cbetz Sigh on name change again with no notice.
3 seconds is not enough... 4.4 and 3.2 seconds in a two-run sample.
This new approach of the realtime server, with a 4-or-more-second delay before a real subscription, really puts a crimp in my approach of using realtime and loading data from the monitored table once initialization is complete.
You would like to be able to turn on realtime and use the really connected signal to then load the existing table data you need (like last 10 chat room messages) to populate your screen. Then updates from realtime show as needed.
This delay now means the screen will sit in loading mode for many seconds before old messages show up.
You are almost forced to load the initial data with a select, start realtime, and then do another select to see if any data changed during the 4 seconds realtime was setting up.
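To make that workaround concrete, here is a minimal sketch of the select → subscribe → re-select pattern (not code from this thread; the client wiring and the loadPosts/onRows names are hypothetical, and only the merge helper is concrete logic):

```javascript
// Merge a refetched snapshot into the rows we already hold, keyed by id,
// so rows inserted during realtime setup are neither lost nor duplicated.
function mergeById(existing, refetched) {
  const byId = new Map(existing.map((row) => [row.id, row]));
  for (const row of refetched) byId.set(row.id, row); // refetched wins
  return [...byId.values()];
}

// Hypothetical wiring (not invoked here), assuming a supabase-js v2 client:
// 1) select initial data, 2) start realtime, 3) re-select to catch the gap.
async function initPostsWithCatchUp(client, onRows) {
  let rows = (await client.from('posts').select('*')).data ?? [];
  onRows(rows);

  client
    .channel('public:posts')
    .on('postgres_changes', { event: 'INSERT', schema: 'public', table: 'posts' },
        (payload) => onRows((rows = mergeById(rows, [payload.new]))))
    .subscribe();

  // Second select: pick up anything inserted while realtime was setting up.
  const again = (await client.from('posts').select('*')).data ?? [];
  onRows((rows = mergeById(rows, again)));
}
```

The merge by primary key is what keeps the double select safe: rows seen by both the refetch and the subscription collapse to one entry.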
@cbetz Sigh on name change again with no notice. 3 seconds is not enough... 4.4 and 3.2 seconds in a two-run sample.
Thank you for the heads up!
Hey all ... I've been changing the system messages. Sorry! I didn't realize people were matching on that and assumed that if you were, you'd match on the status.
It's in an effort to make them an official thing, so we can make it much more obvious to clients what's actually happening on the backend.
It really should not take a few seconds to subscribe to a healthy database. We'll get some observability around this and figure out what's going on.
Update: this should be much better in 99% of connections! Just pushed to prod.
If zero users are listening for db changes it will take a couple seconds, as the connection information needs to propagate across our global cluster.
But if we already have a connection somewhere globally, your client should be able to start getting changes within a few hundred milliseconds of joining:
This is obviously a big deal though and we will do better. Making these system messages officially supported in the client is a priority. Making the client be "ready" based on that by default is important. And we'll work on making the first connection faster also.
Thanks for all of this context, @chasers – we're running into a situation where the subscription isn't truly returning events until the second message containing the message "Subscribed to PostgreSQL" comes in. This is our approach to managing status:
import { useSessionContext } from "@supabase/auth-helpers-react"
import { useEffect, useState } from 'react'
import { devConsoleLog } from '../errorHandlerNew';
import { useStateRef } from "./useStateRef";
const defaultCB = (payload) => {
devConsoleLog(['Received realtime event', payload]);
}
export function useSubscription(
callback = defaultCB,
event = "*",
table = "*",
filter = null
) {
const { supabaseClient } = useSessionContext();
const [isReady, setIsReady] = useState(false)
// note: magic string, subject to breakage by updating
// see https://github.com/supabase/realtime/issues/282
const correctMessage = "Subscribed to PostgreSQL"
const [currentCorrectMessageCount, setCurrentCorrectMessageCount, correctMessageCountRef] = useStateRef(0)
useEffect(function onCorrectCount() {
if (currentCorrectMessageCount >= 2) {
setIsReady(true)
}
}, [currentCorrectMessageCount])
const onMessageReceived = (e) => {
let parsed
try {
parsed = JSON.parse(e?.data)
} catch {
return // ignore non-JSON frames
}
if (parsed?.payload?.message === correctMessage) {
setCurrentCorrectMessageCount(correctMessageCountRef.current + 1)
}
}
useEffect(() => {
const sub = supabaseClient
.channel(table === "*" ? "public" : `public:${table}`)
.on('postgres_changes', {
schema: 'public',
table,
event,
filter
}, (payload) => {
callback(payload)
})
.subscribe()
sub.socket.conn.addEventListener('message', onMessageReceived)
return () => {
sub.socket.conn.removeEventListener('message', onMessageReceived)
sub.unsubscribe();
}
}, []);
return {isReady}
}
Just wanted to drop this in to (1) hopefully get feedback on any better approaches, (2) help anyone else running into this when working in supabase-js and auth-helpers-react, and (3) ask if there's any progress on speeding up that first connection 😄
Really appreciate all that the Supabase team is doing!
we're running into a situation where the subscription isn't truly returning events until the second message containing the message "Subscribed to PostgreSQL" comes in
Yes, this is by design now because there are other services we're offering through Supabase Realtime. We want the websocket to connect so it can send/receive messages for things other than the db.
With the latest realtime-js 2.1.0 you can now do:
channel.on('system', {}, payload => {
if (payload.extension === 'postgres_changes' && payload.status === 'ok') {
console.log('Do something crazy!');
}
})
The system messages now have an extension key, so you can match on that and the status, vs a message which we'd like to potentially change at will to make them more informative as we get feedback.
Note: we'll be pushing this update to the cluster likely Monday or Tuesday next week, and will update here when that happens. It seems this shouldn't break anything for people in this thread at least. We'll also be publishing some docs on this shortly.
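If it helps, the check in that snippet can be factored into a small predicate and reused wherever the system messages are handled. A sketch against the payload shape described above (extension and status keys); the watchReadiness wiring and loadInitialTable name are hypothetical:

```javascript
// Predicate over the realtime-js 2.1.0 system-message payload shape
// (extension + status keys) described above.
function isPostgresChangesReady(payload) {
  return payload?.extension === 'postgres_changes' && payload?.status === 'ok';
}

// Hypothetical wiring (not invoked here): gate the initial table load on
// readiness, and guard against the message arriving more than once.
function watchReadiness(channel, loadInitialTable) {
  let loaded = false;
  channel.on('system', {}, (payload) => {
    if (isPostgresChangesReady(payload) && !loaded) {
      loaded = true; // only load once, even if the message repeats
      loadInitialTable();
    }
  });
}
```

Matching on extension and status rather than the free-text message should survive future wording changes, which was the point of the 2.1.0 change.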
Also:
Just want to confirm this is normal now... Getting the postgres_changes on every heartbeat, plus the two up front.
@GaryAustin1 hmm, I don't think it should be doing that. Can you check your Realtime logs or dm me your project id?
Actually, double checked that's all fine, although it feels like it should really send just one when it connects.
@chasers we've been using the hook I posted for a couple weeks now and all seems fine (other than this local issue but that's unrelated to the delay). The realtime inspector has been helpful, thanks for sharing that.
We'll update our hook to use payload.extension and payload.status soon and will report back.
I made a few changes a while back that should have improved this quite a bit. Closing for now. How has this been for you all?
.on('system', {}, payload => {
if (payload.extension === 'postgres_changes' && payload.status === 'ok') {
console.log('Do something crazy!');
}
})
Thanks for the updates @chasers, sorry to revive this, but I still think that many applications currently assume that status === 'SUBSCRIBED' inside the .subscribe callback of realtime-js can be used to refetch data for a valid state – which is wrong, because it's necessary to instead listen to the system message, as you described.
Maybe I missed something inside the docs, but it would probably be a good idea to give a full-blown example there. With the system message listener approach, there are also currently a few hurdles:
1. The event: system, extension: postgres_changes message is fired twice on the initial subscription. It's necessary for each dev to implement something to skip the first event (like in the code block from @jchekanoff).
2. The event: system, extension: postgres_changes message is fired on every heartbeat. This seems unexpected, as I was never unsubscribed. To correctly implement connection recovery, I therefore not only need to listen to this event, but also check if another status event for the disconnect was fired before.
If there's another issue to read up on this or maybe even some docs floating around somewhere, I'd be happy to be redirected and leave this be 👀 Thanks for all the hard work, I really enjoy using supabase with realtime!
This implementation is working fine for me and handles both disconnects (due to the network or the tab being in the background) and also handles the initial subscription.
It would be very nice if we could add the system type to the TypeScript definitions of realtime, so that we don't have to cast everything as any. This was written on supabase-js 2.26.0 and supabase-cli 1.75.3.
export function syncDbChanges() {
let initialMessageSkipped = false;
let previouslyDisconnected = true;
supabase
.channel('*')
.on('postgres_changes', { event: '*', schema: '*' }, (payload) => {
// Invalidate specific entity / table based on payload
})
.on('system' as any, {} as any, (payload: any) => {
if (payload.extension !== 'postgres_changes' || payload.status !== 'ok') {
return;
}
if (!initialMessageSkipped) {
initialMessageSkipped = true;
return;
}
if (previouslyDisconnected) {
// Invalidate all data because we could not track changes while we weren't connected
previouslyDisconnected = false;
}
})
.subscribe((status) => {
if (status === 'SUBSCRIBED') {
previouslyDisconnected = true;
}
});
}
@NiklasPor
The event: system, extension: postgres_changes message is fired every heartbeat
This is fixed actually. The re-subscribe is triggered with realtime.setAuth on the client, because after the heartbeat the client sends the "access_token" message with the current JWT it has. As of a couple weeks ago, if the JWTs are different it will resubscribe; otherwise it doesn't do anything.
JFYI, Server-side we do shutdown the channel when the JWT expires.
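A sketch of the client-side behaviour described above, assuming a supabase-js v2 client with realtime.setAuth; the keepRealtimeAuthFresh wiring is hypothetical, and only the token-comparison helper is concrete logic:

```javascript
// Mirror the behaviour described above: same token is a no-op,
// a different (present) token triggers a resubscribe.
function shouldResubscribe(currentJwt, incomingJwt) {
  return Boolean(incomingJwt) && currentJwt !== incomingJwt;
}

// Hypothetical wiring (not invoked here): keep Realtime's JWT fresh so the
// server doesn't shut the channel down when the token expires.
function keepRealtimeAuthFresh(supabase) {
  let lastToken = null;
  supabase.auth.onAuthStateChange((_event, session) => {
    const token = session?.access_token;
    if (shouldResubscribe(lastToken, token)) {
      lastToken = token;
      supabase.realtime.setAuth(token); // pushes the new JWT over the socket
    }
  });
}
```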
The event: system, extension: postgres_changes message is fired twice on the initial subscription
Hmm, I'm not seeing this. If there are some connection details you can share privately via support that could help? Or a repo which replicates this behavior?
It would be very nice if we could add the system type to the TypeScript definitions of realtime
@w3b6x9 is going to add this!
I found this issue while searching for the reason I get:
tried to push 'presence' to 'realtime:room:7da34ed8-649b-4107-9b73-3b1ddb118daf' before joining. Use channel.subscribe() before pushing events
after the status SUBSCRIBED is received. I'm guessing I need to add listeners for the system messages for this to work properly?
Bug report
Describe the bug
I'm experimenting with building a very simple realtime chat application in Vue 3. I'm using a subscription to monitor my posts table on INSERT to keep the chat log up to date. However, if I submit my send form immediately after a browser refresh, the subscription doesn't register the insert even though I can confirm the post was inserted. If I wait a few seconds after refresh, then submit a post, the subscription registers the insert. There seems to be some kind of initialization time/delay for the subscription to become active.
To Reproduce
Steps to reproduce the behavior, please provide code snippets or a repository:
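A minimal sketch of the race described above, assuming a public.posts table and a supabase-js v2 client (a reconstruction from the description, not a verbatim repro from the reporter):

```javascript
// Hypothetical repro sketch (not invoked here): subscribe, then insert
// immediately. If the insert lands before Postgres changes are ready
// server-side, the INSERT callback never fires for it.
function reproduceRace(client) {
  client
    .channel('public:posts')
    .on('postgres_changes', { event: 'INSERT', schema: 'public', table: 'posts' },
        (payload) => console.log('insert seen', payload.new))
    .subscribe((status) => {
      if (status === 'SUBSCRIBED') {
        // This fires almost immediately, but realtime may not be tracking
        // yet, so an insert right now can be missed.
        client.from('posts').insert({ body: 'hello' });
      }
    });
}
```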
Expected behavior
I expect the subscription to either initialize more quickly or provide some kind of callback that can let me know when it's ready so I can lock my form until then.