supabase / auth

A JWT-based API for managing users and issuing JWT tokens
https://supabase.com/docs/guides/auth
MIT License

Magic Links are invalidated by corporate link scanning software #1214

Open · kav opened this issue 1 year ago

kav commented 1 year ago

Bug report

Describe the bug

We have a customer using the Barracuda SafeLinks platform, and Office 365 apparently offers something similar: it wraps every link in incoming email and scans the target before the user opens it. This invalidates Magic Links, because the scanner's visit consumes the single-use link, so by the time the user clicks it the link has already expired.

Next Auth runs into the same scenario and seems to recommend, as a fix, responding to HEAD requests with a 200 without invalidating the link. See https://next-auth.js.org/tutorials/avoid-corporate-link-checking-email-provider for more.
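For illustration, here is a minimal sketch (plain Node + TypeScript) of the behaviour that tutorial describes: answer HEAD probes with a 200 and only consume the single-use token on a real GET. The `/auth/confirm` path and the `consumeMagicLinkToken` helper are made up for the example, not Supabase APIs.

```ts
import { createServer } from "node:http";

// Hypothetical stand-in for whatever verifies and invalidates the single-use token.
async function consumeMagicLinkToken(token: string): Promise<boolean> {
  return token.length > 0; // this sketch accepts any non-empty token
}

const server = createServer(async (req, res) => {
  const url = new URL(req.url ?? "/", `http://${req.headers.host ?? "localhost"}`);

  if (url.pathname !== "/auth/confirm") {
    res.writeHead(404).end();
    return;
  }

  // Link scanners (Barracuda, Defender Safe Links, ...) often probe with HEAD:
  // acknowledge them with a 200 and leave the token untouched.
  if (req.method === "HEAD") {
    res.writeHead(200).end();
    return;
  }

  // Only a real GET from the user consumes (and thereby invalidates) the token.
  const token = url.searchParams.get("token") ?? "";
  const ok = await consumeMagicLinkToken(token);
  res.writeHead(ok ? 302 : 401, ok ? { Location: "/app" } : undefined);
  res.end();
});

server.listen(3000);
```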

To Reproduce

This is a bit difficult to reproduce, as you likely need at least an Office 365 tenant with MS Defender configured to test a relevant scenario.

Expected behavior

Ideally the user doesn't see a link expired error and is able to use the link successfully.

Additional context

As a workaround, organizations with access to their IT security policy can exclude Supabase links from scanning, at least on the Microsoft platform, via https://learn.microsoft.com/en-us/microsoft-365/security/office-365-security/safe-links-about?view=o365-worldwide#do-not-rewrite-the-following-urls-lists-in-safe-links-policies

activenode commented 10 months ago

That's not a bug. I do find the proposed solution helpful, but IMO that still makes this a feature request rather than a bug. The suggested fix also wouldn't cover other crawlers that visit the page with a full GET rather than just a HEAD request.

In either case you can build your own Magic Link system to solve it. Via generateLink you get the token you need to call verifyOTP and log the user in, so there's no specific need to use the Supabase-generated links.
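As a rough sketch of that approach with @supabase/supabase-js: the env var names, the example.com confirmation URL, and the sendEmail helper below are assumptions for the example, not Supabase APIs.

```ts
import { createClient } from "@supabase/supabase-js";

const admin = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

async function sendCustomMagicLink(email: string) {
  // generateLink creates the token without sending the default Supabase email.
  const { data, error } = await admin.auth.admin.generateLink({
    type: "magiclink",
    email,
  });
  if (error || !data.properties) throw error ?? new Error("no link generated");

  // Email a link to your own page instead of the Supabase verify endpoint.
  // A scanner fetching that page consumes nothing, because the token is only
  // redeemed when the page calls verifyOtp({ token_hash, type: "magiclink" })
  // after an explicit user action.
  const tokenHash = data.properties.hashed_token;
  const confirmUrl = `https://example.com/login/confirm?token_hash=${tokenHash}`;
  await sendEmail(email, confirmUrl);
}

async function sendEmail(to: string, link: string) {
  // Placeholder: plug in your own email provider here.
  console.log(`Would send ${link} to ${to}`);
}
```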

tttp commented 9 months ago

Hi,

HTTP safe methods (GET/HEAD) should be idempotent, i.e. the effect on the server (here, logging the user in) of a single request should be the same as the effect of making several identical requests.

Arguably, a "single use" link consumed by an HTTP GET is itself a bug; it should be implemented by requiring a second action (e.g. a click or JS) on the page behind that URL.

Allowing multiple HTTP HEAD calls could therefore be seen as fixing a bug (HEAD would become idempotent), and it would solve the problem for Office 365 users.

Do you see any potential issues with changing the way HEAD is handled? It feels safe, but I might be missing something.

I'm aware it would "only" fix the Office 365 crawler and not others, but given the market share it would remove the pain for a lot of users, AND the odds of other crawlers behaving like Office 365 (HEAD, not GET) are rather high.
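To illustrate the "second action" idea above, here is a browser-side sketch using @supabase/supabase-js, where loading the page does nothing destructive and verifyOtp only runs when the user clicks a confirm button. The element id, query parameter name, project URL, key, and redirect paths are made up for the example.

```ts
import { createClient } from "@supabase/supabase-js";

const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR_ANON_KEY");

// Reading the token from the URL has no side effects; a scanner that merely
// fetched this page has not invalidated anything.
const tokenHash = new URLSearchParams(window.location.search).get("token_hash");

document.getElementById("confirm-login")?.addEventListener("click", async () => {
  if (!tokenHash) return;
  // Only now is the single-use token consumed.
  const { error } = await supabase.auth.verifyOtp({
    token_hash: tokenHash,
    type: "magiclink",
  });
  window.location.href = error ? "/login?error=expired" : "/app";
});
```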