supabase / auth

A JWT based API for managing users and issuing JWT tokens
https://supabase.com/docs/guides/auth
MIT License

Magic Links are invalidated by corporate link scanning software #1214

Open kav opened 1 year ago

kav commented 1 year ago

Bug report

Describe the bug

We have a customer using the Barracuda SafeLinks platform (Office 365 apparently offers something similar), which rewrites every link in incoming email and scans it before the user opens it. This invalidates Magic Links: the scanner's visit triggers expiration, so the link is already expired by the time the user clicks through.

Next Auth has run into the same scenario and seems to recommend responding to HEAD requests with a 200 without invalidating the link. See https://next-auth.js.org/tutorials/avoid-corporate-link-checking-email-provider for more.

To Reproduce

This is a bit difficult to reproduce, as you likely need to configure MS Defender on Office 365 at minimum to test a relevant scenario.

Expected behavior

Ideally the user doesn't see a link expired error and is able to use the link successfully.

Additional context

For those with access to the organization's IT security policy, scanning of Supabase links can be disabled, at least on the Microsoft platform, via https://learn.microsoft.com/en-us/microsoft-365/security/office-365-security/safe-links-about?view=o365-worldwide#do-not-rewrite-the-following-urls-lists-in-safe-links-policies

activenode commented 1 year ago

That's not a bug. I do find the proposed solution helpful, but it still doesn't make this a bug IMO; it's a feature request. The suggested fix also wouldn't handle crawlers that visit the page with a full GET rather than just a HEAD request.

In either case you can build your own Magic Link system to work around it. Via generateLink you get the token you need to call verifyOTP and log the user in, so there's no strict need to use the Supabase-generated links.
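
As a rough sketch of that flow (assuming supabase-js v2; the mailer, URLs, and function names are placeholders, not a drop-in implementation):

```ts
import { createClient, type SupabaseClient } from '@supabase/supabase-js'

// Server-side client with the service role key (required for auth.admin.generateLink).
const admin = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
)

// 1. Generate the link yourself instead of letting Supabase send the built-in email.
async function sendCustomMagicLink(email: string) {
  const { data, error } = await admin.auth.admin.generateLink({
    type: 'magiclink',
    email,
  })
  if (error) throw error

  // data.properties carries the hashed token; put it into a link to your own app
  // instead of the default Supabase verify URL.
  const tokenHash = data.properties.hashed_token
  await sendEmail(email, `https://app.example.com/login?token_hash=${tokenHash}`)
}

// 2. On your login page, exchange the token for a session (ideally only after an
//    explicit user action, so a scanner's visit alone doesn't burn it).
async function completeLogin(client: SupabaseClient, tokenHash: string) {
  const { data, error } = await client.auth.verifyOtp({
    type: 'email',
    token_hash: tokenHash,
  })
  if (error) throw error
  return data.session
}

// Placeholder mailer: wire this up to whatever you actually use to send email.
async function sendEmail(to: string, link: string): Promise<void> {
  console.log(`send to ${to}: ${link}`)
}
```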

tttp commented 1 year ago

Hi,

HTTP safe calls (GET/HEAD) should be idempotent, i.e. the effect on the server (logging the user in) of a single request is the same as the effect of making several identical requests.

Arguably, a "single use" link (HTTP GET) is a bug; it should be implemented by requiring a second action (e.g. a click or JS) on the page that URL opens.

Allowing multiple HTTP HEAD calls could therefore be seen as fixing a bug (HEAD would be idempotent) and would solve the problem for Office 365 users.

Would you see any potential issues with changing the way HEAD is handled? It feels safe, but I might be missing something.

I'm aware it would "only" fix the Office 365 crawler and not others, but given the market share it would solve the pain for a lot of users, AND the odds of other crawlers following how 365 works (HEAD, not GET) are rather high.
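
To illustrate the behavior I mean (not how GoTrue currently works, just an Express-style sketch of a verification endpoint where the token is only consumed on GET, never on HEAD; consumeOneTimeToken is a hypothetical helper):

```ts
import express from 'express'

// Hypothetical helper that verifies the token and burns it; stands in for the real exchange.
async function consumeOneTimeToken(tokenHash: string): Promise<void> {
  if (!tokenHash) throw new Error('missing token')
  // ... look up the token, check expiry, mark it used, create the session ...
}

const app = express()

// Link scanners probe with HEAD: answer 200 and leave the token untouched.
// Registered before the GET route so Express doesn't fall back to the GET handler.
app.head('/auth/verify', (_req, res) => {
  res.sendStatus(200)
})

// Only a real GET (a user actually following the link) consumes the one-time token.
app.get('/auth/verify', async (req, res) => {
  const tokenHash = String(req.query.token_hash ?? '')
  try {
    await consumeOneTimeToken(tokenHash)
    res.redirect('/logged-in')
  } catch {
    res.status(400).send('Link expired or already used')
  }
})

app.listen(3000)
```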

ElvinKyungu commented 2 months ago

I also have the same problem; once it's solved I won't hesitate to share the solution.

kav commented 2 months ago

We switched to email codes and a prefilled form when you click the link. It's a bit less convenient for users, since they have to submit the page they land on, but because of that one extra step it doesn't break with link scanners. Link scanners are becoming more prevalent with our users at least, so this is likely to become a bigger problem for folks.
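
Roughly, the landing page does something like this (sketch only, assuming supabase-js v2 and that the email link carries the email and code as query parameters; the element ids and key placeholders are illustrative):

```ts
import { createClient } from '@supabase/supabase-js'

// Browser client with the anon key; both values are placeholders here.
const supabase = createClient('https://YOUR-PROJECT.supabase.co', 'YOUR_ANON_KEY')

// The email links to e.g. /confirm?email=...&code=..., so the form can be prefilled.
// Nothing is consumed until the user actually submits the form.
const params = new URLSearchParams(window.location.search)
const emailInput = document.querySelector<HTMLInputElement>('#email')!
const codeInput = document.querySelector<HTMLInputElement>('#code')!
emailInput.value = params.get('email') ?? ''
codeInput.value = params.get('code') ?? ''

document.querySelector<HTMLFormElement>('#confirm-form')!.addEventListener('submit', async (e) => {
  e.preventDefault()
  // Exchange the emailed code for a session; this is the step a link scanner never reaches.
  const { error } = await supabase.auth.verifyOtp({
    email: emailInput.value,
    token: codeInput.value,
    type: 'email',
  })
  if (error) {
    alert(error.message)
  } else {
    window.location.assign('/dashboard')
  }
})
```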

tttp commented 2 months ago

Hi @kav,

We did almost the same, but with a bit of JavaScript to submit (click the button) automatically, and it seems to work fine so far; none of the bots are running JS.

Last time I checked, the bots seem to do a HEAD rather than a GET call, so that would be another reason they don't automatically click the button.

I still hope that magic links will discard HEAD requests instead of processing them as a GET and invalidating the link, but the email code + JavaScript approach seems to be a fine replacement.
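
The only addition on top of a prefilled confirmation page like the one sketched above is an automatic submit, e.g. (the form id is illustrative):

```ts
// Same kind of prefilled confirmation page as above, plus an automatic submit.
// Scanners only issue HEAD requests or don't execute JS, so the code is still
// only consumed in a real browser.
window.addEventListener('DOMContentLoaded', () => {
  document.querySelector<HTMLFormElement>('#confirm-form')?.requestSubmit()
})
```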