haasn opened this issue 8 years ago

If I connect to some website that requires authentication (.htaccess style) in most browsers, it will remember that password for the "session", i.e. if I click on a link from within this tab it won't prompt me a second time.

In qutebrowser, it will ask for the password again for every single link I open, which is rather bothersome on some websites.

There is some concern about the meaning of a "session" (since it shouldn't be too long-lived, otherwise it would defeat the purpose for users who like to keep their browsers running for weeks on end), so I think a good way to approach it would be something like this: remember the credentials once entered, but only for a bounded time or while a tab using them is still open.

Some obvious security caveats apply (e.g. the remembered credentials should only ever be re-used for websites on the same full domain), but it would still be worth it IMO.
Also, usually when there's basic auth, other assets like `.js` or `.css` files are also behind it, or in my case the web page does RPC calls to the backend, and there's no way to enter the authentication data there; qutebrowser is completely unusable.
I'd argue that this is not a low-priority task in this case, because it makes most websites with basic auth unusable.
> the web page does RPC calls to the backend, and there's no way to enter the authentication data
What kind of RPC? I tested with an XHR GET request for an asset, both when the entire domain required basic auth and when just the directory the asset was in required auth, and I was prompted both times. Can you check in your web console (bound to `wi` by default) and see if there are error messages that look like:

`Failed to load https://some.other.domain/blah/blah/: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://primary.domain' is therefore not allowed access. The response had HTTP status code 401.`
Because in the case of a CORS violation, the Chromium engine doesn't bother asking for credentials.
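For reference, the kind of test I mean could look like this in the web console (a minimal sketch; the URL is a placeholder, not one from this thread):

```js
// Request a basic-auth-protected asset via XHR; the browser should prompt
// for credentials (or reuse cached ones) when the server answers 401.
const xhr = new XMLHttpRequest();
xhr.open('GET', 'https://example.com/protected/asset.css'); // placeholder URL
xhr.onload = () => console.log('status:', xhr.status);
xhr.onerror = () => console.log('network/CORS error'); // CORS failures land here
xhr.send();
```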
Another workaround is to use a `~/.netrc` file to provide the credentials, so you won't be prompted anymore for the sites you specified there.
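For example, an entry like this in `~/.netrc` (hostname and credentials are placeholders):

```
machine example.com
login myuser
password mypassword
```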
Edit: it seems like `~/.netrc` is not used if it is a symlink.
@toofar So, here is my case: `home.utdemir.com` is protected with basic auth. There's a page with some images and links; these load successfully after auth. But following one of the links just returns `401 Unauthorized` from the server, without asking for anything. Also, to make things more complicated, I think the link which doesn't work is using soft navigation: https://github.com/pldubouilh/gossa/blob/master/src/script.js#L31
Both Chromium and Firefox work in this case.
Is there any chance you can make a minimal reproducer with nginx? Like, how are you enforcing authentication, and how are you reverse-proxying back to gossa?
Hey @toofar,
I set up a subdomain reproducing the issue; it isn't really minimal, but you can still look around. I'll keep it up until you finish investigating.
url: test.utdemir.com
user: test
pass: qutebrowserrocks
Here is the relevant `server` block on nginx:
```nginx
server {
    listen 80;
    server_name test.utdemir.com;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_pass http://localhost:8002;

        auth_basic "Speak Friend and Enter";
        auth_basic_user_file /path/to/htpasswd;

        proxy_connect_timeout 300;
        proxy_send_timeout 300;
        proxy_read_timeout 300;
        send_timeout 300;
    }
}
```
And gossa is running as `gossa -h 127.0.0.1 -p 8002 /home/test`.
Thank you very much for spending time on this!
This appears to be simply because that application is using `fetch()`. Switching that to use `XMLHttpRequest` works fine.
This page describing the `fetch()` API says:

> By default, fetch won't send or receive any cookies from the server, resulting in unauthenticated requests if the site relies on maintaining a user session (to send cookies, the credentials init option must be set).
It looks like someone decided to treat basic auth the same as cookies when they first implemented it.
Changing that fetch call to be `fetch(href, {credentials: 'include'})` (from here) makes it work as well. I tested it with Chromium 67 and it didn't work (as is), but with google-chrome 70 it did work. I suspect you (@utdemir) are on Qt 5.11, because that is based on Chromium 65. Qt 5.12 is based on 69, and it seems to work just fine there. So it looks like it'll be fixed when you upgrade (or you can just compile that program yourself with that credentials arg added).
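For illustration, the patched call amounts to something like this (a minimal sketch, not gossa's actual code; `href` is a placeholder link target):

```js
const href = '/some/protected/page'; // placeholder link target
// Without the credentials option, older Chromium versions sent fetch()
// requests without the basic-auth Authorization header, so the server
// answered 401.
fetch(href, { credentials: 'include' })
  .then(resp => {
    console.log('status:', resp.status); // 200 once credentials are sent
    return resp.text();
  })
  .then(body => console.log(body));
```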
I'll mark this exchange as off-topic now, because it turns out to be a different issue.
@toofar Thank you very much for your response. Got it. Since my distribution does not have Qt 5.12 yet, I'll patch the application.
Thanks again, have a great day.