When a site is behind basic auth, the resulting static site contains the failed-authorization error page in place of every page's content.
To Reproduce
Steps to reproduce the behavior:
Environment (please complete the following information):
Additional context
From what I could find, the issue is with the headers. I found the following:
The plugin uses code similar to this in Crawler, SitemapParser, and DetectSitemapsURLs:
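Based on the description below, the problematic pattern looks roughly like this (a sketch, not the plugin's exact code; variable names are illustrative):

```php
<?php
use GuzzleHttp\Psr7\Request;

$url = 'https://example.com/';

// This array is in Guzzle *client options* format...
$headers = [ 'auth' => [ 'user', 'pass' ] ];

// ...but the PSR-7 Request constructor expects real HTTP headers
// here, so no Authorization header is ever sent with the request.
$request = new Request( 'GET', $url, $headers );
```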
`new Request()` expects actual headers, e.g. `Authorization: Basic ###`, but the code above is formatted for client options. Using `$response = $this->client->request( 'GET', $url, $headers );` works because the expected third argument is `$options`, and the format of `$headers` is `['auth' => ['user', 'pass']]`.

In my local copy I was able to get it working by making those changes and also changing DetectSitemapsURLs::detect() to pass the Guzzle options to SitemapParser.
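For reference, a minimal sketch of the two working approaches (assuming Guzzle; the URL and credentials are placeholders):

```php
<?php
use GuzzleHttp\Client;
use GuzzleHttp\Psr7\Request;

$url = 'https://example.com/';

// Option A: let Guzzle build the Authorization header. The third
// argument to Client::request() is $options, so the 'auth' key is
// interpreted correctly.
$client   = new Client();
$response = $client->request( 'GET', $url, [ 'auth' => [ 'user', 'pass' ] ] );

// Option B: when constructing a PSR-7 Request directly, the header
// must be built by hand, since the constructor takes real headers.
$request = new Request( 'GET', $url, [
    'Authorization' => 'Basic ' . base64_encode( 'user:pass' ),
] );
```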
These changes seemed to work fine on one site, but on another site the plugin kept failing, saying wp-sitemap.xml was not found. So I didn't feel comfortable submitting a pull request, but I hope this helps.