Closed: dboakes closed this issue 3 years ago
I am experiencing the same issue with a private GitHub repo. My question is: why is my auto-update even attempting to check AWS S3? Nowhere in my configuration am I publishing to AWS S3. Here is the error message I received:
Checking for update
Error: HttpError: 400
"<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>InvalidArgument</Code><Message>Only one auth mechanism allowed; only the X-Amz-Algorithm query parameter, Signature query string parameter or the Authorization header should be specified</Message><ArgumentName>Authorization</ArgumentName><ArgumentValue>token <GITHUB_TOKEN></ArgumentValue><RequestId>8CEA7569D23AFA24</RequestId><HostId>yARBtPCpNEB6aQ4b7PiSQbix/MSHeXr7/HN7ZhlAElWsnJpoxwYpCoCmfJpCcHJJSHeCbhlh43w=</HostId></Error>"
Headers: {
  "status": "400",
  "x-amz-request-id": "8CEA7569D23AFA24",
  "x-amz-id-2": "yARBtPCpNEB6aQ4b7PiSQbix/MSHeXr7/HN7ZhlAElWsnJpoxwYpCoCmfJpCcHJJSHeCbhlh43w=",
  "content-type": "application/xml",
  "server": "AmazonS3",
  "accept-ranges": "bytes",
  "via": "1.1 varnish, 1.1 varnish",
  "date": "Thu, 11 Feb 2021 18:44:54 GMT",
  "x-served-by": "cache-dca12925-DCA, cache-mrs10565-MRS",
  "x-cache": "MISS, MISS",
  "x-cache-hits": "0, 0",
  "strict-transport-security": "max-age=31536000",
  "x-fastly-request-id": "444c3542b3369d5cc49bacfa2f63848eed31db77"
}
    at d (C:\Users\DouglassHock\AppData\Local\Programs\reportapp\resources\app.asar\background.js:2:233011)
    at IncomingMessage.<anonymous> (C:\Users\DouglassHock\AppData\Local\Programs\reportapp\resources\app.asar\background.js:2:235343)
    at IncomingMessage.emit (events.js:310:20)
    at endReadableNT (_stream_readable.js:1187:12)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)
Unhandled Rejection at: Promise {
  <rejected> m [HttpError]: 400
  "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>InvalidArgument</Code><Message>Only one auth mechanism allowed; only the X-Amz-Algorithm query parameter, Signature query string parameter or the Authorization header should be specified</Message><ArgumentName>Authorization</ArgumentName><ArgumentValue>token <GITHUB_TOKEN></ArgumentValue><RequestId>8CEA7569D23AFA24</RequestId><HostId>yARBtPCpNEB6aQ4b7PiSQbix/MSHeXr7/HN7ZhlAElWsnJpoxwYpCoCmfJpCcHJJSHeCbhlh43w=</HostId></Error>"
  Headers: {
    "status": "400",
    "x-amz-request-id": "8CEA7569D23AFA24",
    "x-amz-id-2": "yARBtPCpNEB6aQ4b7PiSQbix/MSHeXr7/HN7ZhlAElWsnJpoxwYpCoCmfJpCcHJJSHeCbhlh43w=",
    "content-type": "application/xml",
    "server": "AmazonS3",
    "accept-ranges": "bytes",
    "via": "1.1 varnish, 1.1 varnish",
    "date": "Thu, 11 Feb 2021 18:44:54 GMT",
    "x-served-by": "cache-dca12925-DCA, cache-mrs10565-MRS",
    "x-cache": "MISS, MISS",
    "x-cache-hits": "0, 0",
    "strict-transport-security": "max-age=31536000",
    "x-fastly-request-id": "444c3542b3369d5cc49bacfa2f63848eed31db77"
  }
    at d (C:\Users\DouglassHock\AppData\Local\Programs\reportapp\resources\app.asar\background.js:2:233011)
    at IncomingMessage.<anonymous> (C:\Users\DouglassHock\AppData\Local\Programs\reportapp\resources\app.asar\background.js:2:235343)
    at IncomingMessage.emit (events.js:310:20)
    at endReadableNT (_stream_readable.js:1187:12)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)
  {
    statusCode: 400,
    description: '<?xml version="1.0" encoding="UTF-8"?>\n' +
      '<Error><Code>InvalidArgument</Code><Message>Only one auth mechanism allowed; only the X-Amz-Algorithm query parameter, Signature query string parameter or the Authorization header should be specified</Message><ArgumentName>Authorization</ArgumentName><ArgumentValue>token <GITHUB_TOKEN></ArgumentValue><RequestId>8CEA7569D23AFA24</RequestId><HostId>yARBtPCpNEB6aQ4b7PiSQbix/MSHeXr7/HN7ZhlAElWsnJpoxwYpCoCmfJpCcHJJSHeCbhlh43w=</HostId></Error>',
    code: 'HTTP_ERROR_400'
  }
} reason: HttpError: 400
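For anyone wondering why S3 shows up at all: GitHub serves private release-asset downloads via a redirect to a pre-signed S3 URL, and the updater forwards its GitHub Authorization header along that redirect, which S3 rejects because the signed URL already carries its own credentials in the query string. Below is a minimal Node sketch of that failing pattern, in case you want to reproduce it outside the app; OWNER, REPO, ASSET_ID, and TOKEN are placeholders, not values from this thread.

// Minimal reproduction sketch (Node, no extra dependencies).
// OWNER, REPO, ASSET_ID and TOKEN are placeholders for a private repo and a personal access token.
const https = require("https");

function get(url, headers) {
  return new Promise((resolve, reject) => {
    https.get(url, { headers }, res => {
      let body = "";
      res.on("data", chunk => (body += chunk));
      res.on("end", () => resolve({ status: res.statusCode, location: res.headers.location, body }));
    }).on("error", reject);
  });
}

(async () => {
  const assetUrl = "https://api.github.com/repos/OWNER/REPO/releases/assets/ASSET_ID";
  const authHeaders = { authorization: "token TOKEN", accept: "application/octet-stream", "user-agent": "repro" };

  // 1. GitHub answers the asset request with a 302 to a pre-signed URL (the auth lives in its query string).
  const first = await get(assetUrl, authHeaders);
  console.log(first.status, first.location);

  // 2a. Forwarding the GitHub token to that pre-signed URL reproduces the 400 "Only one auth mechanism allowed".
  const withToken = await get(first.location, authHeaders);
  console.log(withToken.status, withToken.body);

  // 2b. Following the same redirect WITHOUT the Authorization header succeeds.
  const withoutToken = await get(first.location, { "user-agent": "repro" });
  console.log(withoutToken.status);
})();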
Welcome to the club, buddy; we're all waiting for the developers to merge the PR with the fix for this issue. It's also not really their fault, it's more on the AWS side of things, because of some auth credentials or tokens.
Well, that explains why older versions of my application (which previously worked) are experiencing the same issue.
I figured out a solution for this, and it will also work for users who have already downloaded the app (screenshot added). You can make these changes in your pipeline:
////////////////////////////////////////////////////////////////////
"use strict";
Object.defineProperty(exports, "__esModule", { value: true }); exports.createHttpError = createHttpError; exports.parseJson = parseJson; exports.configureRequestOptionsFromUrl = configureRequestOptionsFromUrl; exports.configureRequestUrl = configureRequestUrl; exports.safeGetHeader = safeGetHeader; exports.configureRequestOptions = configureRequestOptions; exports.safeStringifyJson = safeStringifyJson; exports.DigestTransform = exports.HttpExecutor = exports.HttpError = void 0;
function _crypto() { const data = require("crypto");
_crypto = function () { return data; };
return data; }
var _debug2 = _interopRequireDefault(require("debug"));
var _fs = require("fs");
function _stream() { const data = require("stream");
_stream = function () { return data; };
return data; }
function _url() { const data = require("url");
_url = function () { return data; };
return data; }
function _CancellationToken() { const data = require("./CancellationToken");
_CancellationToken = function () { return data; };
return data; }
function _index() { const data = require("./index");
_index = function () { return data; };
return data; }
function _ProgressCallbackTransform() { const data = require("./ProgressCallbackTransform");
_ProgressCallbackTransform = function () { return data; };
return data; }
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
const debug = (0, _debug2.default)("electron-builder");
function createHttpError(response, description = null) {
return new HttpError(response.statusCode || -1, `${response.statusCode} ${response.statusMessage}`
+ (description == null ? "" : "\n" + JSON.stringify(description, null, " ")) + "\nHeaders: " + safeStringifyJson(response.headers), description);
}
const HTTP_STATUS_CODES = new Map([[429, "Too many requests"], [400, "Bad request"], [403, "Forbidden"], [404, "Not found"], [405, "Method not allowed"], [406, "Not acceptable"], [408, "Request timeout"], [413, "Request entity too large"], [500, "Internal server error"], [502, "Bad gateway"], [503, "Service unavailable"], [504, "Gateway timeout"], [505, "HTTP version not supported"]]);
class HttpError extends Error {
constructor(statusCode, message = `HTTP error: ${HTTP_STATUS_CODES.get(statusCode) || statusCode}`, description = null) {
super(message);
this.statusCode = statusCode;
this.description = description;
this.name = "HttpError";
this.code = `HTTP_ERROR_${statusCode}`;
}
}
exports.HttpError = HttpError;
function parseJson(result) { return result.then(it => it == null || it.length === 0 ? null : JSON.parse(it)); }
class HttpExecutor { constructor() { this.maxRedirects = 10; }
request(options, cancellationToken = new (_CancellationToken().CancellationToken)(), data) { configureRequestOptions(options); const encodedData = data == null ? undefined : Buffer.from(JSON.stringify(data));
if (encodedData != null) {
options.method = "post";
options.headers["Content-Type"] = "application/json";
options.headers["Content-Length"] = encodedData.length;
}
return this.doApiRequest(options, cancellationToken, it => {
it.end(encodedData);
});
}
doApiRequest(options, cancellationToken, requestProcessor, redirectCount = 0) {
if (debug.enabled) {
debug(`Request: ${safeStringifyJson(options)}`);
}
return cancellationToken.createPromise((resolve, reject, onCancel) => {
const request = this.createRequest(options, response => {
try {
this.handleResponse(response, options, cancellationToken, resolve, reject, redirectCount, requestProcessor);
} catch (e) {
reject(e);
}
});
this.addErrorAndTimeoutHandlers(request, reject);
this.addRedirectHandlers(request, options, reject, redirectCount, options => {
this.doApiRequest(options, cancellationToken, requestProcessor, redirectCount).then(resolve).catch(reject);
});
requestProcessor(request, reject);
onCancel(() => request.abort());
});
} // noinspection JSUnusedLocalSymbols // eslint-disable-next-line
addRedirectHandlers(request, options, reject, redirectCount, handler) {
// not required for NodeJS
}
addErrorAndTimeoutHandlers(request, reject) { this.addTimeOutHandler(request, reject); request.on("error", reject); request.on("aborted", () => { reject(new Error("Request has been aborted by the server")); }); }
handleResponse(response, options, cancellationToken, resolve, reject, redirectCount, requestProcessor) {
if (debug.enabled) {
debug(`Response: ${response.statusCode} ${response.statusMessage}, request options: ${safeStringifyJson(options)}`);
} // we handle any other >= 400 error on request end (read detailed message in the response body)
if (response.statusCode === 404) {
// error is clear, we don't need to read detailed error description
reject(createHttpError(response, `method: ${options.method || "GET"} url: ${options.protocol || "https:"}//${options.hostname}${options.port ? `:${options.port}` : ""}${options.path}
Please double check that your authentication token is correct. Due to security reasons actual status maybe not reported, but 404. `));
return;
} else if (response.statusCode === 204) { // on DELETE request
resolve();
return;
}
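// Note: redirects taken below are rewritten by HttpExecutor.prepareRedirectUrlOptions (further down
// in this file), which is where the Authorization-header/S3 fix applies.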
const redirectUrl = safeGetHeader(response, "location");
if (redirectUrl != null) {
if (redirectCount > this.maxRedirects) {
reject(this.createMaxRedirectError());
return;
}
this.doApiRequest(HttpExecutor.prepareRedirectUrlOptions(redirectUrl, options), cancellationToken, requestProcessor, redirectCount).then(resolve).catch(reject);
return;
}
response.setEncoding("utf8");
let data = "";
response.on("error", reject);
response.on("data", chunk => data += chunk);
response.on("end", () => {
try {
if (response.statusCode != null && response.statusCode >= 400) {
const contentType = safeGetHeader(response, "content-type");
const isJson = contentType != null && (Array.isArray(contentType) ? contentType.find(it => it.includes("json")) != null : contentType.includes("json"));
reject(createHttpError(response, isJson ? JSON.parse(data) : data));
} else {
resolve(data.length === 0 ? null : data);
}
} catch (e) {
reject(e);
}
});
}
async downloadToBuffer(url, options) { return await options.cancellationToken.createPromise((resolve, reject, onCancel) => { let result = null; const requestOptions = { headers: options.headers || undefined,
// because PrivateGitHubProvider requires HttpExecutor.prepareRedirectUrlOptions logic, so, we need to redirect manually
redirect: "manual" }; configureRequestUrl(url, requestOptions); configureRequestOptions(requestOptions); this.doDownload(requestOptions, { destination: null, options, onCancel, callback: error => { if (error == null) { resolve(result); } else { reject(error); } }, responseHandler: (response, callback) => { const contentLength = safeGetHeader(response, "content-length"); let position = -1;
if (contentLength != null) {
const size = parseInt(contentLength, 10);
if (size > 0) {
if (size > 52428800) {
callback(new Error("Maximum allowed size is 50 MB"));
return;
}
result = Buffer.alloc(size);
position = 0;
}
}
response.on("data", chunk => {
if (position !== -1) {
chunk.copy(result, position);
position += chunk.length;
} else if (result == null) {
result = chunk;
} else {
if (result.length > 52428800) {
callback(new Error("Maximum allowed size is 50 MB"));
return;
}
result = Buffer.concat([result, chunk]);
}
});
response.on("end", () => {
if (result != null && position !== -1 && position !== result.length) {
callback(new Error(`Received data length ${position} is not equal to expected ${result.length}`));
} else {
callback(null);
}
});
}
}, 0);
});
}
doDownload(requestOptions, options, redirectCount) {
const request = this.createRequest(requestOptions, response => {
if (response.statusCode >= 400) {
options.callback(new Error(`Cannot download "${requestOptions.protocol || "https:"}//${requestOptions.hostname}${requestOptions.path}", status ${response.statusCode}: ${response.statusMessage}`));
return;
} // It is possible for the response stream to fail, e.g. when a network is lost while
// response stream is in progress. Stop waiting and reject so consumer can catch the error.
response.on("error", options.callback); // this code not relevant for Electron (redirect event instead handled)
const redirectUrl = safeGetHeader(response, "location");
if (redirectUrl != null) {
if (redirectCount < this.maxRedirects) {
this.doDownload(HttpExecutor.prepareRedirectUrlOptions(redirectUrl, requestOptions), options, redirectCount++);
} else {
options.callback(this.createMaxRedirectError());
}
return;
}
if (options.responseHandler == null) {
configurePipes(options, response);
} else {
options.responseHandler(response, options.callback);
}
});
this.addErrorAndTimeoutHandlers(request, options.callback);
this.addRedirectHandlers(request, requestOptions, options.callback, redirectCount, requestOptions => {
this.doDownload(requestOptions, options, redirectCount++);
});
request.end();
}
createMaxRedirectError() {
return new Error(`Too many redirects (> ${this.maxRedirects})`);
}
addTimeOutHandler(request, callback) { request.on("socket", socket => { socket.setTimeout(60 * 1000, () => { request.abort(); callback(new Error("Request timed out")); }); }); }
static prepareRedirectUrlOptions(redirectUrl, options) { const newOptions = configureRequestOptionsFromUrl(redirectUrl, { ...options }); const headers = newOptions.headers;
if (headers != null && headers.authorization != null && headers.authorization.startsWith("token")) {
const parsedNewUrl = new (_url().URL)(redirectUrl);
// if (parsedNewUrl.hostname.endsWith(".amazonaws.com")) {
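// The actual fix: the pre-signed redirect URL already carries its credentials in the query
// string (X-Amz-Credential etc.), and S3 rejects requests that additionally send an
// Authorization header, so the GitHub token has to be dropped before following the redirect.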
if (parsedNewUrl.hostname.endsWith(".amazonaws.com") || parsedNewUrl.searchParams.has("X-Amz-Credential")) {
delete headers.authorization;
}
}
return newOptions;
}
}
exports.HttpExecutor = HttpExecutor;
function configureRequestOptionsFromUrl(url, options) { const result = configureRequestOptions(options); configureRequestUrl(new (_url().URL)(url), result); return result; }
function configureRequestUrl(url, options) { options.protocol = url.protocol; options.hostname = url.hostname;
if (url.port) { options.port = url.port; } else if (options.port) { delete options.port; }
options.path = url.pathname + url.search; }
class DigestTransform extends _stream().Transform { constructor(expected, algorithm = "sha512", encoding = "base64") { super(); this.expected = expected; this.algorithm = algorithm; this.encoding = encoding; this._actual = null; this.isValidateOnEnd = true; this.digester = (0, _crypto().createHash)(algorithm); } // noinspection JSUnusedGlobalSymbols
get actual() { return this._actual; } // noinspection JSUnusedGlobalSymbols
_transform(chunk, encoding, callback) { this.digester.update(chunk); callback(null, chunk); } // noinspection JSUnusedGlobalSymbols
_flush(callback) { this._actual = this.digester.digest(this.encoding);
if (this.isValidateOnEnd) {
try {
this.validate();
} catch (e) {
callback(e);
return;
}
}
callback(null);
}
validate() { if (this._actual == null) { throw (0, _index().newError)("Not finished yet", "ERR_STREAM_NOT_FINISHED"); }
if (this._actual !== this.expected) {
throw (0, _index().newError)(`${this.algorithm} checksum mismatch, expected ${this.expected}, got ${this._actual}`, "ERR_CHECKSUM_MISMATCH");
}
return null;
}
}
exports.DigestTransform = DigestTransform;
function checkSha2(sha2Header, sha2, callback) {
if (sha2Header != null && sha2 != null && sha2Header !== sha2) {
callback(new Error(`checksum mismatch: expected ${sha2} but got ${sha2Header} (X-Checksum-Sha2 header)`));
return false;
}
return true; }
function safeGetHeader(response, headerKey) { const value = response.headers[headerKey];
if (value == null) { return null; } else if (Array.isArray(value)) { // electron API
return value.length === 0 ? null : value[value.length - 1];
} else { return value; } }
function configurePipes(options, response) { if (!checkSha2(safeGetHeader(response, "X-Checksum-Sha2"), options.options.sha2, options.callback)) { return; }
const streams = [];
if (options.options.onProgress != null) { const contentLength = safeGetHeader(response, "content-length");
if (contentLength != null) {
streams.push(new (_ProgressCallbackTransform().ProgressCallbackTransform)(parseInt(contentLength, 10), options.options.cancellationToken, options.options.onProgress));
}
}
const sha512 = options.options.sha512;
if (sha512 != null) { streams.push(new DigestTransform(sha512, "sha512", sha512.length === 128 && !sha512.includes("+") && !sha512.includes("Z") && !sha512.includes("=") ? "hex" : "base64")); } else if (options.options.sha2 != null) { streams.push(new DigestTransform(options.options.sha2, "sha256", "hex")); }
const fileOut = (0, _fs.createWriteStream)(options.destination); streams.push(fileOut); let lastStream = response;
for (const stream of streams) { stream.on("error", error => { if (!options.options.cancellationToken.cancelled) { options.callback(error); } }); lastStream = lastStream.pipe(stream); }
fileOut.on("finish", () => { fileOut.close(options.callback); }); }
function configureRequestOptions(options, token, method) { if (method != null) { options.method = method; }
options.headers = { ...options.headers }; const headers = options.headers;
if (token != null) {
headers.authorization = token.startsWith("Basic") ? token : `token ${token}`;
}
if (headers["User-Agent"] == null) { headers["User-Agent"] = "electron-builder"; }
if (method == null || method === "GET" || headers["Cache-Control"] == null) { headers["Cache-Control"] = "no-cache"; } // do not specify for node (in any case we use https module)
if (options.protocol == null && process.versions.electron != null) { options.protocol = "https:"; }
return options; }
function safeStringifyJson(data, skippedNames) {
return JSON.stringify(data, (name, value) => {
if (name.endsWith("authorization") || name.endsWith("Password") || name.endsWith("PASSWORD") || name.endsWith("Token") || name.includes("password") || name.includes("token") || skippedNames != null && skippedNames.has(name)) {
return "
return value;
}, 2); } // __ts-babel@6.0.4
//# sourceMappingURL=httpExecutor.js.map
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
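For anyone who does not want to diff the whole file: the only functional change is inside prepareRedirectUrlOptions, where the Authorization header is now dropped whenever the redirect target carries signed query-string credentials (X-Amz-Credential), not only when its hostname ends in .amazonaws.com. A standalone sketch of that guard is below; the function name and shape are illustrative, not part of the library's API.

// Illustrative standalone version of the redirect guard patched above (not the library's API).
const { URL } = require("url");

function stripTokenOnSignedRedirect(redirectUrl, headers) {
  // Only GitHub-style "token <...>" auth is affected; Basic auth is left untouched.
  if (headers != null && typeof headers.authorization === "string" && headers.authorization.startsWith("token")) {
    const parsed = new URL(redirectUrl);
    // The redirect target is already authenticated via its signed query string, so also sending
    // the GitHub token makes S3 answer 400 "Only one auth mechanism allowed".
    if (parsed.hostname.endsWith(".amazonaws.com") || parsed.searchParams.has("X-Amz-Credential")) {
      delete headers.authorization;
    }
  }
  return headers;
}

// The header is removed for a pre-signed S3 URL...
console.log(stripTokenOnSignedRedirect("https://github-releases.example.s3.amazonaws.com/asset?X-Amz-Credential=abc", { authorization: "token ghp_example" }));
// ...but kept for an ordinary redirect.
console.log(stripTokenOnSignedRedirect("https://example.com/download", { authorization: "token ghp_example" }));

Note that patching the file only helps builds that ship it; installs already in the field keep hitting the error until the GitHub-side fix mentioned further down in this thread.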
Currently experiencing this issue.
I spent a lot of time trying to figure out what I had done wrong, since I had auto update working fine a few weeks ago. Hopefully a proper fix will be available soon; I don't mind asking current users to update manually for the moment.
Experiencing same issue (public GitHub repo):
So the fix has been successfully merged. I was wondering how I can obtain it now: is it just a matter of running an npm update, or do we have to wait for the next release?
We need a new pre-release cut. @develar can you please assist here? 🙂 Not sure what the steps are for publishing a package yet.
how can we upgrade to this pre-release?
https://github.com/electron-userland/electron-builder/releases/tag/v22.10.5
> how can we upgrade to this pre-release?
> https://github.com/electron-userland/electron-builder/releases/tag/v22.10.5
You can run the following command:
npm update electron-builder@22.10.5
This should move your dependency to the pre-release.
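If you want to confirm which updater build actually ends up inside the packaged app, here is a quick sanity check from the app's main process (a sketch; the require may need adjusting depending on how your bundler handles package.json files):

// Logs the bundled electron-updater version, e.g. on app startup.
console.log("electron-updater version:", require("electron-updater/package.json").version);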
Great news! I talked to GitHub support, and they made a server-side change so that applications using the old version of electron-updater can successfully check for updates again!
I've only tested this for an application updating from a private GitHub repository, so someone else will have to see if it's fixed for public repositories too.
There is no more problem. Try the update check again.
Yes, it works now, as @NoahAndrews said.
@amiranagram That's an entirely different error message which looks to be unrelated to this issue. You should open a new issue.
@NoahAndrews Damn I had a few tabs open of issues, and I commented on a wrong one, my bad!
This issue has resurfaced in a different form.
Electron auto update using electron-updater had been working great so far. I am using a private GitHub repo and haven't had issues with the Mac downloader. Starting yesterday, I began seeing the following error in my logs:
[2021-12-01 08:17:15.113] [info] Checking for update
[2021-12-01 08:17:15.655] [info] Found version 0.3.9 (url: myapp-Setup.zip, myapp-Setup.dmg)
[2021-12-01 08:17:15.655] [info] Downloading update from myapp-Setup.zip, myapp-Setup.dmg
[2021-12-01 08:17:22.601] [info] New version 0.3.9 has been downloaded to /Users/daniel/Library/Application Support/Caches/flout-desktop-updater/pending/update.zip
[2021-12-01 08:17:22.634] [info] / requested
[2021-12-01 08:17:22.637] [info] /1638364642604-7949.zip requested
[2021-12-01 08:17:22.638] [info] /1638364642604-7949.zip requested by Squirrel.Mac, pipe /Users/daniel/Library/Application Support/Caches/flout-desktop-updater/pending/update.zip
[2021-12-01 08:17:27.803] [info] Proxy server for native Squirrel.Mac is closed (was started to download https://api.github.com/repos/myapp-dev/flout-desktop-releases/releases/assets/50638910)
[2021-12-03 09:57:32.252] [info] Checking for update
[2021-12-03 09:57:33.324] [error] Error: HttpError: 400
"<Error><Code>AccessDenied</Code><Message>Multiple auth mechanisms are not allowed; please use either query parameters or an Authorization header</Message><Resource>github-production-release-asset-2e65be</Resource></Error>"
Headers: {
"content-type": "text/plain; charset=utf-8",
"x-github-backend": "Kubernetes",
"x-github-request-id": "29CA:36AD:318815:1832534:61AA305D",
"accept-ranges": "bytes",
"date": "Fri, 03 Dec 2021 14:57:33 GMT",
"via": "1.1 varnish",
"x-served-by": "cache-fty21343-FTY",
"x-cache": "MISS",
"x-cache-hits": "0",
"x-timer": "S1638543453.193440,VS0,VE17",
"content-length": "221"
}
at d (/Applications/myapp.app/Contents/Resources/app.asar/background.js:2:485367)
at IncomingMessage.<anonymous> (/Applications/myapp.app/Contents/Resources/app.asar/background.js:2:487699)
at IncomingMessage.emit (node:events:394:28)
at endReadableNT (node:internal/streams/readable:1343:12)
at processTicksAndRejections (node:internal/process/task_queues:83:21)
As you can see, it downloaded version 0.3.9 successfully but has trouble downloading 0.3.10. This person seems to have found the solution for this issue. It might have to be replicated in electron-updater: https://github.com/octokit/rest.js/issues/6#issuecomment-985420522
As @Daniel-Sudhindaran also mentioned, this issue has indeed resurfaced. It happens again on Windows and on Mac.
By making the change mentioned here https://github.com/electron-userland/electron-builder/issues/5595#issuecomment-770019745 it works again. But this is not a viable solution, as all of our customers have the older version installed and can't update to the new one that would contain this change.
I guess it's something related to GitHub releases; they probably changed something, which is why this broke again. As @NoahAndrews mentioned above https://github.com/electron-userland/electron-builder/issues/5595#issuecomment-781444995 he had spoken to GitHub support and the problem was solved back in February.
I guess we should contact them again.
@Daniel-Sudhindaran I contacted GitHub support and they provided a fix on their server side for this. The issue seems to be fixed now. I checked it on Windows and updates work again.
Electron Builder Version: 22.9.1
Electron Version: 9.1.0
Electron Type (current, beta, nightly): current
Electron Updater Version: 4.3.5
Target: mac - dmg and zip
Since yesterday, I've not been able to use the updater on my app. Anytime the app checks for an update, I get the below error message.
I'm aware something like this has happened before, per #1370; however, this suddenly popped up on both old and new versions of Electron Builder/Updater.
I was using Electron Updater v4.2.2 and Electron Builder v22.7.0 before updating to the latest versions listed above, but the issue is the same regardless.
I haven't changed anything about my config for weeks, so struggling to figure out what the issue might be. Any help would be appreciated!
ERROR MESSAGE
package.json config: