jimvandervoort closed this issue 11 months ago
Seeing the same issue; it happens on Mac and self-hosted runners.
@99 I have been unable to solve this issue; I ended up uploading my artifacts to an S3 bucket using the AWS CLI instead.
Actual error:
Error: Unexpected response. Unable to upload chunk to https://pipelines.actions.githubusercontent.com/TpHcyMMFQmnpS2gAIwipfQlCeRFQYvYjaFihb10o3AoBbT48sZ/_apis/resources/Containers/17554032?itemPath=.net-app%5Cwwwroot%5Cassets%5Cjs%5Cdemo.js
##### Begin Diagnostic HTTP information #####
Status Code: 400
Status Message: Bad Request
Header Information: {
"cache-control": "no-store,no-cache",
"pragma": "no-cache",
"transfer-encoding": "chunked",
"content-type": "application/json; charset=utf-8",
"strict-transport-security": "max-age=2592000",
"x-tfs-processid": "a1264ee8-9811-47fd-b621-14c5eb188cab",
"activityid": "d82c5424-94c2-4f7f-8462-467a55705438",
"x-tfs-session": "d82c5424-94c2-4f7f-8462-467a55705438",
"x-vss-e2eid": "d82c5424-94c2-4f7f-8462-467a55705438",
"x-vss-senderdeploymentid": "a07ab14e-025a-39c3-8d53-788cd7ce207f",
"x-frame-options": "SAMEORIGIN",
"x-cache": "CONFIG_NOCACHE",
"x-msedge-ref": "Ref A: 24740862B63345E297D117EC69354365 Ref B: BL2EDGE1817 Ref C: 2021-12-08T11:22:03Z",
"date": "Wed, 08 Dec 2021 11:22:02 GMT"
}
###### End Diagnostic HTTP information ######
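For reference, the S3 fallback mentioned above can look like the following workflow step. This is a sketch, not the commenter's exact setup: the bucket name, region, output directory, and secret names are all assumptions.

```yaml
# Hypothetical fallback step: upload build output to S3 with the AWS CLI
# instead of actions/upload-artifact.
- name: Upload build output to S3
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    AWS_DEFAULT_REGION: eu-west-1
  # Keyed by commit SHA so each run's artifacts land under their own prefix.
  run: aws s3 cp ./dist "s3://my-artifact-bucket/${GITHUB_SHA}/" --recursive
```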
Having the same problem on windows-latest.
I have the same error too, on windows-latest:
Error: Unexpected response. Unable to upload chunk to https://pipelines.actions.githubusercontent.com/GzWTsAc8Tfkk7NAzxC0CrvDFhIPAaUpAFRbZzar8Syjs8JLKSw/_apis/resources/Containers/3142291?itemPath=ASP-app%5CApp_Plugins%5CExercise.css
##### Begin Diagnostic HTTP information #####
Status Code: 400
Status Message: Bad Request
Header Information: {
"cache-control": "no-store,no-cache",
"pragma": "no-cache",
"transfer-encoding": "chunked",
"content-type": "application/json; charset=utf-8",
"strict-transport-security": "max-age=2592000",
"x-tfs-processid": "dbcd0f50-a18e-4b5a-9184-06101c692ab9",
"activityid": "32445705-393f-4869-91c5-b1a66e43f437",
"x-tfs-session": "32445705-393f-4869-91c5-b1a66e43f437",
"x-vss-e2eid": "32445705-393f-4869-91c5-b1a66e43f437",
"x-vss-senderdeploymentid": "6be79bb9-f6f7-24a7-0b27-e718f2ab4200",
"x-frame-options": "SAMEORIGIN",
"x-cache": "CONFIG_NOCACHE",
"x-msedge-ref": "Ref A: 7B60BF0CB4314C6497D704CAE0154C8F Ref B: PAOEDGE0616 Ref C: 2021-12-10T09:11:05Z",
"date": "Fri, 10 Dec 2021 09:11:05 GMT"
}
###### End Diagnostic HTTP information ######
Following the README's workaround for throttled requests, compressing the folder before uploading made it work for me, although the zip ended up nested (https://github.com/actions/upload-artifact/issues/39).
Perhaps the server sends us 400 errors in place of 429 for whatever reason. Maybe this info about my folder upload attempts can help someone who knows about the rate limits:
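A minimal sketch of the compress-before-upload workaround mentioned above (directory and file names are placeholders; the nested-archive caveat from issue #39 applies when extracting later):

```shell
# Stand-in for real build output:
mkdir -p dist
printf 'hello\n' > dist/a.txt

# Pack everything into one archive so upload-artifact sends a single
# file instead of one chunked request per file:
tar -czf artifact.tar.gz dist
```

The workflow would then pass `artifact.tar.gz` as the `path` to upload-artifact.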
This appears to be caused by empty files. See this workflow run:
https://github.com/skyrim-multiplayer/skymp/runs/4490126560?check_suite_focus=true
As can be seen from the job log, the action was failing, but after removing the empty file, the artifact upload succeeded.
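If empty files are indeed the trigger, one hedged workaround is to strip zero-byte files from the upload directory before calling upload-artifact (`dist` here is a placeholder; the first three lines only create demo files for illustration):

```shell
# Demo setup: one empty file and one real file.
mkdir -p dist
touch dist/empty.js                      # zero-byte file that trips the upload
printf 'console.log(1)\n' > dist/app.js

# Delete all zero-byte files before uploading:
find dist -type f -empty -delete
```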
Another example:
The log diff shows that both the runner image and this action were updated:
Update: looks like this is caused by the recent release. I reverted the action to 27121b0 (v2.2.4), and it stopped failing.
Run: https://github.com/skyrim-multiplayer/skymp/runs/4490672804?check_suite_focus=true
See PR: https://github.com/skyrim-multiplayer/skymp/pull/630
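The revert described above amounts to pinning the action to the pre-release commit, along these lines (artifact name and path are placeholders):

```yaml
# Pin upload-artifact to the v2.2.4 commit mentioned above:
- uses: actions/upload-artifact@27121b0
  with:
    name: my-artifact
    path: dist/
```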
Thanks @nic11! Reverting to v2.2.4 worked for me too.
Thanks @nic11! Great job, it helped me too.
Turns out there was a separate issue for this, though I googled this one :)
Fixed in #281
I had this problem, and the solution for me was to introduce a waiting period.
```yaml
- name: Setup the database within Docker
  run: |
    echo 'Starting the db'
    docker-compose -f ./docker_files/docker-compose_dbOnly.yml -p mynameapp up -d
- name: Sleep for 15 seconds
  run: sleep 15s
  shell: bash
- name: Unit tests - pytest
  env:
    TEST_POSTGRES: "postgresql://username:password@localhost:5432/test_temp"
    AUTO_TEST: "true"
  run: pytest
- name: Upload artifact for deployment job
  uses: actions/upload-artifact@v2
  with:
    name: python-app
    path: .
```
The code was correct and worked prior to the introduction of Docker. My guess is that a file was still locked during the compose step and not released quickly. This seems really odd to me, but it now works.
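A sketch of a more robust alternative to the fixed `sleep 15s` step: poll until the service actually accepts connections. The `wait_for` helper and the `WAIT_RETRIES` variable are my own assumptions, not part of the original workflow.

```shell
# Retry a probe command once per second until it succeeds or we time out.
wait_for() {
  local retries="${WAIT_RETRIES:-30}"
  until "$@" >/dev/null 2>&1; do
    retries=$((retries - 1))
    if [ "$retries" -le 0 ]; then
      echo "timed out waiting for: $*" >&2
      return 1
    fi
    sleep 1
  done
}
```

In the workflow above, the sleep step could then become something like `run: wait_for pg_isready -h localhost -p 5432` (assuming the Postgres client tools are installed on the runner).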
upload-artifact v4 was released today! I recommend switching over.
https://github.blog/changelog/2023-12-14-github-actions-artifacts-v4-is-now-generally-available/
v4 is a complete rewrite of the artifact actions with a new backend. v1-v3 uploads would sometimes hit 100% (or close to it) and then just stop and fail for mysterious reasons. There was also a small number of transient errors like the 500s and 400s you see here. v4 is more reliable and simpler all around, and a host of the issues described in this thread should no longer happen.
If there are any similar issues with v4, please open a new issue.
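Switching is a one-line change in most workflows (artifact name and path below are placeholders):

```yaml
- uses: actions/upload-artifact@v4
  with:
    name: my-artifact
    path: dist/
```

Note that in v4 artifact names must be unique within a run, which is worth checking for matrix jobs like the one described in this issue.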
**Describe the bug**
Our CI systems sometimes return HTTP 400 when using upload-artifact.
**Version**

**Environment**

**Screenshots**
**Run/Repo Url**
Tried to reproduce our setup in jimvandervoort/gh-actions-error. No luck so far.
**How to reproduce**
Not sure yet, but our setup looks like this:
We build multiple themes from our frontend. These themes may or may not produce some of the same files across builds (since some themes only add files like new image resources, while others modify the JS, causing a different webpack bundle to be created).
We build multiple themes in a matrix job.
Assets are all built under `public/theme-name/dist/js/app/.[hash].js`. The CI uploads all JS and CSS files into one folder, `dist/js/app.[hash].js`, removing the theme name from the path. The idea is that if two themes build the same file, only one copy is included (since the file's hash is in the filename).

**Additional context**
Error reported in run (I replaced the name of a specific directory with `some-theme` since the original name included the name of a private customer):