ivan-hc / AM

AppImage package manager to install, update (for real) and manage ALL of them (system-wide or locally) thanks to its ever-growing AUR-inspired database listing 2000+ portable apps and programs for GNU/Linux. The first, real centralized repository to manage your AppImages with the ease of APT and the power of PacMan.
https://portable-linux-apps.github.io
GNU General Public License v3.0
445 stars · 32 forks

Please consider implementing download resume: continue instead of restarting from scratch #828

Closed vitaly-zdanevich closed 1 month ago

vitaly-zdanevich commented 1 month ago
Steam-1.0.0.79-2-1-x86_64.AppImage                23%[========================>                                                                                 ] 140.00M   855KB/s    in 9m 44s  

2024-08-09 23:40:21 (245 KB/s) - Connection closed at byte 146800640. Retrying.

and after installing the same Steam again, it starts from 0.

For example, the Gentoo package manager Portage continues the download from 23% in such a case.

vitaly-zdanevich commented 1 month ago

...and/or add more retries?

vitaly-zdanevich commented 1 month ago

Hmmm....

# am -i steam --debug
############################################################################
##                                                                        ##
##                  START OF ALL INSTALLATION PROCESSES                   ##
##                                                                        ##
############################################################################

◆ "STEAM": starting installation script

--2024-08-09 23:42:27--  https://github.com/ivan-hc/Steam-appimage/releases/download/continuous/Steam-1.0.0.79-2-1-x86_64.AppImage
Resolving github.com... 140.82.121.3
Connecting to github.com|140.82.121.3|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://objects.githubusercontent.com/github-production-release-asset-2e65be/556393966/63eed0e1-795e-41d3-9d34-13df9ff580c1?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=releaseassetproduction%2F20240809%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240809T194228Z&X-Amz-Expires=300&X-Amz-Signature=fe69795a095f59c606f26d08d219d015e7a164ca04b96d377cab1a4df3d89e7d&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=556393966&response-content-disposition=attachment%3B%20filename%3DSteam-1.0.0.79-2-1-x86_64.AppImage&response-content-type=application%2Foctet-stream [following]
--2024-08-09 23:42:28--  https://objects.githubusercontent.com/github-production-release-asset-2e65be/556393966/63eed0e1-795e-41d3-9d34-13df9ff580c1?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=releaseassetproduction%2F20240809%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240809T194228Z&X-Amz-Expires=300&X-Amz-Signature=fe69795a095f59c606f26d08d219d015e7a164ca04b96d377cab1a4df3d89e7d&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=556393966&response-content-disposition=attachment%3B%20filename%3DSteam-1.0.0.79-2-1-x86_64.AppImage&response-content-type=application%2Foctet-stream
Resolving objects.githubusercontent.com... 185.199.110.133, 185.199.108.133, 185.199.111.133, ...
Connecting to objects.githubusercontent.com|185.199.110.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 619250008 (591M) [application/octet-stream]
Saving to: ‘Steam-1.0.0.79-2-1-x86_64.AppImage’

Steam-1.0.0.79-2-1-x86_64.AppImage                49%[===================================================>                                                      ] 290.00M   807KB/s    in 9m 32s  

2024-08-09 23:52:00 (519 KB/s) - Connection closed at byte 304087040. Retrying.

--2024-08-09 23:52:01--  (try: 2)  https://objects.githubusercontent.com/github-production-release-asset-2e65be/556393966/63eed0e1-795e-41d3-9d34-13df9ff580c1?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=releaseassetproduction%2F20240809%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240809T194228Z&X-Amz-Expires=300&X-Amz-Signature=fe69795a095f59c606f26d08d219d015e7a164ca04b96d377cab1a4df3d89e7d&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=556393966&response-content-disposition=attachment%3B%20filename%3DSteam-1.0.0.79-2-1-x86_64.AppImage&response-content-type=application%2Foctet-stream
Connecting to objects.githubusercontent.com|185.199.110.133|:443... connected.
HTTP request sent, awaiting response... 401 Unauthorized

Username/Password Authentication Failed.

 💀 ERROR DURING INSTALLATION, REMOVED "STEAM"!                           
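A likely reason for that 401: wget retried the signed objects.githubusercontent.com URL, and its X-Amz-Expires=300 parameter means the signature is only valid for 300 seconds, while the first attempt alone took over nine minutes. A small sketch extracting that parameter from a signed URL (the URL below is shortened and made up):

    # Shortened signed URL like the one in the log above:
    url='https://objects.githubusercontent.com/file?X-Amz-Date=20240809T194228Z&X-Amz-Expires=300&X-Amz-Signature=abc'

    # Pull out the X-Amz-Expires value:
    expires=$(printf '%s' "$url" | sed -n 's/.*X-Amz-Expires=\([0-9]*\).*/\1/p')
    echo "signed URL is valid for $expires seconds"

So a retry that reuses the redirect target instead of re-requesting the github.com release URL will fail once the signature expires.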
vitaly-zdanevich commented 1 month ago

...and after restarting the command, it starts downloading again from zero.

vitaly-zdanevich commented 1 month ago

Maybe the connection closed because I am in a cafe right now and my network is slow.

ivan-hc commented 1 month ago

to resume downloads, we would need to have that feature in wget itself

vitaly-zdanevich commented 1 month ago

...again and again - I cannot install steam :(

vitaly-zdanevich commented 1 month ago

to resume downloads, we would need to have that feature in wget itself

Maybe it is possible? Or with curl?

https://unix.stackexchange.com/questions/165875/resume-failed-download-using-linux-command-line-tool

vitaly-zdanevich commented 1 month ago

...from man wget:

-c, --continue
Continue getting a partially-downloaded file.  This is useful when you want to finish up a download started by a previous instance of Wget
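The offset arithmetic behind -c can be sketched offline like this (filenames are made up; a real resume would send a Range header asking the server for bytes starting at the size of the partial local file):

    #!/bin/sh
    # Offline sketch of what `wget -c` does: resume from the size of the partial file.
    set -e
    tmp=$(mktemp -d)
    printf 'ABCDEFGHIJ' > "$tmp/remote.bin"        # stand-in for the remote file (10 bytes)
    head -c 4 "$tmp/remote.bin" > "$tmp/local.bin" # a download interrupted after 4 bytes

    offset=$(wc -c < "$tmp/local.bin")             # wget would send: Range: bytes=4-
    # Fetch only the missing tail and append it, as a resumed download would:
    tail -c +"$((offset + 1))" "$tmp/remote.bin" >> "$tmp/local.bin"

    cmp -s "$tmp/remote.bin" "$tmp/local.bin" && result="resumed OK"
    echo "$result"
    rm -rf "$tmp"

curl has the same capability via `curl -C - -O <url>`, where `-C -` tells it to work out the resume offset from the existing file.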
ivan-hc commented 1 month ago

...again and again - I cannot install steam :(

are you sure you have not reached the GitHub API limit?

EDIT: if so, wait one hour at least

ivan-hc commented 1 month ago

I'm reading the full documentation, for wget2 instead (you know, Fedora has replaced wget with a symlink to wget2)

   -c, --continue
       Continue getting a partially-downloaded file.  This is useful  when  you
       want to finish up a download started by a previous instance of Wget2, or
       by another program.  For instance:

                wget2 -c https://example.com/tarball.gz

       If there is a file named tarball.gz in the current directory, Wget2 will
       assume that it is the first portion of the remote file, and will ask the
       server  to  continue the retrieval from an offset equal to the length of
       the local file.

       Note that you don’t need to specify this option if  you  just  want  the
       current  invocation of Wget2 to retry downloading a file should the con‐
       nection be lost midway through.  This is the default behavior.  -c  only
       affects  resumption  of  downloads  started  prior to this invocation of
       Wget2, and whose local files are still sitting around.

       Without -c, the previous example would just download the remote file  to
       tarball.gz.1, leaving the truncated tarball.gz file alone.

       If you use -c on a non-empty file, and it turns out that the server does
       not  support continued downloading, Wget2 will refuse to start the down‐
       load from scratch, which would effectively ruin existing  contents.   If
       you really want the download to start from scratch, remove the file.

       If you use -c on a file which is of equal size as the one on the server,
       Wget2 will refuse to download the file and print an explanatory message.
       The  same  happens  when  the file is smaller on the server than locally
       (presumably because it was changed on the server since your  last  down‐
       load  attempt).  Because “continuing” is not meaningful, no download oc‐
       curs.

       On the other side of the coin, while using -c, any file that’s bigger on
       the server than locally will be considered an  incomplete  download  and
       only  “(length(remote)  -  length(local))”  bytes will be downloaded and
       tacked onto the end of the local file.  This behavior can  be  desirable
       in  certain  cases.  For instance, you can use wget2 -c to download just
       the new portion that’s been appended to a data collection or log file.

       However, if the file is bigger on the server because it’s been  changed,
       as  opposed  to  just  appended  to,  you’ll end up with a garbled file.
       Wget2 has no way of verifying that the local file is really a valid pre‐
       fix of the remote file.  You need to be especially careful of this  when
       using  -c in conjunction with -r, since every file will be considered as
       an “incomplete download” candidate.

       Another instance where you’ll get a garbled file if you try to use -c is
       if you have a lame HTTP proxy  that  inserts  a  “transfer  interrupted”
       string  into  the  local file.  In the future a “rollback” option may be
       added to deal with this case.

       Note that -c only works with HTTP servers that support the “Range” head‐
       er.
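The decision rules quoted above boil down to a three-way size comparison. A sketch condensing them into a function (no network involved; the example sizes are the ones from the interrupted Steam download in the log):

    # Sketch of the wget2 -c decision rules quoted above (sizes in bytes):
    resume_action() {
        local_size=$1; remote_size=$2
        if [ "$local_size" -eq "$remote_size" ]; then
            echo "refuse: file already complete"
        elif [ "$local_size" -gt "$remote_size" ]; then
            echo "refuse: local file larger than remote (changed on the server?)"
        else
            echo "fetch the last $((remote_size - local_size)) bytes and append them"
        fi
    }

    resume_action 304087040 619250008  # prints: fetch the last 315162968 bytes and append them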
ivan-hc commented 1 month ago

I have both wget and wget2 on Debian, so I can test the behaviour.

ivan-hc commented 1 month ago

@vitaly-zdanevich just curious, why should an interrupted download start from where it was interrupted?

in case of a previous broken installation, it will be removed and then everything will be re-installed.

broken installations are detected, for example, by a "tmp" directory still being present in the application directory; this means that the previous installation was not concluded.

in case something fails during the installation (for example, the download), the installation process will detect that the installation failed, and then the app will be removed.

this is why the first thing that is created is the "remove" script.

so, I don't think we need to add the -c option for wget
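The detection described above could look something like this (a sketch only; the directory layout here is hypothetical, not AM's real one):

    # Sketch: flag a broken install by a leftover "tmp" directory.
    APPDIR=$(mktemp -d)/steam          # hypothetical application directory
    mkdir -p "$APPDIR/tmp"             # simulate an interrupted installation

    if [ -d "$APPDIR/tmp" ]; then
        status="broken: previous installation was not concluded"
    else
        status="ok"
    fi
    echo "$status"
    rm -rf "${APPDIR%/steam}"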

vitaly-zdanevich commented 1 month ago

@vitaly-zdanevich just curious, why should an interrupted download start from where it was interrupted?

It is the default behaviour on Gentoo Linux, and this is good: resuming downloads means faster installs when there are connectivity problems.

Maybe treat a broken installation and an unfinished download as separate cases?

Yesterday I was unable to install anything from am - on slow Wi-Fi.

ivan-hc commented 1 month ago

@vitaly-zdanevich just curious, why should an interrupted download start from where it was interrupted?

It is the default behaviour on Gentoo Linux, and this is good: resuming downloads means faster installs when there are connectivity problems.

Maybe treat a broken installation and an unfinished download as separate cases?

Yesterday I was unable to install anything from am - on slow Wi-Fi.

also, it all depends on how often you call the GitHub API.

GitHub API calls can be made no more than 50 times per hour.

Once you reach that limit, you need to wait at least one hour...

...or use a VPN or torsocks to bypass that limit.

ivan-hc commented 1 month ago

up

ivan-hc commented 1 month ago

@vitaly-zdanevich any answer?

vitaly-zdanevich commented 1 month ago

What answer?

ivan-hc commented 1 month ago

What answer?

was your issue caused by downloading too many things from GitHub in one hour?

vitaly-zdanevich commented 1 month ago

No...

I think maybe a GitHub download session is time-limited?

ivan-hc commented 1 month ago

yes, not more than 50 per hour

ivan-hc commented 1 month ago

https://docs.github.com/en/rest/using-the-rest-api/rate-limits-for-the-rest-api?apiVersion=2022-11-28

https://docs.github.com/en/rest/rate-limit/rate-limit?apiVersion=2022-11-28
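You can check how many calls you have left without spending one, since GitHub exposes a /rate_limit endpoint. Below is a sketch that parses a canned response (the numbers in the JSON are made up for illustration; a real check would be `curl -s https://api.github.com/rate_limit`):

    # Canned /rate_limit response, trimmed to the "core" section:
    json='{"resources":{"core":{"limit":60,"remaining":0,"reset":1723236000}}}'

    # Extract "remaining" without jq, using only sed:
    remaining=$(printf '%s' "$json" | sed -n 's/.*"remaining":\([0-9]*\).*/\1/p')

    if [ "$remaining" -eq 0 ]; then
        echo "rate limited: wait for the reset"
    else
        echo "$remaining API calls left this hour"
    fi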

ivan-hc commented 1 month ago

this function should fix this issue

function _install_common_patch() {
    if ! grep -q -- "--debug" "$AMCACHEDIR"/install-args; then
        # Patch "wget" (version 1.x) to resume partial downloads (-c) and show a progress bar
        if wget --version | head -1 | grep -q ' 1\.'; then
            sed -i "s#wget #wget -cq --no-verbose --show-progress --progress=bar #g" ./"$arg"
        else
            # wget2: only add the "continue" flag
            sed -i "s#wget #wget -c #g" ./"$arg"
        fi
    fi
    # Prepend the authentication header (GitHub PAT) to API calls
    sed -i "s# https://api.github.com#$HeaderAuthWithGITPAT https://api.github.com#g" ./"$arg"
}

so this will work with both wget and wget2

however, I'd like to know more about the advantages of this over normal usage.
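To see what that sed substitution actually does to an installation script, here is a sketch applying the same wget 1.x rewrite to a made-up one-line script:

    # Made-up line of an installation script that the function would patch:
    script=$(mktemp)
    echo 'wget https://example.com/App.AppImage' > "$script"

    # The same substitution _install_common_patch applies for wget 1.x:
    sed -i 's#wget #wget -cq --no-verbose --show-progress --progress=bar #g' "$script"

    patched=$(cat "$script")
    echo "$patched"
    rm -f "$script"

The patched line becomes `wget -cq --no-verbose --show-progress --progress=bar https://example.com/App.AppImage`, so every wget call in the script picks up the continue flag without the script author having to change anything.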

ivan-hc commented 1 month ago

in fact I did a few tests, and even after removing the command that checks whether a "tmp" directory is still in place, the installation still refuses to go on.

the option is good, but for standalone usage of wget, in case you want to download a big file.

But the way AM is written, if for example you want to install "0ad" and the installation is interrupted:

in all cases, the use of -c can't be applied in AM, since it relies only on how far the installation got

I did a couple of tests yesterday, and other, deeper ones have been done today

it is a total rewrite of some essential functions

and note that the "tmp" directory is not just for downloading a file, big or small, but also for extracting archives, building an app on the fly... and AM must take all use cases into account. So the only solution adopted until now is that the app must be downloaded completely.

I'll keep this open; @Samueru-sama is working on other things on the same module right now.