danoli3 closed this 3 months ago
Usually I have to delete the downloads folder so the libs install correctly. Yesterday I couldn't install the ones in the ./scripts/macos folder. Should we name it _downloads?
That's a problem with wget2; I have yet to file an upstream report. wget2 is now buffered, and with cURL support that should fix the issues. I would just use cURL, though; it runs downloads in parallel now.
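For reference, a minimal sketch of what a parallel cURL invocation looks like (cURL 7.66+ supports `--parallel`; the function name and `DRY_RUN` switch are illustrative, not part of the openFrameworks scripts):

```shell
#!/bin/sh
# Sketch: fetch several archives in one parallel cURL call.
# Pass the usual per-URL options, e.g. -O <url> -O <url> ...
download_parallel() {
  # -f: fail on HTTP errors, -L: follow redirects
  set -- curl --parallel --parallel-max 4 -fL "$@"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$@"          # print the command instead of running it
  else
    "$@"
  fi
}

# Usage (placeholder URLs):
# download_parallel -O https://example.com/a.tar.bz2 -O https://example.com/b.tar.bz2
```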
I think it is a caching issue. If I run the same script again it returns this and stops:
```
openFrameworks download_libs.sh v2.5.0
-----
[openFrameworks downloader v4.2.7] ...
curl: (2) no URL specified
curl: try 'curl --help' or 'curl --manual' for more information
[downloader] enabled brotli/zlib losslesss compression response
[downloader] check if local == remote: [https://github.com/openframeworks/apothecary/releases/download/bleeding/openFrameworksLibs_bleeding_macos_1.tar.bz2]
[downloader] Found download cache.
[cache] [openFrameworksLibs_bleeding_macos_1.tar.bz2]
[downloader] Remote size:[126MB] | Local size:[126MB]
```
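The `curl: (2) no URL specified` line in the log is what cURL emits when it is invoked with an empty URL argument, typically an unset variable. A small sketch of a guard that would avoid the stray error (function and variable names are hypothetical, not from download_libs.sh):

```shell
#!/bin/sh
# Sketch: bail out early if the URL variable is empty, instead of
# letting cURL fail with "curl: (2) no URL specified".
check_remote() {
  REMOTE_URL="$1"
  if [ -z "$REMOTE_URL" ]; then
    echo "[downloader] no URL to check, skipping" >&2
    return 1
  fi
  # -s: silent, -I: headers only, -L: follow redirects
  curl -sIL "$REMOTE_URL"
}
```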
What's your system config? Is wget2 installed? Is curl installed?
Can you open dev/downloader.sh and set VERBOSE=1?
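As a sketch of what such a flag typically gates (the `log` helper is hypothetical; the real dev/downloader.sh may differ):

```shell
#!/bin/sh
# Sketch: a VERBOSE switch that gates debug output and shell tracing.
VERBOSE="${VERBOSE:-0}"

log() {
  # Only print when verbose logging is on.
  if [ "$VERBOSE" = "1" ]; then
    echo "[downloader] $*"
  fi
}

# Optionally enable shell tracing for full command-by-command output.
if [ "$VERBOSE" = "1" ]; then
  set -x
fi
```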
Fixing that cURL check (the one that runs even when you have wget2, and whose URL is failing) should resolve the cache issue. But from the log it looks like the downloader thinks the local and remote files are the same and won't re-download them... since the size compare is down to the byte, it's a solid check for a new release.
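A sketch of that kind of cache check: compare the remote Content-Length against the local file size and skip the download only when the sizes match exactly (names are illustrative, not the actual download_libs.sh implementation):

```shell
#!/bin/sh
# Sketch: size-based cache check. Returns 0 (match) only when the
# remote Content-Length equals the local file size in bytes.
sizes_match() {
  url="$1"
  local_file="$2"
  [ -f "$local_file" ] || return 1
  # Take the last Content-Length header (redirects may emit several).
  remote_size=$(curl -sIL "$url" | tr -d '\r' \
    | awk 'tolower($1)=="content-length:" {size=$2} END {print size}')
  # tr strips the padding BSD wc adds on macOS.
  local_size=$(wc -c < "$local_file" | tr -d ' ')
  [ -n "$remote_size" ] && [ "$remote_size" = "$local_size" ]
}
```

Note that a size match can still miss a changed file of identical length; a checksum compare would be stricter if that turns out to be the failure mode here.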
Was this issue happening previously? It might be the connection close for wget2. I'll disable that and set a timer for wget2 instead. It will still close the port once the cURL header checks are done, however.
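The timer approach could look roughly like this: bound the transfer with explicit timeout and retry flags rather than relying on connection-close behaviour. The long options follow wget's, which wget2 aims to stay compatible with; treat the exact flags and values as assumptions:

```shell
#!/bin/sh
# Sketch: bound a wget2 transfer with a timeout and retry limit
# instead of forcing the connection closed.
fetch_with_timeout() {
  set -- wget2 --timeout=30 --tries=3 "$1"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$@"          # print the command instead of running it
  else
    "$@"
  fi
}
```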
I'm still noticing some issues with wget2 on Windows: in a few cases the JSON headers are not extracted because the zip file doesn't complete. It's inconsistent.
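One way to catch that symptom before extraction is to test the archive for integrity first; `unzip -t` and `tar -t` both fail on truncated input. A hedged sketch (helper name is hypothetical):

```shell
#!/bin/sh
# Sketch: verify an archive is complete before trying to extract it,
# so a truncated download is caught instead of producing missing files.
archive_ok() {
  case "$1" in
    *.zip)     unzip -tqq "$1" >/dev/null 2>&1 ;;  # test zip integrity
    *.tar.bz2) tar -tjf "$1" >/dev/null 2>&1 ;;    # list (and decode) tarball
    *)         return 2 ;;                          # unknown archive type
  esac
}
```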