m-ab-s / media-autobuild_suite

This Windows Batchscript helps set up a MinGW-w64 compiler environment for building ffmpeg and other media tools under Windows.
GNU General Public License v3.0

Request: Download first, compile after #2000

Open skycommand opened 3 years ago

skycommand commented 3 years ago

Hi. 😀

Before I begin, I'd like to thank you for your script. It made things a lot easier. 🙏

Now, while the script is very useful, it takes hours to run. During this time, it goes over a list of remote repos: it pulls the first item, compiles it, then repeats the same for all subsequent items. The pulling takes a few minutes in total; compilation takes 3 to 5 hours. A simple blip on the Internet breaks the whole process. On many occasions, I've left it to run overnight, only to wake up to an error message indicating a problem on the remote side that is not the fault of your script. Maybe the remote repo was in maintenance mode and the script tried to reach it during that small window.

But I think a small change in the script could make life easier; one that I myself can't make. What if the script pulled all the remote repos first, before beginning the compilation process? The user could run it, wait 15 minutes while the remote repos are pulled, then leave it unattended for the next five hours. If one of the pulls failed, the user could retry a few minutes later and salvage hours of work.
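A two-phase version of that loop might look roughly like this in Bash (`repos`, `pull_one`, and `build_one` are illustrative stand-ins, not the suite's real names):

```shell
#!/bin/bash
# Hypothetical sketch: split the per-repo "pull, then build" loop into two
# phases, so a network blip can only cost the cheap pulling phase.
# pull_one/build_one are stand-ins, not real suite functions.

repos=(ffmpeg x264 x265)

pull_one()  { echo "pulling $1";  }   # network-bound, ~minutes in total
build_one() { echo "building $1"; }   # CPU-bound, ~hours in total

# Phase 1: fetch everything up front; fail fast if any pull breaks.
for r in "${repos[@]}"; do
    pull_one "$r" || { echo "pull of $r failed; retry is cheap" >&2; exit 1; }
done

# Phase 2: compile with no network dependency from here on.
for r in "${repos[@]}"; do
    build_one "$r"
done
```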

1480c1 commented 3 years ago

I'm not sure how feasible this would be with our current setup, as we use the exit status of the downloading command to determine whether a rebuild is needed in the first place.

skycommand commented 3 years ago

If it were C#, I'd download, decide whether rebuilding is needed, and store the decision in a variable. Once all downloads were finished (I assume 15 minutes), I'd use that group of variables to rebuild as needed (which takes hours).

Looking at the Bash code, it seems the function in charge of downloading is do_vcs. Its exit status indicates whether rebuilding must occur. So, it would be nice if we could make all the do_vcs calls at the beginning, store their results in variables, and use those variables in the if statements that determine whether rebuilding is needed. But does Bash support variables?
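A minimal sketch of that idea, with a fake do_vcs standing in for the real one (the real function does far more; here it only signals "rebuild needed" via its exit status, and the repo names are illustrative):

```shell
#!/bin/bash
# Sketch: record each do_vcs result now, act on it later.
# Fake do_vcs: returns 0 (rebuild needed) only for ffmpeg.
do_vcs() {
    [[ $1 == ffmpeg ]]
}

declare -A needs_rebuild   # Bash 4+ associative array

# Phase 1: download everything, remembering each decision.
for repo in ffmpeg libass x264; do
    if do_vcs "$repo"; then
        needs_rebuild[$repo]=yes
    else
        needs_rebuild[$repo]=no
    fi
done

# Phase 2: hours later, consult the stored decisions.
for repo in "${!needs_rebuild[@]}"; do
    [[ ${needs_rebuild[$repo]} == yes ]] && echo "rebuilding $repo"
done
```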

garoto commented 3 years ago

But does Bash support variables?

bruh

skycommand commented 3 years ago

^ I take that as a "yes". 😊

GyanD commented 3 years ago

On somewhat similar lines, it would also be helpful if the script did not automatically abort upon failure to build an ffmpeg dependency.

How it could work is the following:

1) backup (move/rename) build artifacts from the previous build

2) if the current build fails, offer to a) abort (the only outcome at present), b) use the backed-up artifacts, or c) remove the dep from the ffmpeg configure options.

3) remove backups of updated artifacts
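The steps above could be sketched like this (build_dep, the dep name, and the local64 prefix are all hypothetical stand-ins; the sketch only exercises fallback option b):

```shell
#!/bin/bash
# Hypothetical backup/fallback flow for one dependency.
dep=libass
prefix=./local64            # illustrative install prefix
mkdir -p "$prefix/$dep" && echo old > "$prefix/$dep/lib.a"

# 1) back up (move/rename) artifacts from the previous build
mv "$prefix/$dep" "$prefix/$dep.bak"

build_dep() { false; }      # stand-in that simulates a failed build

if build_dep "$dep"; then
    # 3) build succeeded: the backup of the updated artifact can go
    rm -rf "$prefix/$dep.bak"
else
    # 2b) build failed: fall back to the backed-up artifacts
    echo "build of $dep failed; restoring previous artifacts" >&2
    mv "$prefix/$dep.bak" "$prefix/$dep"
fi
```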

skycommand commented 3 years ago

@GyanD Then maybe the vcpkg approach is more to your liking. It starts compiling a package (the whole thing that you want) by building a list of artifacts (dependencies and such) the package needs. If a valid artifact is already in place, it leaves that artifact alone; artifacts that are not in place are downloaded first, then built. Once all the artifacts are in place, it compiles the package.

With this approach, you can resume a failed package compilation with only a ten-second delay, as opposed to hours. Failures are also less likely, because there are fewer moving parts.

Of course, you'd eventually want to update the package. You issue an update command on your own schedule. It updates the artifacts, then rebuilds as needed.

GyanD commented 3 years ago

That takes care of one scenario - a dep can't be downloaded. But the most common scenario is not lack of access to the dep repo, but that an updated dep src can't be built or linked successfully with ffmpeg.

skycommand commented 3 years ago

Why? ("Can't be built" is a bit too broad. Can you think of possible reasons besides download?)

GyanD commented 3 years ago

Breaking change. Currently, libass is broken due to underspecified dependencies of its own in its .pc file.

This kind of breakage is a semi-regular occurrence.
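For illustration, this is the kind of underspecification meant here: with static linking, any dependency missing from a library's Requires.private/Libs.private surfaces as undefined references when linking ffmpeg. A hypothetical, corrected libass.pc fragment (not the shipped file; version and paths are made up):

```
# libass.pc (illustrative fragment)
Name: libass
Description: libass is an SSA/ASS subtitle rendering library
Version: 0.15.0
# With static linking, every private dependency must be listed,
# or consumers like ffmpeg fail at link time:
Requires.private: freetype2, fribidi, harfbuzz
Libs: -L${libdir} -lass
Cflags: -I${includedir}
```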

skycommand commented 3 years ago

Well, that's not a problem with vcpkg, because, like I said, once you successfully build an artifact, like libass, it remains available and can be used regardless of subsequent versions that break.

But I think you might still have a problem with this because you only publish static builds. Shared builds don't have this problem, because as long as the EXE can find the DLL and the function signatures match, the entire product works.

GyanD commented 3 years ago

So, it keeps older artifacts around?

skycommand commented 3 years ago

Yes.

It doesn't update or discard any artifact unless you say so. In addition, it has three caches: Downloads, Packages, and Buildtrees. You can reuse or delete all three if you are so inclined.
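Those three cache names correspond to real directories under a vcpkg checkout (downloads, buildtrees, packages). The sketch below just demonstrates, on a throwaway directory standing in for a vcpkg root, that all three can be deleted freely, since vcpkg regenerates them on demand:

```shell
#!/bin/bash
# Throwaway directory standing in for a real vcpkg root.
VCPKG_ROOT=./vcpkg-demo
mkdir -p "$VCPKG_ROOT"/{downloads,buildtrees,packages}

# All three caches are safe to delete; vcpkg recreates them as needed:
#   downloads/  - fetched source archives and tools
#   buildtrees/ - intermediate per-package build output
#   packages/   - staged per-package install trees
for cache in downloads buildtrees packages; do
    rm -rf "${VCPKG_ROOT:?}/$cache"
done
```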

I think vcpkg has a lot of inspiring features for this project.

wiiaboo commented 3 years ago

You could use Windows File History on the local32/local64 directories to keep a "backup" of previous versions of a library. In case of a failure to build, you can usually override the check for updated versions, at least for git/svn-using libraries.

skycommand commented 3 years ago

I have problems with the "override the check for updated versions" portion. I don't know how. Would you care to explain?

As for keeping a backup copy of local64, I have more efficient methods.

wiiaboo commented 3 years ago

https://github.com/m-ab-s/media-autobuild_suite#custom-patches

skycommand commented 3 years ago

(Shrugs) 🤷‍♀️