project64 / project64

N64 Emulator
http://www.pj64-emu.com/
GNU General Public License v2.0

Establish a buildbot for Project 64 #256

Closed Lithium64 closed 8 years ago

Lithium64 commented 9 years ago

Several other open-source emulators have a buildbot. I think it would help anyone who doesn't have a Visual Studio license or has no interest in installing it just to compile the emulator.

The web page below provides a buildbot for various projects http://buildbot.orphis.net/

"The server is running on a dedicated host and new projects could easily be built here. If you want your project to be built and distributed here, send me a mail at "orphis nospam free.fr" and we'll discuss it."

AmbientMalice commented 9 years ago

I like the idea of a PJ64 build bot, but I wish the emulator could be up to snuff before the PJ64 site starts hosting nightly builds. That said, it would be better than users downloading from EmuCR as they currently do.

cxd4 commented 9 years ago

I think it's even easier to just compile than to download EXE/ZIP files off a precompiled-builds website.

Easier to just git pull and Ctrl+Shift+B in Visual Studio to re-build any and all components that had any code changed. Saves way more time than downloading binaries I think. :P

project64 commented 9 years ago

I think if I ever set up a build bot, which would be nice, it would have to increase version numbers. I also would prefer the bins to be semi-private, like posting a new blog post on the beta forum.

I know anyone can compile the source, but you need a lot more knowledge for that than for just downloading a bin. I would want the public really just downloading the stable releases.

AmbientMalice commented 9 years ago

Without an official build bot, people who can't/won't compile but who want to use the latest code will use EmuCR's iffy builds instead. I know that handing normal users direct access to extremely WIP builds can cause a lot of suffering - I would NOT support official nightly builds for the public until after PJ64 2.2 is released, even if I'm supportive of a PJ64 buildbot.

cxd4 commented 9 years ago

I think the keyword is won't compile, rather than can't compile, but if VS2013 were only several KB rather than GB I don't think that would be very excusable. There are plenty of lightweight and better-optimizing compilers out there than Microsoft's anyway; it's just that currently things like ATL tie the build to MSVC Pro, so unfortunately that is required.

The time it takes to download an entire package of updated binaries is far more than the CPU time it takes to compile only the sources that were updated. It's simply masochistic to prefer downloading and extracting build-bot releases from your temporary internet cache every day, when you could be hitting Ctrl+Shift+B in VS to update all your binaries by compiling only the changes. One is clearly less time-consuming than the other.

AmbientMalice commented 9 years ago

Most people don't download a new binary release every single day, though.

cxd4 commented 9 years ago

A little pseudo-scientific conjecture of mine, but,

If doing X every day is faster than doing Y every day, then doing X once in a while is faster than doing Y once in a while.

Orphis commented 9 years ago

Let me know if you need a buildbot for project 64, that shouldn't be a problem at all :)

Lithium64 commented 9 years ago

@project64 The guy is here

cxd4 commented 9 years ago

You were not happy with git pull and Ctrl+Shift+B in VS2013 to build updated sources @Lithium64 ?

AmbientMalice commented 9 years ago

@cxd4 Build bots serve a number of valuable purposes.

1: Easier regression testing.

2: Normal Windows users don't compile code. Ever.

3: Also, PJ64 defaults to debug instead of release, at least for me. So it's a three-step process, not two. Plus the average user would struggle to find the compiled installer.

Downloading from github can also be very slow. I prefer to download zipped archives of the source, myself.

At the moment, the only way for normal, non-technical people to acquire the latest build of Project 64 is off EmuCR.

The question is akin to "Is it really too hard to put water, flour, and yeast in a bread maker then wait for three hours? Why would you buy pre-sliced, pre-baked bread that's already half-stale?"

cxd4 commented 9 years ago

1: Easier regression testing.

"More difficult" regression testing, actually. Proper regression testing requires free access to the source revision that produced the binary, not just to the binary itself, as well as a ready setup to compile it.

git checkout is faster and easier than going through the Internet to download a package or individual executable for a fixed-date version. The commit name makes it easier too, compared to arbitrary buildbot numbers, and it's faster not to touch the Internet at all than to be downloading binary files and shuffling them through your Internet cache and relocating them all the time.

2: Normal Windows users don't compile code. Ever.

Normal Windows users have problems understanding the easy and efficient way to do things. I'm not following your point here. If normal homeless people never think about how to improve their own lives, then does that just mean the problem is that other people should adapt to them?

Don't forget, only the updated/changed C sources are compiled, not the entire source. Building only the modified sources takes less than 10 seconds, which is a real improvement over reading through massive buildbot lists and then downloading the EXE to an exact location.

3: Also, PJ64 defaults to debug instead of release, at least for me. So it's a three step process, not two.

It is way beyond even a three-step process if you count having to install Git, install Visual Studio, clone the repository to begin with, as well as all the other initial steps involved.

Of which, your problem with the default being "Debug" rather than "Release" is one. That's only an initial step, a one-time thing, like over half of everything else involved. Unless you have poor Git habits, such as re-cloning the repository every single time.

Plus the average user would struggle to find the compiled installer.

No, because that would mean that the average user struggles to understand Windows Explorer. Again, this really is more of a "one-time" point. If you were able to get past using Git to clone the repository to begin with, you're bound to form a good mental snapshot of where the compiled files are going to end up every single time.

At the moment, the only way for normal, non-technical people to acquire the latest build of Project 64 is off EmuCR.

If it takes "technical people" to install a program named Git, install a program named Visual Studio, and open a solution file to F7 build it, then it takes "technical people" to install a program named Project64, install all the ROMs to it, and do so without making thousands of issues or questions and complaints on the forum. OH WAIT!

The question is akin to "Is it really too hard to put water, flour, and yeast in a bread maker then wait for three hours? Why would you buy pre-sliced, pre-baked bread that's already half-stale?"

I have already taught Lithium how to git pull to sync source changes, and he seemed more than happy with that method and has been using it lately without such complaints. The question I just posted was asked out of surprise, not out of a condescending disposition toward someone who didn't want to compile.

AmbientMalice commented 9 years ago

My understanding is this topic was always about people who either can't or won't compile. In my opinion, pre-compiled binaries are useful for regression testing because they eliminate possible user error. The end user doesn't need to know anything about the emulator in order to report that "Game X started misbehaving in a build dated June 3 Year X. Builds earlier than that are just fine."

In my view, @cxd4, you severely overestimate the savvy of the average user. I don't mean that in an insulting sense. Suppose someone manages to compile PJ64. So they poke around in the source folder. There are a lot of folders in there. They find Bin/Release/ or Bin/Debug, and then they have to know that they can't run "Project64.exe" because the plugins are missing. They have to run the installer instead.

project64 commented 9 years ago

It also really depends on the goals..

If it is to get more testers, it may help, but for beta testing you generally want more involved people who can contribute back, so it is good for them to be able to test and give feedback; using git and compiling suits those users. The amount of feedback from the people who use EmuCR is essentially nil.

I would prefer directing people to the main site to download. Maybe we can look at putting the auto build on the site so people can always have the latest, but it could cause confusion. Maybe have "pre-release" in the title until a version is officially released.

If we had a build bot, I would like it to be able to update version numbers when there is a change as well. The EmuCR version is flawed in that if someone did actually file a report, I would not know what code base they are actually using, since it is all the same version number. For the people who report errors from source builds, I can assume the code is close to the current release.

cxd4 commented 9 years ago

My understanding is this topic was always about people who either can't or won't compile.

There is no "can't". Provide a scenario of where somebody "can't compile". There may be scenarios of where somebody can't install the latest Visual Studio on Windows XP or Windows 2000, but that does not preclude the legacy VC2008 solution that is still supported.

As for "won't compile". That is up to them. If they don't want the latest Project64 executable and are fine sticking with stable/official releases, why would a build-bot even matter then? If they DO want the latest releases but complain about not wanting to compile anyway then that is just stubborn self-delusion and really does not put the blame on anyone else.

In my opinion, pre-compiled binaries are useful for regression testing because they eliminate possible user error. The end user doesn't need to know anything about the emulator in order to report that "Game X started misbehaving in a build dated June 3 Year X. Builds earlier than that are just fine."

None of that is specific to pre-compiled binaries, and there is no room for "possible user error". Actually the only thing close to user error we've had was that you found a bug with Glide64 on the VS2013 build but not the VS2008 build. These are still necessary bug reports from people who do compile however, because zilmar doesn't use VS2013 for the official builds.

The end user also doesn't have to know anything about the emulator in order to see the date/time that the Git commit they just finished compiling was made to do what you just said: report that "Game X started misbehaving in a build dated June 3 Year X. Builds earlier than that are just fine."

There are a lot of folders in there. They find Bin/Release/ or Bin/Debug and then they have to know that they can't run "Project64.exe" because the plugins are missing. They have to run the installer instead.

The Bin/Release thing did confuse me first as well. Probably it would have been more self-explanatory if it was just Release/ and Debug/ folders at the top of the repository.

And why should anyone run the installer if all that does is extract the executable you just finished building, plus all the adware you get from not compiling Project64 yourself? You have an installation of Project64.exe on your hard disk already somewhere, don't you? So just copy-replace that EXE with the one you just compiled. This keeps the plugins you already had installed.

Even if they got errors that plugins are missing, that would have happened if they had updated something about non-compiled Project64 anyway. If you see a plugin directory error, you change the plugin directory.... That problem does not apply exclusively to compiling Project64 yourself and is just another instance of idiocy.

cxd4 commented 9 years ago

All this crap about having missing plugins errors from compiling Project64 ourselves.

What logic indicates that it wouldn't apply to a buildbot system as well? All the buildbot would do is compile it for you and upload it. It can't magically arrange the directory structure so that the main EXE and the plugins line up and you don't have to run the installer (or, alternatively, use common sense: change the plugin directory, put the plugins in the right folder, or just copy the EXE by itself to a folder where the plugins already are). All of these are newb problems; none of them is specific to compiling.

Orphis commented 9 years ago

So, a little background first. I am both a developer, who has worked on various emulators (Dolphin, PPSSPP, Jpcsp) and developed the first continuous integration for emulators, as far as I know, in a fully automated way.

Early on, in Jpcsp development, we got a lot of feedback using the CI builds, which was really essential to our development, and we managed to eradicate issues for our users caused by bad packaging / build issues from websites like EmuCR. We also caught mistakes very quickly and had a much faster development speed than most other new emulators.

I've always provided a clean build from a known version of the source code, and builds are always tagged with the version number, either an SVN revision number or a git describe, which gives us the git tag, distance from the tag, and git hash. Users have learned to work with those very well and can report the proper version number when creating a new issue. It has shown to be more reliable than custom builds found in some forum post or a random download link found on Google.
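As a sketch of the `git describe` scheme described above: in a throwaway repo with one annotated tag and one commit after it, `git describe` yields the tag, the distance to it, and the abbreviated hash. The tag name `v2.2` here is a made-up example, not an actual Project64 tag.

```shell
# Throwaway demo repo; identities are dummy values for the example.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "first commit"
git -c user.name=ci -c user.email=ci@example.com tag -a v2.2 -m "release 2.2"  # annotated tag
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "one more commit"
git describe --tags   # prints something like v2.2-1-g1a2b3c4: tag, distance, short hash
```

The "distance" component is what makes two nightly builds from different commits distinguishable even when no new tag has been cut.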

At the same time, we stopped giving dumbed-down instructions for non-devs on how to compile, and no one has really complained about it. In the end, users just want to help by using the latest version. They aren't tech savvy and will never understand "guides" on how to compile. The goal is to lower the barrier and allow people to test new changes faster.

Also, remember that if you rely only on the same people for testing all the time, they will probably always test the same games and ignore a few others that you just broke with that change that was supposed to have no impact. You'll also be able to test on more hardware and drivers. For a while, developers on Jpcsp all had only Nvidia hardware. We would break ATI cards all the time. After creating the buildbot, we knew exactly what changed and could either revert it or push new changes to try to mitigate the driver issues. Imagine doing that only for official releases and the delay that you would have!

As for the "missing plugins errors", there are two aspects to making a buildbot. One is building, usually running make / msbuild / xcodebuild, which is super easy (for us devs who know what we're doing). The other is packaging the software. For this, you end up writing some scripts. Not terribly complicated; I've done it for many projects already. Create a folder, copy files, zip it, ship it. It can be either checked into the repository or inlined in the build script on the build server. Either way, you would have access to it and be able to update it when necessary.
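A minimal sketch of the "create a folder, copy files, zip it" step. The file names and layout below are invented placeholders, not the real PJ64 tree, and tar is used in place of zip just to keep the example self-contained:

```shell
# Hypothetical packaging step: stage the built files, then archive them.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-ins for build outputs (placeholder paths, not the real PJ64 layout).
mkdir -p Bin/Release Plugin
echo "demo exe"    > Bin/Release/Project64.exe
echo "demo plugin" > Plugin/Video.dll

# Stage exactly the files users need, in the directory structure they expect.
stage="Project64-nightly"
mkdir -p "$stage/Plugin"
cp Bin/Release/Project64.exe "$stage/"
cp Plugin/Video.dll          "$stage/Plugin/"

tar -czf "$stage.tar.gz" "$stage"   # this archive is what the buildbot would ship
tar -tzf "$stage.tar.gz"            # list the contents as a sanity check
```

Because the staging step controls the directory layout, the "plugins are missing" class of problem gets solved once, in the script, rather than once per user.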

More testers. Better testing. Automate all the things to reduce human error. Point users to "official" builds reducing the risk for them to run malicious code.

I see most of the issues that you raise as a result of being afraid of change rather than real problems that need solving.

Orphis commented 9 years ago

Also, it's not necessarily an issue of people not wanting to compile; it's about repeatability. If everybody compiles their own version with potentially different compilers, potentially unclean sources, and the usual amount of Visual Studio issues, people end up testing different things. If you give all your testers the exact same version, you can be sure of what they tested and eliminate the usual questions such as "Are you sure you rebuilt? Are you sure you copied the new binary over the old one? Are you sure you properly updated all the files?"

cxd4 commented 9 years ago

"At the same time, we stopped giving dumbed down instructions for non-devs on how to compile, and no one has really complained about it. In the end, users just want to help by using the latest version. They aren't tech savvy and will never understand "guides" on how to compile."

This is your experience, and that's fine. I could draw a more optimistic conclusion than that, because I know that it has worked and people have been able to follow the instructions just fine.

The more we complicate the build requirements/process, on the other hand, the more it starts to require "tech-savvy" users to compile it without hazard.

Re: repeatability. Repeatability with downloading pre-compiled builds is similar to repeatability with compiling the source yourself. You follow the same steps. If you forgot to hit the compile button, that's a mistake, and mistakes happen. If you forgot to download and correctly replace a pre-compiled build uploaded by a build-bot (or did not completely re-install Project64 from the build-bot and so failed to update all of the individual components), that is a mistake, and mistakes happen. Either approach requires effort, but compiling is easier when it is arranged to be straightforward.

We also don't want everyone testing the exact same build. C undefined behavior takes its toll more when built with some compilers than with others, and it is thanks to that, and to what little diversity in compiler choices for Project64 we already have (which in some cases you might suppose to be a negative thing rather than a positive), that this lack of quality can be fixed for better maintenance of more portable code.

project64 commented 9 years ago

There is a script already for most of the packaging.

I was tempted to do a build machine myself, but there are some things I wanted to do with it, and I did not find an easy script for them already, or the time to work out how to do it.

My thoughts would be:

the source on changes, update

Orphis commented 9 years ago

@cxd4 I just skimmed the surface of possible issues. There are probably many more. Humans make mistakes, and it's easier when they have just one step rather than multiple steps with prerequisites, such as having the right compiler installed or opening the right project. There are multiple patch versions of Visual Studio, and there are multiple Visual Studio releases around too. There will be a new one soon and PJ64 might not work with it, but it will soon be the version that people install. From experience, I'd say that you probably talked only to the most skilled ones, and only a few of the non-skilled communicated about not being able to follow the instructions; the rest just gave up on them. Everybody knows how to download and unarchive an emulator, and that's what 99% of the people want to do. The other 1% are devs. But then, even devs in the projects I build (myself included) find it easier to download an older build rather than rebuilding it from source. It's way more convenient for regression testing. Though it may vary from project to project. How long does a full build of PJ64 take on your machines?

Also, you said that builds would be different. What kind of difference are we talking about? Debug / Release?

@project64 I don't think it's a good idea to change the sources on each build. Auto-bumping versions to get a uniquely identifiable version number is probably not the best of ideas. It's probably easier to build the whole repository as a whole and package that with a unique version number. Consistency and simplicity are key. If you want to keep a different version for each component, you might just want to get the distance from the latest tagged release to obtain a uniquely identifiable version number and inject it into the binary at build time. Git can help you with that.

I've been managing my company's CI for a while (Spotify, just so you know, with hundreds of developers), and I have a great deal of experience with such systems. And I don't think EC2 is going to be the solution, as it is really expensive if you leave the machine running all the time. You probably want at least a t2.small (really low-end hardware, might not be able to build PJ64), which is $0.036 an hour, or about $26 a month. And that's without any storage, which will add to the cost, or the installation of Visual Studio and other tools. A "proper" machine would be $0.266 an hour with a persistent local storage of only 32GB, and total about $190 a month. With such a machine, you could turn it off for a while to save some cost and restart it later, but I wouldn't advise doing that, since the administration gets more complicated and will probably fail sooner or later. And of course, that's without accounting for any data transfer to the internet ($0.09 per GB after 1GB); it will be low, probably negligible. EC2 is great if you have an elastic need. If you plan on having a machine running for a long time, you probably want a bare-metal hosted instance that you pay for monthly.

If you'd like, I could add PJ64 builds to my site, I have some spare capacity on the servers I use and that shouldn't be much of a problem. You'll probably want to add git tags in your repository though to make it work nicely. Annotated ones please! You could evaluate it and we could iterate on it quickly if there's any issue with it.

cxd4 commented 9 years ago

@cxd4 I just skimmed the surface of possible issues. There's probably many more.

This really goes back to my original point from earlier, which is that compiling is easier when it is arranged to be straightforward. It depends on how much you have complicated the build process. I've seen projects where you don't have to do anything more than double-click a batch file to emit a compiled build. Obviously this would be easier than downloading things all the time.

So your point there revolves more around how individually complicated the build process is.

Obviously "1) double-click a batch file, 2) done building!" doesn't so much apply to Microsoft Windows, because Microsoft doesn't ship any compiler packages with a fresh installation of Windows, so you'd need a step "0) install X compiler". But that is more of an environmental issue.

There are multiple patch versions of Visual Studio, and there are multiple Visual Studio around too. There will be a new one soon and PJ64 might not work with it, but it will soon be the version that people install.

They can only use VS2013 Community edition, because that doesn't cost money and supports WTL/ATL. Anything else isn't free or doesn't compile Project64.

If what you were saying was really true, that they were going to voluntarily upgrade to the latest Visual Studio independent of build instructions saying to install VS2013 for compiling Project64, then that would almost certainly mean that they already have experience with building software and are not the average Windows user. And this issue with building inconvenience doesn't so much apply to them if they are experienced, now, does it?

and it's easier when they just have one step than multiple steps with prerequisites, such as having the right compiler installed for example, or opening the right project.

But they do have one step. There is no "right" project; there is just a solution file for VS2013, which compiles the 12-or-so projects all at once. There also is no such thing as "the right compiler": good C is adaptable to many compilers. You're not supposed to write C code that's tied to one exact compiler; otherwise it's hardly open source, and it would only benefit from a build-bot because the code is too fixed to the allocation quirks and tendencies of a certain compiler.

And as for your issue about people upgrading to the newest Visual Studio at some point and Project64 no longer working with it: that sounds rather pessimistic. Don't we want Project64 to work with the latest Microsoft compilers, or am I wrong? Are you saying the buildbot approach can avoid that issue by always compiling with the same outdated compiler? These are issues that we want people to raise, not hide from.

From experience, I'd say that you probably talked only to the most skilled ones, and only few of the non-skilled communicated about not being able to follow instructions and just gave up on them.

Nope, not at all. I'd say that your experience is pessimistic. I taught the not-at-all-skilled ones how to compile Project64 as well as some of my own projects. They have not complained. It seems you'd rather I think that some of them probably gave up communicating about not being able to follow instructions, but that doesn't really make any sense. Up until then I've been harassed for a long time by these people for not regularly putting out binary releases. Now they have started testing again and provide feedback based on my latest source.

Everybody knows how to download and unarchive an emulator and that's what 99% of the people want to do. The other 1% are devs.

Really what 100% of the people want to do is to test the software. It's how they want to obtain it that becomes a lesser question--do they want to extract it from pre-compiled or build from source? From time to time, both of these become necessary, as there are things that you can only point out and test by compiling from source, as well as things that you can only point out and test from whatever outdated MSVC was used for the pre-compiled builds. But how they want to do it--compile or download--is not relevant. It's--which way is more efficient and direct. And, in some scenarios, compiling isn't any harder than double-clicking a batch script or executing "make" for Makefiles, which is more direct than going through the Internet and extracting pre-compiled builds all the time.

In the Microsoft world, of course, like you said 99% of people have easily become fixed to relying on pre-compiled binaries and extracting them all the time rather than associating with simple build habits, but this really has nothing to do with what's right or wrong and is really more an issue with how people were brought up on certain environments.

But then, even devs in the projects I build (myself included) find it easier to download an older build rather than rebuilding it from source. It's way more convenient to do regression testing. Though, it may vary from project to project. How long does a full build of PJ64 take on your machines?

As far as regression testing goes, you may have a point. Many times it's faster to just download an older build than it is to git checkout an older commit and re-build. That, however, builds only the updated sources (those rewound by the checkout) and does not take anywhere near the time of a full PJ64 build. It also depends on how much of a jungle it is to find the Git commits and their names, which, unlike build-bot numbers, give more information on what got changed. And since the GitHub site is necessary to see the actual regressions in code form anyway, the compile approach to regression testing is convenient as well.

Also, you said that builds would be different. What kind of difference are we talking about? Debug / Release?

Nothing like that, just that there is a legacy VS2008 solution file included. People with a (non-free) edition of Visual Studio 2008 can possibly compile that instead. Many official builds used it, historically. In fact, we found a Glide64 regression that crashes and segfaults on a certain game if the plugin was compiled with VS2013 instead of VS2008. It is thanks to that, and to the fact that the person was able to compile the source for themselves with something other than the outdated version of MSVC used for the occasional pre-compiled releases on the forum, that we were able to spot and fix this undefined-behavior C malpractice.

project64 commented 9 years ago

@project64 I don't think it's a good idea to change the sources on each build. Auto bumping versions to have a uniquely identifiable version number is probably not the best of ideas.

Yes, I have done enough CI builds to do that; mostly I have the build machine create and insert versions at build time. I could probably inject the hash of the git version as well, to be more accurate about what is included in what version.
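A sketch of injecting git metadata at build time might look like the following. The header name `Version.h` and the macro names are invented for illustration; they are not PJ64's actual versioning scheme:

```shell
# Throwaway demo: generate a version header from git metadata before building.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "c1"

hash=$(git rev-parse --short HEAD)    # short hash identifies the exact commit
count=$(git rev-list --count HEAD)    # commit count gives a sequential component

# Emit a header the build would include (names here are hypothetical).
printf '#define VERSION_BUILD %s\n#define VERSION_HASH "%s"\n' "$count" "$hash" > Version.h
cat Version.h
```

Since the header is generated on the build machine and never committed, the repository itself stays free of version-bump commits, which is the property Orphis argues for above.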

I like to have control over the build and how things are being done which really is the only reason I would like to do it instead of just letting someone else take it over.

With the Amazon thing: really, I could run it on the same server that runs the PJ64 site (it is actually a Windows 2003 box) if my main issue were money. I was just looking at separating things so one machine only had to do builds; there might be other shared hosting that works out cheaper.

Orphis commented 9 years ago

@project64 No, that won't work as VS2013 doesn't run on 2003. It's 2008 minimum. The point is: if you can derive the information by looking at the git history directly, then you don't need to make changes in the code. If you don't need changes in the code, then you don't need special handling there and there to take into account those commits so you don't end up in a loop where the version is bumped because the version is bumped because the version is bumped... If it's simpler, it's more robust and way less work to maintain.

If you don't have any external script committing to the repository, then should the bot that continuously checks for changes go down, your project will still be consistent and have proper version numbers all the time. It's not about having someone take over; it's about removing that process altogether.

@cxd4 This holds true only as long as Microsoft doesn't release VS2015; then you need the right version and must make sure that it works without regression. The problem is making sure that people use the same version all the time, and that if they change the compiler, it's done in a controlled way so that you can precisely pinpoint that an issue was caused by the compiler change. Users may have an older version of Visual Studio that is buggy. They may never upgrade. They will build. It will fail from time to time. If a user doesn't have any genuine programming knowledge, they shouldn't have to build. Saying they can still do it in some complex way is not doing them a service. Also, the rebuilding approach works as long as you don't hit any bugs. It's quite common that Visual Studio loses track of a dependency and doesn't rebuild everything that it should. For a proper dev, it's not an issue; we are experienced and know how to deal with that (most of the time...).

project64 commented 9 years ago

@project64 No, that won't work as VS2013 doesn't run on 2003. It's 2008 minimum.

True, but the version created with VS2013 has an issue running on Windows XP and below. I have not spent the time to work out why, even though the settings are there to run it on XP (it does run, it just crashes).

So the release version is done using VS2008, so that is not an issue there.

If I have a build bot in charge of versioning, then I have a fake version number in the source, like 0.0.0.0 or more likely 2.2.0.9999; before it builds, the bot changes that to something it controls, and maybe updates a variable with the git hash in there.

With SVN I used the check-in ID for the version number; with git I just have the hash of the current check-in, and there is no way for me to easily tell if one commit is before or after another. It is also useful to follow the Windows versioning of x.x.x.x, because I can look at that in Windows Explorer without having to run the program. So I would like the version to stay in a sequence; I have used Jenkins or Bamboo and used its build index for the final build number.

@Orphis do you have a way of generating a sequential version number using git that is independent of the build machine, so that interchanging build scripts will produce the same result?
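For illustration only (not necessarily the scheme anyone here adopted), git's commit count gives a sequential number that any machine computes identically from the same checkout; a minimal sketch in a throwaway repository, with "2.2.0" as a placeholder base version:

```shell
# Sketch: derive a sequential, machine-independent build number from git.
# The base version "2.2.0" and the repo contents here are placeholders.
set -e
work=$(mktemp -d)
cd "$work"
git init -q repo && cd repo
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "first"
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "second"

# The number of commits reachable from HEAD is monotonic along a branch,
# unlike a raw hash, so it sorts correctly as the last x.x.x.x component.
BUILD=$(git rev-list --count HEAD)
HASH=$(git rev-parse --short HEAD)
echo "2.2.0.$BUILD ($HASH)"
```

Two caveats: merged branches all contribute to the count, and shallow clones truncate it, so all build machines need full clones of the same branch.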

cxd4 commented 9 years ago

@project64 No, that won't work as VS2013 doesn't run on 2003. It's 2008 minimum.

Don't forget that not everybody uses the same compiler, or should even be required to do so. No one says project64 does VS2013 releases of Project64.

What I said is that today's build instructions simply suggest obtaining VS2013 Community Edition as the only simple option for beginners, because it's free but still supplies the ATL libraries. Nobody has to install that exact compiler, particularly if they have already paid for an older or a newer version of VS.

@cxd4 This holds true as long as Microsoft doesn't release VS2015, then you need the right version and make sure that it works without regression.

No, it holds true after Microsoft releases VS2015 too, not just before.

We were talking about inexperienced users/developers, right? Well, according to you, "If a user doesn't have any genuine programming knowledge, they shouldn't have to build." So, given what you just said, what cause is there to think that they would install anything but the compiler suite recommended for beginners? If it's stated that VS2013 is the only free option that will work, why would you be so concerned about VS2015 when they were never pointed to it? Remember, we're dealing with people who don't even know what a compiler is or where to get Visual Studio on the web. All they care about is getting the emulator. If they contradict the recommended instructions, they don't fall under your category of inexperience and are likely familiar enough with the Visual Studio environment to contradict the recommended directions.

By all means, if they're experienced, they can contradict the instructions and install VS2015 instead of VS2013 which was free. This improves the amount of cross-compiler testing we receive for better notice of portability problems on other compilers or versions of MSVC.

And all of this is besides the point. If part of the Project64 source is written in a way that doesn't work with VS2015 or something older than VS2008, then we want users to raise this to our attention so that somebody is aware of this code defect. The build-bot approach nearly prevents this help from occurring because most people would just assume that only one binary result can be correct and download the pre-compiled result all the time, which was tested to work merely with one single compiler.

The problem is making sure that people use the same version all the time

Not at all. The problem is allowing people to NOT have to use the same version all the time. It sounds more as if you come from a closed environment where something is meant to be either a) available on a single target, b) not meant for portability to other platforms, c) not intended to work on anything but a single exact compiler, or d) closed-source or proprietary conditions. These are awfully big assumptions to impose on other projects in the open world. You say the problem is assuring people use the same compiler and compiler version; obviously this is no problem at all. When it is a problem or when a new version of VS comes out, then we want the source to be adaptable and to stay that way.

It is better to impose safe C practices in the code, not hope that a single old/new exact MSVC version will guess all the undefined behavior away at the right level that works for you on that particular compiler.

Furthermore, I'm really not sure what problems you anticipate from VS2015 coming out and possibly breaking Project64. Project64 was developed for nearly a decade in VC 6.0, then in MSVC 2008. The switch from VS2008 to VS2013 was virtually instantaneous. What reason is there to think that VS2015 would be such an inconvenience at this point? If you have experience suggesting that there is a reason, then that experience comes from unfavorable conditions, such as code plagued with portability problems even between different versions of MSVC, never mind other compilers that have surpassed MSVC.

Orphis commented 9 years ago

@project64 For the XP issue, have you tried building using the vs120_xp toolchain? That's all that is required to target XP. Or you can use the regular toolchain but specify some define that will fix some winsdk structures for XP compatibility if I remember correctly.

As for the version number, you want to have a look at "git describe", documented here: http://git-scm.com/docs/git-describe. For example, v1.0.4-14-g2414721 means: v1.0.4 is the nearest git tag, 14 is the number of commits since that tag (usable as a patch version), and g2414721 is "g" plus the abbreviated commit hash. It's quite readable and you can usually reason about it easily. It's also used by all other projects and I have never heard any complaints about it. You'll need to push an annotated tag to make it work in your repo. And version numbers are global to the repository, so it might be a good opportunity to have a unique version number for all the plugins. Also, is it really necessary to have plugins anymore?
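A quick illustration of that output format in a throwaway repository (the tag name just mirrors the example above):

```shell
# Demonstrate "git describe --tags --long" in a scratch repo.
set -e
work=$(mktemp -d)
cd "$work"
git init -q repo && cd repo
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "release commit"
git tag -a v1.0.4 -m "v1.0.4"   # must be an annotated tag, not lightweight
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "one fix later"

# Format: <nearest tag>-<commits since tag>-g<abbreviated hash>
DESC=$(git describe --tags --long)
echo "$DESC"
```

Here DESC comes out as v1.0.4-1-g followed by the abbreviated hash; adding --dirty would append a -dirty suffix when the working tree has uncommitted changes.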

@cxd4 Just because it's possible doesn't mean it's a good idea. It's possible to compile, but it's not reasonable to ask people to do it, as they don't care about customizing the build or making changes. They just want a binary that works. Compilers have bugs; newer versions fix some bugs and introduce others. Today, I was asked to upgrade a compiler from VS2013u3 to VS2013u4 because of a bug. It happens. And when you want to test something, you certainly want a similar environment when building, or you're going to test different things that might exacerbate different bugs.

Also, if you have an issue with VS2015 building the emulator, do you really want users to notify you, or do you want developers? I have a habit of not trusting my users, and for good reasons. I don't want to spend time debugging their build issues. Having something foolproof is not optional, but a must nowadays for any project, small or big. Developers should be able to use whatever they want and are comfortable with. Users? I really don't agree.

cxd4 commented 9 years ago

@cxd4 Just because it's possible doesn't mean it's a good idea. It's possible to compile, but it's not reasonable to ask people to do it

Nobody is actually "asking" them to do it. Let's not add unnecessary/inaccurate restrictions here. This is their decision to make--not yours, not mine.

They can either use the occasional, once-in-a-while 2008 releases, or they can install VS2013 and give it a quick F7 to compile. Or a third option: Several people really don't worry as much about it as you do. They care just enough to take it or leave it, if anything use the 1.6 version.

Developers should be able to use whatever they want and are comfortable with. Users? I really don't agree.

You don't agree that Users should be able to use whatever they want and are comfortable with?

Well, that can't be helped. No matter how many restrictions you place on the user, they always have at least two options: Use the software or don't. Of course there is a slightly bigger multitude of options than that in project64's case, see points above.

It also is really not a restriction to leave them to double-click a build script in Windows Explorer to create the build for them, and the like. This opens the way to faster acquisition of binary release builds produced natively on their CPU, without having to go through the Internet as your method requires. The re-build system also is nothing like Microsoft's buggy makefile system, where you have to clean the make setup to work around VS bugs in certain, complex-enough projects.

it's not reasonable to ask people to do it as they don't care about customizing the build or making changes.

No no, this is not about what they "care" about. It's about what's best and what's easier.

Also, bias against coding or programming doesn't excuse users from simple habits. Users not caring about riding a bike instead of walking 2 miles on foot doesn't make it "unreasonable to ask people to do it", as you say. It is perfectly reasonable to leave them to build from source when it is so much simpler. Your experience suggesting otherwise is confined to the Microsoft environment.

You say later on that you "have a habit of not trusting my users". Is it really that important to trust their judgment of what's easier? Or what you think they care about or don't care about? By the same reasoning, why should you convince inexperienced users to rely on build-bot downloads rather than just quickly executing a GNU make command or opening a batch file that does all the [re-]building to get the build even faster? Neither of these requires any technical skill.

Compilers have bugs; newer versions fix some bugs and introduce others. Today, I was asked to upgrade a compiler from VS2013u3 to VS2013u4 because of a bug. It happens.

In general, there are no language conformance bugs in today's compilers. Think outside the box: the Microsoft compiler tool chain is certainly not the center of ISO conformance. Whatever discrepancy provoked the upgrade from u3 to u4 is not something to blame on the lack of a build-bot: it is always controllable by the code itself, especially if you really did isolate the internals of the problem, which is partly the developer's job.

And when you want to test something, you certainly want to have a similar environment when building, or you're going to test different things that might exacerbate different bugs.

Yet again, we miss the point. This effect is desirable. You imply that it's not desirable because you are focused on one individual target and/or compiler. That is not a possible solution with multi-target or cross-platform software, in which case safe C practices and traditional behavior is encouraged. There are no "bugs" that a mere change of IDE or environment would exacerbate with such proper practices, excluding several of Microsoft's deviations from the standard which also can still be taken into account and controlled by the code.

Also, if you have an issue with VS2015 building the emulator, do you really want users to notify you or do you want developers? I have a habit of not trusting my users, and for good reasons. I don't want to spend time debugging their build issues.

Why should it matter who notifies us?

If someone reports that they just tried out the new VS2015 and spot a bug, whether it is a developer or a user reporting the situation is irrelevant--we always reproduce the situation on our own machine if possible to analyze the bug in the end result either way. However, if everybody always uses the same exact compiler, or uses your build-bot produced builds which use the same exact compiler, nobody is probably going to discover these defects.

Lithium64 commented 9 years ago

@cxd4 @project64

I'll say what I think: a buildbot would be more comfortable and practical for everyone. I'm taking up almost 20 GB of space on my hard drive to build a 1.23 MB executable, and all the other famous emulators, like Dolphin, PCSX2, PPSSPP, and DeSmuME, already have a buildbot. I see no plausible, justifiable reason why Project64 shouldn't have one too.

cxd4 commented 9 years ago

I'll say what I think: a buildbot would be more comfortable and practical for everyone.

Well no. It has no effect on users, just testers. Testers who constantly want the latest version to test. Even then, obviously, uh that's what Ctrl+Shift+B is for? :)

What's "practical" for people is a simple, one-dimensional measurement. What users really care about is stable releases. Your involvement of Orphis into this discussion would have no effect on such users.

I'm taking up almost 20 GB of space on my hard drive to build a 1.23 MB executable,

The only thing that takes up nearly half of that much space is the installation of Visual Studio 2013 and its components. That is a problem with zilmar's choice of compiler and repository layout, not a property of compiling from source itself as opposed to build-bots. That's an environmental issue.

all the other famous emulators, like Dolphin, PCSX2, PPSSPP, and DeSmuME, already have a buildbot,

All famous emulators care what users think, because that is necessary to earn the "famous" qualifier.

So why do you think that is?

AmbientMalice commented 9 years ago

What users really care about is stable releases.

Except it's a really bad idea to download stable releases of Dolphin, Desmume, PCSX2, and so on, when improved nightly builds exist. In fact, Dolphin's download button directs users to the latest build.

As an example, Jet Force Gemini runs quite badly with CF=2. I made a pull request, but it wasn't merged in time for 2.2

So until 2.3 releases, every person playing JFG will either have to suffer bad framerates, download from EmuCR, or manually change the CF setting which has been deliberately made hard to find thanks to Advanced Settings.

Or they could download a nice, shiny, official nightly release to tide them over until the next stable release.

cxd4 commented 9 years ago

Only, when we say "users", I'm distinguishing between testers and users.

At which point, users don't care about nitpicks or Dolphin's way of doing things; they just want stable PJ64 and sometimes for their purposes are even happy enough with versions older than 1.6. Even outside of that, you would have a hard time convincing them of the logic that it's a "really bad idea to download stable releases" instead of nightlies. Maybe you know an emulator project or two where that applies, but outside of emulators or Dolphin there are plenty of projects where it does not apply.

Now with frequent or regular testers, the only way to test a commit to the repository is to pull it and build it yourself, as nothing else makes sense. Beats having to download someone else's binary, which may or may not be related to the actual code change on the C level, even if it's official or "shiny". That being said, I don't want to end on too much of a sour note, as if build-bots are obsolete, useless or should never be applied. It's not so much that I would object to anyone else's project applying them; just that I'm pretty sure it's really more of an optional solution to mental laziness than it is a mandatory necessity.

fallaha56 commented 8 years ago

sorry just reading this several months later -i am DEFINITELY someone who wants to help test but really would need a buildbot to help me do this

otherwise, as other commentators have said, it's the EmuCR bot that sometimes just plain screws it up...

project64 commented 8 years ago

just some notes about getting version numbers: http://programmers.stackexchange.com/questions/141973/how-do-you-achieve-a-numeric-versioning-scheme-with-git

git describe --long --tags --dirty --always

v2.5-0-deadbeef, where "v2.5" is the last tag, "0" is the number of commits since that tag, and "deadbeef" is the abbreviated SHA of HEAD.

JunielKatarn commented 8 years ago

Hi. Regarding setting up / obtaining a build machine, I believe Microsoft has released a service which can provide CI for Visual Studio applications.

https://www.visualstudio.com/en-us/get-started/build/hosted-agent-pool

I'll take a deeper look later, but I believe it is free up to a certain build time usage. Project64 builds relatively fast, so it might be just enough.

JunielKatarn commented 8 years ago

Greetings, @project64. After playing around with the tools I mentioned, I was able to set up a no-cost continuous build service that automatically deploys to any given ftp-accessible site.

Please take a look at http://hyvart.com/download/project64/ci/ where I set up a very simple layout using date timestamps for each new build. Each directory will contain the executables in the Bin directory and the DLLs in the Plugin directory.

That said, builds are triggered each time I pull commits into the master branch of my own fork repository (JunielKatarn/project64).

If you are interested, we could set up these triggers to fire when your own repo (project64/project64) gets new master commits.

Let me know what you think.

JunielKatarn commented 8 years ago

@fallaha56, please check the link I just posted. Let me know if it works for you, and/or if you have any feedback: http://hyvart.com/download/project64/ci/

fallaha56 commented 8 years ago

Hi Julio

Thanks for this, seems to work to download PJ64 exe and some plugins.

Many of the files in my current PJ directory are missing though, e.g. the RDB file. Are these necessary, and is it possible to include them too?

Either way thanks for doing this.

DF


JunielKatarn commented 8 years ago

Hi @fallaha56. I'm confused regarding what RDB file should be used. Is it Config\Project64.rdb?

Please let me know so I can include it in subsequent builds.

JunielKatarn commented 8 years ago

Nevermind. I have updated the deployment parameters. Try http://hyvart.com/download/project64/ci/20150704.1/ which excludes "legacy" plugins such as Jabo's, because they are not generated by the build process, so there is no point in republishing them (they can be easily obtained from the source repository). I also added Config\Project64.rdb.

Let me know if this works for you, and feel free to spread the word.

AmbientMalice commented 8 years ago

@JunielKatarn What type of buildbot are you trying to set up? The first important point is that you appear to be building debug builds instead of release builds. (Also note PJ64 can build as 64-bit, but 64-bit isn't quite functional yet.)

Building PJ64 will spit out a bunch of disorganised files and also a self-contained installer -- installers really aren't ideal for build bots, though. However, if you install PJ64, you can take a look at the folder structure and figure out where each piece of the emu is meant to go. (You can disable the installer in your builds if you wish.) Then write a script to reorganise the components, then upload it. You may wish to compress each new build into a single archive for convenience/bandwidth.
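A rough sketch of such a reorganise-and-archive step, assuming a hypothetical output layout (every path and file name below is a stand-in created for the example, not PJ64's actual build tree):

```shell
# Hypothetical packaging step for a nightly: collect build artifacts into
# an installed-style tree, then archive it for upload. All paths here are
# assumptions; the stub files below only stand in for real build output.
set -e
work=$(mktemp -d)
cd "$work"

# Stand-ins for what the build would have produced:
mkdir -p Bin/Release Plugin Config
: > Bin/Release/Project64.exe
: > Plugin/ExampleAudio.dll        # hypothetical plugin name
: > Config/Project64.rdb

# Reorganise into the layout an installed copy uses:
out=pj64-nightly
mkdir -p "$out/Plugin" "$out/Config"
cp Bin/Release/Project64.exe "$out/"
cp Plugin/*.dll "$out/Plugin/"
cp Config/Project64.rdb "$out/Config/"

# One archive per build, cheap to mirror and to download:
tar -czf "$out.tar.gz" "$out"
```

The real script would of course copy from the actual build output instead of creating stubs, and the archive name would carry the build's version number or timestamp.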

I'll take a look at PJ64 later today and draw up a "where things go" guideline, if you're having trouble.

AmbientMalice commented 8 years ago

@JunielKatarn Without an audio plugin, (Jabo's) PJ64 can't function. IMO, a build bot that produces non-functional builds isn't particularly useful for general use. Of course, this will be less of an issue once Azimer's is merged into PJ64.

JunielKatarn commented 8 years ago

@AmbientMalice It is still a work in progress. Builds can be triggered both manually or each time my GitHub fork gets a commit to the master branch. (Of course, if @project64 were willing to provide a read-only API key, they could be triggered each time he pushes new code to the main repository, which would be way better).

The installer is not included in my deployments, for the reasons you mentioned. I'm currently only building the Debug configuration, because 'Release' is kind of broken (it can be built in parts, but "one-step" building currently fails: https://github.com/project64/project64/issues/476).

Regarding ZIPs, I currently face a minor technical limitation in my infrastructure (trust me), but will get around it soon enough. My intention is to provide zipped content.

JunielKatarn commented 8 years ago

@AmbientMalice So, do you strongly suggest I include all plugins, even if they are not generated by the build process?

If I get you correctly, you suggest to provide a usable archive as opposed to a "minimal" archive. Is that right?

cxd4 commented 8 years ago

(Also note PJ64 can build as 64-bit, but 64-bit isn't quite functional yet.)

It is where I'm sitting. :) 64-bit Project64 works fine with most ROMs after the established changes.

... once Azimer's is merged into PJ64.

no.

AmbientMalice commented 8 years ago

@cxd4 When I say "not quite functional", I am referring to the fact that PJ64 does not have a full set of 64 bit plugins since Jabo's is 32 bit. "Out of the box", x64 PJ64 doesn't work, AFAIK.

Also, I thought the plan was to make Azimer's PJ64's default plugin? It really doesn't matter how that happens. Some want to integrate Azimer's into PJ64, others want to keep it as a separate repo -- I consider both things "merging", since the results are identical to the end user. Iffy word choice, TBF.

I'm more worried about GLideN64. Building GLideN64 is significantly trickier than PJ64. It's gonna create a barrier for people trying to build PJ64 in the eventuality it becomes PJ64's primary video plugin.

cxd4 commented 8 years ago

"Out of the box" doesn't really count since it's a plugin-based emulator. There's no such thing as a plugin-based emulator that works out of the box because it requires plugins. There's also no need to statically merge all the plugin repositories into the same colossal Project64 repository because that would defeat one of the benefits achievable by means of having a plugin system exist in the first place.

Also, Azimer's plugin is 64-bit, so we lack no 64-bit plugin for any of the four plugin types.

fallaha56 commented 8 years ago

Fatcat, I think I understand your purist approach, but the average Joe emu enthusiast (like me!) really does need a buildbot making a zip (or even an installer) that has all the base plugins and just works out of the box.

The perfect example for me is the PJ64 64-bit build: I've got the exe but can't find 64-bit plugins (let alone build them!)

Plus there's the fact that only a handful of added plugins are really worth having: Azimer's, N-Rage, GLideN64, and, last but not least, your RSP and RDP ones.

Just my 2 cents.

DF


cxd4 commented 8 years ago

@fallaha56 that is not the subject we were discussing. Nothing I said was in opposition to having buildbots, just to copying the AziAudio repository inside the Project64 repository. Those two things are not mutually exclusive--you can have build bots without clustering all the plugins in the same repo as the Project64 core. None of that means you shouldn't have build bots, and maybe my "purist" approach comes across as something unnecessarily religious to people whom it doesn't directly affect, but it still affects them and it's still general wisdom.

That being said I'll answer your concerns anyway.

Could very well be that you are correct. I will say that if it's true that Project64 really could use (or even needs) a build bot making zip releases, then it is simply because of the inflexible and overly fixated approach in its design. It still requires non-Express editions of Visual Studio, which consume at least 15 GB of disk space on the computer of everyone who wants to install it.

It's also much faster to just double-click a build script or something in Windows Explorer (or at least have a Makefile and enter "make" on the command-line) than it is to wait for VS2008 to open up and execute the build command through the solution/project files in the GUI. Historically, building any software on Windows is simply harder than it is on many other platforms. Microsoft has no open-source build chains and does not include them in any lightweight form with Windows itself. Conversely, in many cases having something else like Linux installed almost unconditionally means that you have everything you need to compile Project64 with a single batch command--if the code were written to be portable outside Windows.

So the problem I believe is a mix of your inconvenience from having to install what you need to just compile Project64 yourself, and also the inconvenient and unnecessary requirements Project64 imposes on what you would have to install in order to build it yourself (outside of Windows, commonly nothing at all).

cxd4 commented 8 years ago

Oh, and the subject of my post wasn't "build bots shouldn't exist for PJ64"; it's "AziAudio should not be statically included in the GitHub repository with Project64". Those two are not mutually exclusive, so to address my opposition to having a build bot at this point would be to change the subject because that wasn't really what I was getting at with my last post.

Perfect example for me is the PJ64 64bit build -I've got the exe but can't find 64bit plugins

Don't want to build? Use public releases. 64-bit Project64 is happening whether zilmar likes it or not.

Besides the plugins aren't related to Project64; they're their own projects. You're supposed to find those from the plugin authors' downloads (if they provide them) and install them to Project64.