atari-legend / legacy

Source code for the legacy AtariLegend site (still used for the cPanel)
https://legacy.atarilegend.com/
GNU General Public License v3.0
3 stars 0 forks

GITHUB - FEATURE - Open source the code #425

Open · nguillaumin opened this issue 6 years ago

nguillaumin commented 6 years ago

We have open sourced the DB and we have discussed open-sourcing the code too.

We may want to consider that sooner rather than later, because it would allow us to use a free Continuous Integration system. On Shippable, which we currently use, we're on the free plan and we've hit the monthly limit of builds (150). As a result, builds have been skipped since last week or so:

(screenshot of the skipped Shippable builds)

That's not great because we won't detect mistakes we may make in the CSS, JS or HTML files. I looked at the paid plans, but they're expensive ($25/mo).

If the repo were public / open source, we could use the free Travis CI instead (which is what everyone is using on public GitHub repos).

Are there any obstacles, from your point of view, to making the repo public? From my point of view there's one thing we need to do: remove all the passwords from the source files, for example: https://github.com/stgraveyard/AtariLegend/blob/master/Website/AtariLegend/php/config/config.php#L81

Some passwords were also committed at some point and removed later, but they still show up in the Git history: https://github.com/stgraveyard/AtariLegend/commit/2bba76415bded5782062c1824ec4f2ce33b9c2b4#diff-382081e52ccd3b61e63001c4f98eafa4L17 . That probably means we have to change all the passwords in 1and1 before making the repo public...

nguillaumin commented 6 years ago

In the meantime we could try to limit the number of times we push to GitHub (as Shippable will trigger a build every time a commit is pushed). Perhaps just push branches when they're ready to be merged, rather than multiple times over the course of a week?

stgraveyard commented 6 years ago

I am all for open source, but I have to admit I'm partly responsible for this. I have pushed quite a few 'link section' changes and also my latest PSD designs for the Facebook page. Like you say, I'm sure I can change a bit, become more structured, and only push once or twice a day (max). If we take this into consideration, it should not happen all too often. However, like I said, I'm all for opening up the code, but we have to be totally sure there is no sensitive data available and that people cannot spoil our GitHub code by pushing stuff without our permission (and things like that).

nguillaumin commented 6 years ago

That's OK, I don't think we should limit ourselves in pushing / committing; fast iteration makes for better development...

Sensitive data should be easy to remove; as far as I can tell there's only the email password left in the code these days? (The DB connection details are in a separate file that is not checked in.) We could do the same as for the DB credentials and put it in a separate file.
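
Something like the sketch below would do it, assuming the email password really is the only secret left (the file and key names here are just placeholders, not what's actually in the repo):

```php
<?php
// php/config/secrets.php (placeholder name) -- NOT committed: add it to .gitignore
// and create it by hand on the server, like we already do for the DB connection details.
return [
    'email_password' => 'change-me',
];
```

```php
<?php
// php/config/config.php -- read the password from the untracked file instead of
// hard-coding it in a file that would become public.
$secrets = require __DIR__ . '/secrets.php';
define('EMAIL_PASSWORD', $secrets['email_password']);
```

We could also commit a secrets.php.example with dummy values so it's obvious what the real file needs to contain.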

cannot spoil our GitHub code by pushing stuff without our permission

Yeah, that's virtually impossible, because push access is not granted to anyone by default; we would have to authorize specific users if we wanted to. Contributors would just make pull requests like we do, and only one of us could merge them in.

stgraveyard commented 6 years ago

Hmmm ... I see only positive things!

Can we manage this for January, you think? Or am I too positive?

nguillaumin commented 6 years ago

The main point is to remove passwords from the code (shouldn't be too hard) but especially to change all the 1and1 passwords in the 1and1 admin console. We probably don't want to rush that and mess up :wink: so January may be cutting it a bit short.

stsilversurfer commented 6 years ago

Also, don't forget the SQL files stored here; they contain the whole enchilada of usernames, emails, password hashes and so forth.

nguillaumin commented 6 years ago

Hm, which SQL files? I don't think we have any SQL files on GitHub (?), and the public DB exports don't include the users table.

To clarify, I'm just talking about making the GitHub repo public; we would keep complete control over the hosting and deployment, of course.

stgraveyard commented 6 years ago

I think Mattias is referring to this: #208? (Which has become pretty useless now; I will remove it from my db account.)

nguillaumin commented 6 years ago

Ah that! Right yeah we should remove that indeed.

stsilversurfer commented 6 years ago

https://github.com/stgraveyard/AtariLegend/tree/master/SQL%20backup

stsilversurfer commented 6 years ago

Can we add a Jenkinsfile? We have an old server here at work that, though a bit flaky, isn't being used for anything mission-critical. I installed Jenkins on it yesterday and it seems to be running OK.

stgraveyard commented 6 years ago

https://www.coveros.com/jenkins-pipelines-jenkinsfile/

--> And I still don't understand a thing of it! :(

stsilversurfer commented 6 years ago

Yeah, I don't really understand how to doctor a Jenkinsfile either. Hopefully @nguillaumin knows how to.

nguillaumin commented 6 years ago

Ah damn, I missed these SQL backups... That will be a pain; the only way to completely get rid of them is to rewrite the Git history with git filter-branch.

For Jenkins, TBH I'd rather not. I've used it a lot in the past and it's nowhere near as good as Shippable or Travis CI. Using it with Docker requires a fair amount of work (you'd have to install Docker in the first place, which comes with its own set of problems).
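
(For what it's worth, the Jenkinsfile itself isn't the hard part: it's just a small Groovy file at the root of the repo describing the pipeline stages. A minimal declarative sketch could look like this, where the lint command is a placeholder for whatever checks we would actually want to run, not something Shippable runs for us today.)

```groovy
// Jenkinsfile -- minimal declarative pipeline sketch, placeholder steps only
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm   // fetch the branch that triggered the build
            }
        }
        stage('Lint') {
            steps {
                sh 'npm run lint'   // hypothetical: swap in the real CSS/JS/HTML checks
            }
        }
    }
}
```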

But more importantly, I'd prefer that we don't depend on a random server that may crash, that people may shut down or reinstall because they don't realize it's used for AL, or that stops working while you're on vacation and we can't fix it ourselves, etc. (a bit like what happened with 1and1 last month). It's a lot better to use very reliable systems like Travis CI or Shippable that millions of people use, that "never" crash, and that we don't have to maintain and worry about.

Does that make sense?

stsilversurfer commented 6 years ago

But more importantly, I'd prefer that we don't depend on a random server that may crash, that people may shut down or reinstall because they don't realize it's used for AL, or that stops working while you're on vacation and we can't fix it ourselves, etc. (a bit like what happened with 1and1 last month). It's a lot better to use very reliable systems like Travis CI or Shippable that millions of people use, that "never" crash, and that we don't have to maintain and worry about.

Oh, I definitely agree, this old piece of junk is definitely crash-prone. I discover at least once every week or so that it has crashed on us and I have to kick it hard in the nuts. 😄 My thought wasn't to replace Shippable, which works like a charm, but rather having Jenkins running as a secondary resource in case we run out of builds at the end of the month.

Ah damn, I missed these SQL backups... That will be a pain; the only way to completely get rid of them is to rewrite the Git history with git filter-branch.

Maybe it would be easier if we just started a new public GitHub repo?

nguillaumin commented 6 years ago

having Jenkins running as a secondary resource

I see, yeah I don't mind that as long as it's not the primary one 😉

Maybe it would be easier if we just started a new public GitHub repo?

We could, but then we would lose the complete commit history, which I think is quite valuable for understanding specific bits of code. You can just do git log and see why something was done the way it was.

git filter-branch is not especially difficult, but it rewrites the Git history, so whoever does it then needs to force-push to GitHub, and the others need to re-clone the repository from scratch. It would just require a bit of coordination between us, but it's doable I think.
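
Roughly it would be something like this (a sketch only, to be run on a fresh clone after backing everything up; the path matches the SQL backup folder Mattias linked above):

```sh
# Rewrite every branch and tag, dropping the "SQL backup" folder from all commits.
git filter-branch --force --index-filter \
  'git rm -r --cached --ignore-unmatch "SQL backup"' \
  --prune-empty --tag-name-filter cat -- --all

# Overwrite the history on GitHub...
git push origin --force --all
git push origin --force --tags

# ...after which everyone else re-clones instead of pulling.
```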

stgraveyard commented 6 years ago

Guys, let's just do this step by step, cleaning up stuff as we go, and re-evaluate next month. No problem from my side, as long as we keep THIS Git repo. It is really important to me to keep the history of it all.