bitboxer / do-not-harm

The do not harm license

But mass surveillance doesn’t harm people, still it’s evil. #1

Open bitboxer opened 10 years ago

bitboxer commented 10 years ago

By @moonglum:

> But mass surveillance doesn’t harm people, still it’s evil. Especially because I work for a DB company I think about it a lot.

bitboxer commented 10 years ago

Yes, this is true. But I wanted to have a license that specifies what I mean by "evil". "Evil" is a cultural thing: some think A is evil, some say it's perfectly normal to do.

And I would say surveillance harms people. Not physically, but mentally.

How would you specify what "evil" is?

schultyy commented 10 years ago

"Do no evil" is in the eye of the beholder. When A declares something non-evil, is this also true by default for B? I don't think so. "Harm" is also not 100% specific, but if you narrow it down to physical harm, for instance, that doesn't leave much room for interpretation. I also think that "harm" requires somebody who wants to use this license to think about what harm means.

moonglum commented 10 years ago

I absolutely agree that "evil" is very subjective, though I'm not sure "doing harm" is any less subjective. I see two different approaches here from you two:

  1. Define it as 'physical harm': This is pretty clear. I would say there is not a lot of room for interpretation in this definition. But I would also argue that it's really, really hard to harm someone physically with software that is not installed in weapons. In all the software I have built in my life this has never been a concern, because I never built weapon systems. I would also add that physical harm is illegal in developed states as long as it is not sanctioned by a government. So this is only relevant for cases where your software is used as a weapon by a government. This is a pretty narrow case.
  2. Define it as 'physical and mental harm': This is not that clear anymore. I think it is as open to interpretation as "doing evil". "And I would say surveillance harms people. Not physically, but mentally." I would say about the same number of people would disagree with that statement as with the statement "surveillance is evil".

In our day-to-day work we mostly collect data, either directly when building a web application, or indirectly when building a database or a gem/open source application. I think we need to develop some kind of morality for how we treat that data. I deeply believe in 'collect as little data as possible' (see also my comment on Bodo's post) and still hope that more people will agree on that. Collecting data rarely does physical harm, but it does a lot of mental harm, or, as I would call it, evil. Also see "Mike Monteiro - How Designers Destroyed the World" as an example.

> I also think that harm requires somebody who wants to use this license to think about what harm means.

I hope this holds true for "evil" as well, or do you see any difference there?

bitboxer commented 10 years ago

I see a difference because there are lots of people out there who would not consider killing a human without a trial evil. For example the US drone bombings in Afghanistan.

Can we put "only collect data you really need" into a license? Sounds like an interesting thing. The best example would be the collection of religious and sexual information before WW2. That made it way easier for you know who to kill all those people. (And now I've reached Godwin.)

bitboxer commented 10 years ago

btw: the pretty narrow case of a government using code as a weapon is exactly what I am aiming for. Could I be a Linux kernel dev knowing that my code runs inside a Predator drone? I don't think so.

moonglum commented 10 years ago

iOctocat just lost my entire comment when I switched the app :crying_cat_face:

moonglum commented 10 years ago
bitboxer commented 10 years ago

Yes, a Hippocratic Oath would be awesome for our profession. What should something like that look like? Maybe it's time to start something like that?

And yes, war is different. But that does not mean you can ignore ethics when you are at war. For me the drone strikes are highly unethical. And I want to have a discussion about that in the IT community. But currently there is no discussion at all. This license is just a vehicle to get that started :wink:

For me there are questions like:

moonglum commented 10 years ago

Oath

I think an oath would be nice – getting people to 'sign the oath' by requiring them to sign it when using certain open source libs/gems/... – and people that sign the oath then release new open source libs/gems/... which again require people to 'sign the oath'. It's like a virus :wink: I think the agile manifesto demonstrated that programmers are willing to 'sign' something that describes the way we work. I have never seen a license do that in a similar way. What do you think?

The oath should talk about the responsibility you have as a programmer[^studying]. One of the biggest responsibilities we have (especially as 'web programmers') is the data of our users. We should treat the act of someone providing us with their data as a big "I trust you with my data" statement. Whenever we store data, we should tell our user:

  1. What data are we storing for them in our database? This includes the way we store their passwords. We should make it very clear that whenever someone breaks into our server (which we will never be able to prevent 100%), all of this data will be available to the criminal (or agency (let me reduce that to criminal again)).
  2. What data are we making accessible to whom? Is it accessible to the public, to other users of the product (if so, can I configure to whom?) or to people the company has contracts with (for example advertising companies or users of our API)?
  3. What data could be deduced from the saved data? As programmers we should understand the value of metadata and how we can use graph theory and other sciences to deduce more data from it. I understand that we will never be able to answer this question completely, but if we make it apparent what we store, then others can jump in and remark on what is missing in this section.
  4. What will happen to the data when we get bought by a different company? Will the statements we made in our declaration still hold? If these statements change, we will guarantee to let you review your data and remove it before the change takes place.

[^studying]: I think that this is one of the things that I would love to see change in the curriculum of computer science (I'm no longer convinced that it should change and teach "more programming" – it shouldn't do that at all): Teach the students to take responsibility for the things we create.
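As a concrete illustration of point 1's "the way we store their passwords": a minimal sketch (not part of the license discussion; the function names are invented for this example) of salted, slow password hashing, so that a database dump alone does not reveal the passwords themselves:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) using a slow, salted key-derivation function."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the digest and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, digest)
```

Even with this in place, a break-in still exposes everything else stored alongside the passwords, which is exactly why the oath's first question matters.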

Contributor Covenant

I also really like the idea of a contributor covenant. This is one of the things I want to do some research on and propose to a bunch of projects I work on. Angular.js has one, for example. I think that is really important for things that user groups can form around (like frameworks or databases).

War

That was one of the remarks that got lost when I rewrote my comment :crying: Sorry! I meant to say that this is used as an argument, not that I agree with that argument in any way. War (as well as terrorism...) is used to justify horrible things that should never be justified, in my opinion.

Taking responsibility for your software

That is a very hard problem. Some scenarios:

It is especially hard to say where we draw the line here. I believe that everyone who thinks that LGBT people should not exist/be allowed to marry/... is either a terrible person or really, really misguided. Should I forbid them from using my database to build their Prop 8 website? What about people who don't have the same political opinions that I have? Misogynists? I think we should not be able to forbid that as authors of free software.

Responsibility for the company you work for

My opinion is very clear on that: If your company does something you don't agree with, you should state your opinion very clearly and try to convince them to change that (not only as a programmer, but as a person in general :wink:). If that doesn't work you should quit. After that:

  1. If the action is clearly illegal, you should sue them.
  2. If the action is immoral but not illegal, you should probably blow the whistle. Your contract probably forbids you from doing that, so you should talk to a lawyer beforehand so that you are aware of the consequences.
bitboxer commented 10 years ago

Just a little heads up: I haven't forgotten about this. Will answer in a longer form this weekend.

bitboxer commented 10 years ago

Oath + convenants

I really like the idea of a data oath. And we have lots of demonstrations where it worked. The rules you pointed out are really good. The contributor covenant is a good addition to this. Both raise awareness and show a direction where we should go as a community. Maybe we could formalize both on a page that explains them and try to advertise this to more people? What do you think? Should we create a new issue for this here and discuss what the texts should look like? What should be included in them?

Responsibility

My problem here is that most developers don't think about responsibility at all. And lots of those same people say that weapons kill people and should not be exported or sold openly to anybody. I can totally understand that this is a hard issue. If you don't want people "harmed", what about unborn babies? Just to name an extreme problem you could run into.

And yes, I don't want everyone to use a license that forces certain ethical values upon the users of the software. I totally understand that this is impossible.

But what I do want is that developers think about ethics. About consequences that come from the stuff they are developing. Take a stand when your company does bad stuff. Luckily this is easy for us compared to other professions: you will find a new job, even if you were a whistleblower against IBM. But sadly I don't see enough of those. I have heard lots of "war stories" of developers in banks and the telecommunications sector doing horrible things. And nobody there took a stand.

I would love to see an ethics 101 as a mandatory course when you study computer science.

We as software developers should have more talks about ethics at our conferences. Currently I am in "investigation mode" and am talking with lots of people about this. Maybe I can create a talk from this in a few months. Right now I don't think I would do a good job. Ethics are hard. And talking about them is even harder.

bitboxer commented 10 years ago

Elon Musk put out a press release yesterday saying that all the Tesla patents can now be used for free when used "in good faith". That leads to the same problem I had with the "do no evil" license. Would it be in good faith for Ford to use the patents to change its complete lineup of cars to electric in 2 years? Yes? And if that ruins Tesla? Would this change if it crushes Tesla on purpose rather than by accident? Hm...

moonglum commented 10 years ago

Oath

I think we should do that. The longer I think about it, the more it makes sense. Big Companies:tm: will probably never do that, and we as developers have to move as much as we can. Because what has to happen is that people realize what they are trading when they, for example, get a Gmail account. That doesn't mean that this product is evil and no one should use it. It just means that everyone should understand that they trade something for this 'free' product. I tried to explain this in the last Nerdkunde and I think I failed miserably.

Responsibility

I agree that we should teach more about ethics. Maybe you should also take this into consideration with your Rails Girls Summer of Code coaching? :smile:

Tesla

Hm. I'm not sure how you could use that for evil. Even if you use it for tanks: it would still save the environment, wouldn't it? :wink: And of course another company could totally run over them, but the tone of his release is more like 'come at me, people', so I guess he doesn't care about that :wink: