iperov / DeepFaceLive

Real-time face swap for PC streaming or video calls
GNU General Public License v3.0

Stop developing this technology #41

Open alonsoir opened 2 years ago

alonsoir commented 2 years ago

This technology is only going to be used for evil purposes: to deceive people, steal their money and possessions, and run undercover operations. Nothing good. It is already being used for scams like pig butchering, in which hundreds of thousands of people around the world are being left without money, causing a lot of suffering.

HWiese1980 commented 1 year ago

@OpakAlex Right. There's no chance we will ever know whether a world without the wheel would have been better or worse. The only choice that we have, which is no actual choice at all, is to live with the fact that it has been invented and try to learn how to avoid people getting harmed as a consequence of the invention. The same applies to each and every other invention that is potentially harmful - and that's pretty much everything that we have. Life is inherently dangerous and inevitably deadly. We can't save everyone. We can just try to mitigate harm and prolong life as much as possible. And who knows? Maybe the technology behind deepfakes one day helps cure cancer or whatever, just like the wheel made many, many, many things easier and better, at the cost of tanks and occasional accidents.

iperov commented 1 year ago

People who were never confronted with the stress of new technology were left living in the Stone Age. Such tribes can be found in Africa. They still chase animals with spears.

OpakAlex commented 1 year ago

@iperov Can we ask them if they are happy? And ask the people we bombed for democracy how happy they are.

MiroslavPetrik commented 1 year ago

People who were never confronted with the stress of new technology were left living in the Stone Age. Such tribes can be found in Africa. They still chase animals with spears.

Like in the Avatar movie 😍

MariusPalys commented 1 year ago

Problems are there to be solved, not avoided. We have to learn as a species from the tech that we develop, instead of avoiding it. Many people died in car and bicycle accidents, killed by tanks. Should we not have invented the wheel in the first place? Every invention can be abused, every invention will be abused. We do not avoid that by avoiding the invention.

Please name one thing this tech/repo would be beneficial for, except deception and scams?

OpakAlex commented 1 year ago

Please name one thing this tech/repo would be beneficial for, except deception and scams?

We can see porno with any face ;)

HWiese1980 commented 1 year ago

@OpakAlex I bet they are happy, as happy as you can be from their point of view. They probably have their own issues, but I could imagine them being significantly happier than we are because their society is much simpler. I don't know though. I've never met one of them.

MariusPalys commented 1 year ago

We can see porno with any face ;)

I'm 99% sure this is the reason the whole development started. Still, an enormous point for the contra side of the list...

OpakAlex commented 1 year ago

@MysteryPancake But it's a big market. With this you can make fakes so easily: you can even call parents or ask friends for money, since they can see you on a video call. Right? And a lot of other things: put this on TV with some presidents and you can push a revolution so easily. I think our society is at a point where we need something like this, to destroy this world and go back to the Stone Age. Why not?

HWiese1980 commented 1 year ago

@MariusPalys Plausible deniability. Leaks like "The Fappening" will have much less of an impact if 99% of the material out there is convincingly faked anyway.

Furthermore, it's not just porn. It is now theoretically possible to produce movies with stars that aren't among the living anymore (see Star Wars Rogue One for instance). That's a new kind of business model for the surviving dependants.

Faultless commented 1 year ago

The problem is that competent people with high intellect have an unsupervised, unrestricted capacity to take us to the dystopian future we are all dreading. None of them will stop one morning and wonder about the ethical ramifications of their work, be it deepfakes or diffusion models. Unfortunately, they can always hide behind the "you're stopping progress, hah! How foolish you are with your lack of domain knowledge, going against the current" position. For this particular tech, we will just have to wait for a big enough scandal to put the lid on it, regardless of the fuming tech-bros having seizures over how unfair and anti-science it is that they can't make revenge pornographic content legally anymore.

HWiese1980 commented 1 year ago

@Faultless True, however, by making the technology available to the public, the problem of monopolization is mitigated. There are simply no distinct "competent people" anymore who could steer us all into the abyss. If we jump, we do it together.

chervonij commented 1 year ago

@MariusPalys

Please name one thing this tech/repo would be beneficial for, except deception and scams?

Media and entertainment, at least. Streamers, bloggers, some live events, etc. If we're talking about deepfakes in general, there is much more.

If you can't see technology as anything other than deception and scams, then you just have a limited imagination.

OpakAlex commented 1 year ago

@chervonij But when eco guys protest against nuclear plants, we say they are heroes. What is different if we say stop this project? I can tell you what: PROPAGANDA.

chervonij commented 1 year ago

@OpakAlex What are you talking about anyway? What eco guys? I don't know who calls them heroes, but anyway, they can do whatever they want, I don't care.

MariusPalys commented 1 year ago

@chervonij so how is impersonating another person/catfishing not deception?

OpakAlex commented 1 year ago

@MysteryPancake he just doesn't care ;) "I don't care"

HWiese1980 commented 1 year ago

@OpakAlex Propaganda is a big problem, yes. As always. And deepfakes will become just another means of propaganda too. I say it's all going to level out at some point. Like everything in nature, society will always strive towards some sort of equilibrium. On the one hand, propagandists will gain new tools. On the other, people will learn that these new tools exist and will stop trusting them. I think the current, relatively small amount of conspiracy bullshit roaming the world will eventually explode so much that it eats itself up. You simply can't maintain 8 billion conspiracy theories of which 99.9% contradict each other.

Or in short: we can't do much about it anyway. So I'll stick with "everything's going to turn out alright".

chervonij commented 1 year ago

@MariusPalys

It is only a deception if you claim to be that person.

HWiese1980 commented 1 year ago

@MariusPalys It is. No question about it. But because of the mere existence and overwhelming danger of it, we will learn new and hopefully better ways to deal with it. That's the point. We learn from challenges. We wouldn't have this challenge if we had avoided the development of the technology in the first place.

OpakAlex commented 1 year ago

@HWiese1980 Your point is good; just stopping development will not help. This or another repo/tech will come. I think society's challenge now is to learn how to use tools, and how to use tools to grow humanity; that is the problem. Some tools we need to give to society only at the right time. It's like the Stone Age example: if we went to that society now and gave them an AK-47 as a tool to kill animals, we would see how they'd use it to kill people to get more food, girls, etc. But if that society grows and learns (I'm not even sure we have learned not to kill people for things), then they would use the tool right.

chervonij commented 1 year ago

My point is that you can't blame technology because of how some people use it. Any technology can be used for good or bad.

You can use Photoshop to deceive people, too. Why don't we ban Photoshop? Or the internet in general? A lot of people have been deceived by it, too.

Most of you are just causing drama, without looking for a real solution to the problem. This is what happens when people spend too much time on Twitter.

MariusPalys commented 1 year ago

@HWiese1980

But because of the mere existence and overwhelming danger of it, we will learn new and hopefully better ways to deal with it. That's the point. We learn from challenges.

Ah ok, if only Kalashnikov or Oppenheimer had seen it that way, then we would have learned from our challenges and lived in peace. Right? Also this is quite a strange argument: "Look at this potentially evil tech we created. Now we have to learn a new way to deal with it. Good luck to those who can't lol"

But ultimately you are right: someone will eventually develop this tech. I'm just glad it is not me.

MariusPalys commented 1 year ago

You can use Photoshop to deceive people, too. Why don't we ban Photoshop?

You haven't gotten my point from the beginning... At least I personally can't see a reason to use this technology for good.

OpakAlex commented 1 year ago

"But ultimately you are right: someone will eventually develop this tech. I'm just glad it is not me." +1 for me

GeorgeMD commented 1 year ago

When you have "mrdeepfake" listed on the page of your repo you can't make the point that "tech is neutral" and "if we don't develop it, someone else will". You know it's not used for anything good, you know it's actually used for something that affects people, yet you keep developing it and acting like you're not doing anything wrong.

HWiese1980 commented 1 year ago

@MariusPalys

Ah ok, if only Kalashnikov or Oppenheimer had seen it that way, then we would have learned from our challenges and lived in peace. Right?

Right. And we do. The last 70 years of relative peace were only possible because we held the gun to each other's heads, ready to pull the trigger anytime, knowing that they'd still have enough time to react and pull the trigger on us as well. Mutual Assured Destruction is the keyword. It would not have happened without the bomb. We'd probably still be fighting each other in pointless wars instead of getting together at a table to negotiate how to avoid mutual destruction. That's even the case in the ongoing war with Russia. If it hadn't been for the nuclear weapons arsenal on both sides, we'd have full-blown WW3 now.

MariusPalys commented 1 year ago

@HWiese1980

So humanity bringing so much misery and destruction to each other that we literally have to stop or we would probably destroy the whole planet is the standard now? "Relative peace" also comes from a very privileged point of view, even if you phrase it that way. But yeah, again, I don't think you are wrong. It is just sad to see that we can't get to a table to negotiate how to avoid destruction without the preceding destruction. Be it in war or tech, it seems. Because instead of working on counter-mechanisms you just say: "Humanity is fucked, we are just contributing our part to it, baddies gonna be bad, so why shouldn't we be baddies as well."

iperov commented 1 year ago

It amuses me so much when people talk about how they can be fooled by deepfakes. At the same time, they read the mass media every day, where the news is either deceptive or presented in a different context so as to give a different impression of it. And I'm not even talking about the outright lies of officials.

JasSra commented 1 year ago

We should coin a term for something like AI-phobia or deepfake-phobia, lol. This is ChatGPT's response on what an AI-related phobia is called:

The phobia related to AI technologies is called "Technophobia." Technophobia is a fear or aversion to advanced technology or technological devices, especially computers and automation.

OpakAlex commented 1 year ago

@ianevski Good, we will wait for the first loan you can get just from a call ;) Or your face on a police camera ;)

devjev commented 1 year ago

In today's episode: scummy tech bros tie themselves in knots to pretend their porn & fraud AI is somehow a moral obligation.

shaunsingh commented 1 year ago

Ban this one and someone else will develop it. I don't see why an open source implementation isn't welcome. People have been doing this in the form of closed source apps for years

devjev commented 1 year ago

Ban this one and someone else will develop it. I don't see why an open source implementation isn't welcome. People have been doing this in the form of closed source apps for years

By that logic you really should get into human trafficking.

shaunsingh commented 1 year ago

I don't follow. Not sure if open source human trafficking exists.

Faultless commented 1 year ago

I truly love this rather contemporary notion of: "It could be worse / someone else would do it, therefore we get to do whatever the fk we want, with complete disregard for pushback related to ethics/morality/human dignity concerns." Truly remarkable that this is the mental level at which top talents operate.

devjev commented 1 year ago

I don't follow. Not sure if open source human trafficking exists.

Well, why don't you start? Surely making it open source will solve all of the moral problems with it. /s

wrq commented 1 year ago

I don't follow. Not sure if open source human trafficking exists.

Sure it does, it’s called military service 😂

florianseidlschulz commented 1 year ago

Make that argument for the nuclear bomb. I am waiting for the good side of those.

We averted several world wars because our leaders care about their own lives.

GeroBH commented 1 year ago

An axe is mainly used for chopping wood. It's also used for slaughtering chickens, and occasionally someone's skull gets split by it. That is not the fault of the axe. It's a tool, and it's the user who decides how to use it. I use this repo to create awareness. In live chats I was shocked to see that the term 'Deep Fake' is not known to many. First there was laughter, then curiosity, and then the knowledge that you should carefully judge what you see. A lesson learned. Next I will place Klaus Schwab's face on Xerxes in a clip from 300 and Anthony Fauci's on a clip from Chaplin's Dictator. I have no ethics/morality issues doing so and am thankful to have the tool to do it. In fact I see it as my ethical and moral obligation to use the tool this way.

That others use the tool to place the face of a celebrity on naked bodies and violate their dignity is not a problem with the tool.

tieje commented 1 year ago

... This will for sure be used for malicious purposes. We can't stop it forever. But we can slow it down. We need more time to mature as a society. End poverty first, which is the main driver of scamming. Then, in the post-scarcity economy, we can focus on things like this for fun and intellectual pursuit.

ArseniyShestakov commented 1 year ago

Don't stop developing this tech. Face swapping is a great way to generate content for games, as well as keeping anonymity for anyone who wants a visual avatar without showing their real face.

Also, malicious actors will always have money to pay for other apps, but an open source solution gives people access to experiment without needing a lump sum of money.

devjev commented 1 year ago

An axe is mainly used for chopping wood. It's also used for slaughtering chickens, and occasionally someone's skull gets split by it. That is not the fault of the axe. It's a tool, and it's the user who decides how to use it.

I use this repo to create awareness. In live chats I was shocked to see that the term 'Deep Fake' is not known to many. First there was laughter, then curiosity, and then the knowledge that you should carefully judge what you see. A lesson learned.

Next I will place Klaus Schwab's face on Xerxes in a clip from 300 and Anthony Fauci's on a clip from Chaplin's Dictator.

I have no ethics/morality issues doing so and am thankful to have the tool to do it.

In fact I see it as my ethical and moral obligation to use the tool this way.

That others use the tool to place the face of a celebrity on naked bodies and violate their dignity is not a problem with the tool.

That would've been credible if you could chop wood with DeepFace.

Honestly, pretending that this tool that's highly specialized for shitty use cases is OK because there are a few niche normal use cases is BS.

GeroBH commented 1 year ago

This will for sure be used for malicious purposes

This is exactly why this tool is important: to create awareness. The people who fell for the $10 million from Burundi by mail or email, or the grandmother who fell for the grandchild fraud by phone, could not be saved by banning mail or phones. Video-call your grandma once as Biden and tell her it's you, and she will never be scammed.

devjev commented 1 year ago

To everyone claiming that this tool is somehow important to counteract the negatives of this technology:

This repo is almost a year old, 320 commits in, 37 releases, and exactly 0 mentions of how to actually counteract malicious use of this tech.

At this point, a fart is more credible than any such arguments.

GeroBH commented 1 year ago

Everyone is entitled to their opinion. When insults and fecal language start replacing arguments, you know the exchange of views is over.

devjev commented 1 year ago

There is zero evidence here that it was ever the intention of this repo to somehow counteract the negative effects of this tech. Your mental gymnastics to convince people to disregard that is what stopped the "exchange of views".

shaunsingh commented 1 year ago

To go back to a previous analogy: an axe is sold with the promise of chopping wood. If users want to use it for other legal purposes, that's fine. If users want to use it for illegal purposes, that's on the user. If someone does something illegal with the axe, that's on the person who committed the crime, not the axe company. We shouldn't stop the worldwide sale of useful tools like axes just because you can use them for malicious purposes.

I can use this tool to deepfake a "clean" version of my face on days when I don't want to tidy up for a meeting; people with disorders or injuries can use it to look how they'd like online, regardless of their physical appearance. There are valid, good uses for this technology, and the authors mention them in their paper: https://arxiv.org/pdf/2005.05535.pdf. I encourage you to read these paragraphs in particular:

Numerous spoof videos synthesized by GAN-based face-swapping methods are published on YouTube and other video websites. Commercial mobile applications such as ZAO and FaceApp which allow general netizens to create fake images and videos effortlessly significantly boost the spreading of these swapping techniques, called deepfake. These content generation and modification technologies may affect public discourse quality and infringe upon the citizens' right of portrait, especially given that deepfake may be used maliciously as a source of misinformation, manipulation, harassment, and persuasion. Identifying manipulated media is a technically demanding and rapidly evolving challenge that requires collaborations across the entire tech industry and beyond.

Research on media anti-forgery detection is being invigorated and dedicating growing efforts to forgery face detection. DFDC is a typical example, a million-dollar competition launched by Facebook and Microsoft. Training robust forgery detection models requires high-quality fake data. Data generated by our methods are involved in the DFDC dataset [5].

However, detection after being attacked is not the unique manner for reducing the malicious influence of deepfake. It is always too late to detect spreading spoofing content. In our perspective, for both academia and the general public, helping netizens know what deepfake is and how a cinema-quality swapped video is generated is much better. As the old saying goes: "The best defense is a good offense". Making general netizens realize the existence of deepfake and strengthening their identification ability for spoof media published in social networks is much more critical than agonizing the fact whether spoof media is true or not.

The core technology behind this is quite simple. This repo is just one implementation, and as linked above, anyone can implement it themselves if they'd like.

The readme could certainly use a little more info on how this tech can be used for good though.

MariusPalys commented 1 year ago

It amuses me so much when people talk about how they can be fooled by deepfakes. At the same time, they read the mass media every day, where the news is [...] deceptive [...]

I find it quite ironic that you are the main maintainer of a repo which will - give it some time - become a tool that makes this soooo much easier...

Also, I'm aware that in this early state deepfakes will not be able to fool most people, or trained people. But you're being naive if you don't think that, with advancing technology and (if maintained) a better algorithm, spotting a fake will become much harder. Or it will take much less input to train. Give it 5 years. Maybe more, and we'll talk again about this point.

And just to clarify: I'm not for closing down or stopping this technology. Do what you please. It is fascinating for sure. I just could not leave unanswered some opinions from the team or followers, which are: a) there are way more good uses for this than bad ones, and b) yeah I know, but there are bad people out there, so what can you do.

It is more about the "with great power comes great responsibility something something" part 🕸️

HWiese1980 commented 1 year ago

Several points...

  1. Developing and open sourcing this tool (hopefully) raises an awareness that is desperately needed in the age of generative AI that is inevitably going to come.
  2. Publishing it as open source is the only way to make sure it does not fall into the wrong hands - not exclusively, at least - and does not go unnoticed.
  3. "If we don't do it, someone else will" is indeed a fucked up argument. I don't like it either. However, it would be dangerously naive to think it wouldn't be that way. So wrt points 1 and 2 I say, we are at least the goodest of the bad, because we allow (even if not particularly intended in the first place) the public to participate and gain awareness. We don't hide this bomb behind a wall of silence, regardless of its initial purpose.
  4. "With great power comes great responsibility" - exactly! And how better could that responsibility be distributed than completely open source and publicly, so that we as a society can learn from the impact and the consequences? What better hands would there be than (theoretically) all the people in the world? It can hardly get any better. Hoping for some altruistic, unbiased company or NGO that would not abuse this technology to gain an individual advantage in whatever way is dangerously naive. The most unbiased (on average) NGO is the general public.
  5. Yes, it will be used for porn, it will be used for propaganda, it will be used for bad things, maybe even primarily. It is going to hurt, and a lot! But so does touching a hot cooktop. The first time hurts like hell, but haven't we learned to be appropriately careful with hot cooktops? We'll be much more aware of the risk of awful pain the next time around. We are children, all of us, we found a lighter, and there's hardly anyone who can stop us from burning down the house. Will we burn it down? Will there be firefighters to stop the fire? Will we be able to rebuild everything from the ashes afterwards? Or will we - as a society - even be adaptive enough to not light up the curtain and realize what a dangerous and marvelous tool we have at our disposal? I have no idea. But I am brave enough to say, let's go and find out!