CorentinJ / Real-Time-Voice-Cloning

Clone a voice in 5 seconds to generate arbitrary speech in real-time

Ethical usage section in README #95

Closed dancrew32 closed 5 years ago

dancrew32 commented 5 years ago

Hey! First off, I want to say this is amazing work.

As this gains popularity, I'm positive tech like this will be used for abusive reasons. It might be worth suggesting ethical and unethical usage examples.

I see that deepfakes/faceswap has a similar section in their README: https://github.com/deepfakes/faceswap#faceswap-has-ethical-uses

Thanks for your time and consideration on this.

CorentinJ commented 5 years ago

I don't like the idea. I suspect my answer is not going to be popular, so feel free to try and change my mind.

First off, you won't do anything that is too good with this repo. It's a proof of concept, and so far the work of a single master's student. The generated voice sounds OK, but I don't see it fooling anyone as it is. Especially given the lack of style control, you're often going to end up with a very monotonic voice. Asking people not to use this to do evil sounds like unworthy, sensational PR. If I get solid contributions from other developers, then maybe I will reconsider this.

Secondly, if you have bad intentions but you change your mind because I politely asked you not to act on them, then you need to re-evaluate your villain career. Yeah, faceswap has that fancy paragraph that almost brings a tear to my eye, but how has that been working out for them? I have no way of verifying that you're going to use this repo without bad intent, and I don't see myself closing the doors on valuable work simply because a few want to abuse it. Again, this is considering my first point. If this were the perfect voice cloning software, I'd reconsider it à la OpenAI.

flarn2006 commented 4 years ago

I think people need to stop relying on human limitations, like the inability to perfectly duplicate someone else's voice, or on the unavailability of technology to do certain things. Or, if they do rely on them, they at least need to accept that they're taking a risk by doing so. Sure, it's better if people don't use stuff like this with bad intentions (or use anything with bad intentions, for that matter), but at least there's a silver lining: the more people who do, the more society realizes the risk it was taking. And hopefully, the more things like that happen, the more people will get used to these kinds of technological barriers dissolving, and will stop thinking and acting as though they somehow have a responsibility to keep said barriers in place, or as though they're responsible for how other people use their work.

I loved how the whole Deepfakes controversy went, and I have nothing but respect for the Fakeapp developer for letting his software into the wild, instead of continuing to delay the inevitable. Same for you. :+1: