https://www.duluthnewstribune.com/news/minnesota/deepfake-porn-election-disinformation-move-closer-to-being-crimes-in-minnesota got me thinking about the issue.

25 comments
  1. This subreddit is for civil discussion; political threads are not exempt from this. As a reminder:

    * Do not report comments because they disagree with your point of view.

    * Do not insult other users. Personal attacks are not permitted.

    * Do not use hate speech. You will be banned, permanently.

    * Comments made with the intent to push an agenda, push misinformation, soapbox, sealion, or argue in bad faith are not acceptable. If you can’t discuss a topic in good faith and in a respectful manner, do not comment. **Political disagreement does not constitute pushing an agenda.**

    If you see any comments that violate the rules, **please report them and move on!**

    *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/AskAnAmerican) if you have any questions or concerns.*

  2. I can’t open this because I’m at work, someone please tell me who won the deepfake porn election

  3. Free speech!

    I think wholesale making it a crime is a terrible idea; they should require, like, watermarking or something.

    I don’t think there’s really any good way to handle it.

  4. With the exception of its use in a court case, deepfakes should be treated like libel/slander cases.

  5. I think they should just be illegal across the board. I can’t see a middle ground with this. I am absolutely not okay with anyone having porn made with their likeness who hasn’t consented to it; it’s traumatizing and even life-ruining.

    I would also like to see a gender-split survey of people’s opinions on porn deepfakes. I suspect there is a very strong male/female split in opinion (isn’t it curious how the group less likely to be targeted by it is generally more lax about it than the group significantly more likely to be affected?).

    As for political ones, I guess I can’t see any logical reason not to apply the same rules as porn. It may not be as traumatizing, but the point remains that you don’t have the freedom to just manipulate a person’s likeness however you please; we already have libel laws, and this should fall under that category. And god knows the political landscape doesn’t need any more disinformation than it already has.

  6. I don’t see how it should be legal to do that to someone. I’d honestly feel sexually violated in some way, and I can see how it could ruin someone’s life. It should be treated similarly to slander.

  7. I mean, if you present a fake video as real, it could be construed as defamatory if the people depicted in the video are presented in a negative way. This is already the case for videos edited the old-fashioned way.

    We already have laws against defamation, fraud, etc. There is no need for new laws about “deepfakes,” which, while visually verisimilitudinous, are still easily proven not to be real.

  8. Sexual and obscene art and pornography have been protected by the courts over and over, so I don’t see how creating a deepfake or having one in your possession could be made illegal. Deliberately using one to blackmail, or to otherwise harm the reputation of a person, is already a crime.

    The big problem with any law regarding the creation of a deepfake will be proving the video is not real, since the quality is only going to improve over time.

  9. This is just one of those things that’s impossible to actually outlaw. Good luck to anybody in this thread suggesting that you can successfully outlaw anything someone with access to a moderately powerful computer can do anonymously. I’d bet it’ll eventually fall into the same category as online piracy: still illegal on paper, but good luck actually prosecuting anybody for it given how easy and anonymous it is.

  10. Any law would be unenforceable considering the barrier to entry is a decent PC and some free time.

    Also, the SCOTUS has consistently proven its willingness to uphold pornography as a 1st amendment issue. I don’t see how fake porn is ever going to be penalized more than real porn.

    Libel/slander might be appropriate, but the person saying something has to be knowingly trying to pass it off as truth, which is already hard enough to prove with regular media. It’s basically impossible to prove that kind of intent if someone is just posting a video on the internet.

  11. It would be impossible to ban deepfakes outright, especially since Hollywood is starting to use them for VFX.

    The only realistic thing I could see would be to put them under revenge porn laws or something similar.

  12. I don’t know if you need any. I mean, if you use it for personal use or what would be considered fair use (parody, etc.) under current law, I don’t see a problem with it. If you use it for nefarious purposes, it would probably be fraud or libel/slander under current law.

  13. I think it’s a complex issue, and people here might be acting a little heavy-handed.

    I think it’s one thing if you’re attempting to use a deepfake to commit fraud but struggle to see who the victim is if I ask AI to generate a video of Helen Mirren getting fucked by Kermit the Frog.

  14. I think we should adapt existing laws about libel, slander, misappropriation of personal image, etc., to provide harsher penalties when a deepfake is used to deliberately deceive, gain political advantage, or serve some other nefarious purpose. I also think there should be strict penalties for people who create deepfake porn of others without their consent. I won’t try to put it on the level of a physical sexual assault, but it still must be horribly violating.

  15. The only restriction is that you should not be able to use them to intentionally spread disinformation or slander someone, and probably not to create sexual content without the person’s permission either.

  16. Existing laws are in place for slander and libel.

    There is digital watermarking technology as well as new AI techniques for determining if videos are genuine. We should encourage media companies to watermark videos as a first step.
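
    A minimal sketch of what provenance-style watermarking could look like (hypothetical: real systems such as C2PA embed asymmetric signatures in the file’s metadata, but the idea is the same: publish a tag that only the media company can mint, and let anyone check a file against it):

    ```python
    import hashlib
    import hmac

    # Hypothetical publisher key; a real deployment would use an asymmetric
    # signing key (e.g. Ed25519) held by the media company, not a shared secret.
    PUBLISHER_KEY = b"example-secret-key"

    def watermark_tag(video_path: str) -> str:
        """Compute a provenance tag over the raw video bytes."""
        digest = hashlib.sha256()
        with open(video_path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
                digest.update(chunk)
        # HMAC the content hash so only the key holder can mint valid tags.
        return hmac.new(PUBLISHER_KEY, digest.digest(), hashlib.sha256).hexdigest()

    def verify(video_path: str, published_tag: str) -> bool:
        """True only if the file still matches the tag the publisher released."""
        return hmac.compare_digest(watermark_tag(video_path), published_tag)
    ```

    Any edit to the file, including a deepfaked substitution, changes the hash and fails verification; what a scheme like this can’t do is prove that an untagged video is fake.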

  17. Seems like it should fall under existing laws against criminal impersonation/personation, libel, slander, defamation & fraud. Maybe some of those laws should be modified to expressly include deepfakes that haven’t been consented to.

  18. Not convinced this needs to be treated any differently than Photoshop, which is to say with very little regulation, if any.

    In fact, it may violate the First Amendment to do so.

  19. If someone can get a signed agreement from a person to make a deepfake video using their image, then it could be legal. Where that video could be used, and under what conditions, must also be spelled out in the contract. Otherwise it is an invasion of privacy and must be considered a crime.

  20. We didn’t need a law about photoshopped images, and we don’t need one for deepfakes.

  21. There shouldn’t be any laws regulating it.

    If making fakes is illegal, a large percentage of the population will assume that any video they see must be real.

  22. No one cares yet because so far its harm has only been felt by women. Once deepfake porn expands past women to making men look like pedophiles and/or rapists, people will start to care and enact laws against AI porn and deepfakes.
