• JackGreenEarth@lemm.ee · 10 months ago

    You can’t stop them from being made; they’re the same deepfakes people have been making all along. It’s important to note that they’re not photos of real people, they’re guesses made by an algorithm.

    • strider@feddit.nl · 10 months ago

      While you’re completely right, that’s hardly a consolation for those affected. The damage is done even if the images aren’t real, because they will be convincing enough for at least some viewers.

        • ParsnipWitch@feddit.de · 10 months ago

          I think the people who made the pictures have to face consequences. Otherwise it sends the message that behaving this way is fair game.

    • InternetTubes@lemmy.world · 10 months ago

      If governments can go after child porn, then they can go after the websites generating this material and the people distributing it.

      I’m sick of services that will generate whatever people ask of them with zero oversight or control, especially when it involves deepfakes. Once deepfakes become convincing enough, societies will turn into a race to distribute whichever fakes serve the prejudices of the times, and people will eat them up.

      That already happens in societies without deepfakes: even people who disagree with the mainstream still absorb the prejudices present in their society’s media, and they don’t really become aware of it until they try living outside that society for a while.

      Deepfakes will be like steroids for ideological bubbles once the technology crosses the uncanny valley.

      • Cethin@lemmy.zip · 10 months ago

        This stuff can be run locally. It’s not something that can be stopped by going after some service that provides it; that might make it slightly less convenient to access, but anyone who wants it will still find it. Pandora’s box has been opened, and it can’t be closed.
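
        To be concrete about “run locally”: here’s a minimal sketch using the open-source diffusers library, assuming the diffusers and torch packages are installed and a consumer GPU is available. The model ID and prompt are just illustrative.

        ```python
        # Sketch: running an open-source image generator entirely on a local machine.
        import torch
        from diffusers import StableDiffusionPipeline

        # The model weights are downloaded once, then cached locally;
        # after that, no server or external service is involved at all.
        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",
            torch_dtype=torch.float16,
        ).to("cuda")  # a single consumer GPU is enough

        # Generation happens with zero external oversight.
        image = pipe("a watercolor painting of a lighthouse").images[0]
        image.save("output.png")
        ```

        The point isn’t this specific model; it’s that nothing in the pipeline touches a service anyone could shut down or regulate.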

        • Touching_Grass@lemmy.world · 10 months ago

          The goal isn’t to stop deepfakes of random people. It’s to limit regular people’s access to AI so it can be hoarded by select groups. Using threats against children to stir up the masses is the oldest play in history. The upper crust needs to make laws against how the rest of us use these tools.

        • InternetTubes@lemmy.world · 10 months ago

          You can also have your hard drive loaded with child porn locally, and it wouldn’t be any less illegal.

          • Cethin@lemmy.zip · 10 months ago

            Sure, it’s illegal, but they can’t do anything about it unless you do something else wrong. I wish they could magically detect where that content was, but they need a search warrant to find it. Talking about stopping this software will lead to nothing; sharing this content (real or generated) is where attention should be focused.

        • InternetTubes@lemmy.world · 10 months ago

          Pretty sure some places do ban carrying weapons like swords and machetes in public. Another thing I’m sick of is people who act as if there’s no middle ground, precedent, or possibility of nuance, reducing everything to gross caricatures.

          • Touching_Grass@lemmy.world · 10 months ago

            Before you can operate any AI, you’ll need a license and will have to inform the government what you intend to develop with it.

    • n0m4n@lemmy.world · 10 months ago

      The faces are not generated, and that is where the damage comes from. It targets the girls for humiliation by implying that they allowed the nudes to be taken. Depending on the location and circumstances, this could get the girls murdered; think of “honor killings” by fundamentalists. It makes them targets for further sexual abuse, too. Anyone distributing the photos is at fault, as are the people who made them.

      The problem goes deeper, though. We can never again trust a photo as proof of anything. Let that sink in, and consider what that means for society.

    • maegul@lemmy.ml · 10 months ago

      To push back on your attempt to minimise what’s going on here …

      Yes, they’re not actually photos of the girls. But nor is a photo of a naked person actually the same as that person standing in front of you naked.

      If being seen naked is unwanted and embarrassing, why should a photo of you naked be embarrassing, and, to make my point, what difference does it make if the photo is more or less realistic? An actual photo can be processed, taken under certain lighting, shot with a certain lens, or simply be from some time in the past … all factors that lessen how closely it matches the subject’s current naked appearance. How unrealistic can a photo be before it’s no longer embarrassing?

      Psychologically, I’d say it’s pretty obvious that the embarrassment of a naked image comes from someone else now having a relatively concrete picture in their mind of what the subject looks like naked. It is a way of being seen naked by proxy. A drawn or painted image could probably have the same effect.

      There’s probably some range of realism within which there’s an embarrassing effect, and I’d bet AI is very capable of getting in that range pretty easily these days.

      While the technology is out there now … that doesn’t mean our behaviours with it are automatically acceptable. Society adapts to the uses and abuses of each new technology, and it seems pretty obvious that we’ve yet to culturally curb the abuses of this one.

        • Adlach · 10 months ago

          Is there a reason you didn’t have time to read, but did have time to comment that you didn’t read and make yourself look like an asshole?

    • Rayspekt@kbin.social · 10 months ago

      Exactly, the technology is out there and will not cease to exist. Maybe in the future we’ll digitally sign our photos so that deepfakes can be filtered out that way.
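
      The bare mechanism could look something like this sketch (Python with the cryptography package; in practice the key pair would live in the camera’s secure hardware rather than being generated inline, and a real provenance scheme such as C2PA carries much more metadata):

      ```python
      # Sketch: sign the raw image bytes at capture time; any later edit
      # (including a deepfake face swap) invalidates the signature.
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      # Generated here for illustration; a camera would keep this in secure hardware.
      private_key = Ed25519PrivateKey.generate()
      public_key = private_key.public_key()

      photo_bytes = open("photo.jpg", "rb").read()
      signature = private_key.sign(photo_bytes)  # shipped as metadata with the file

      # Anyone holding the public key can check the bytes are unchanged since signing.
      try:
          public_key.verify(signature, photo_bytes)
          print("signature valid: photo unmodified")
      except InvalidSignature:
          print("signature invalid: photo was altered")
      ```

      Of course, this only proves a photo hasn’t been modified since it was signed; it can’t prove the signed content was real in the first place.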