Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual property critique to target AI development.

  • Dark Arc@lemmy.world

    It’s 100% a new problem. There’s established precedent for things costing different amounts depending on their intended use.

    For example, buying a consumer copy of a song doesn’t give you the right to play that song in a stadium or a restaurant.

    Training an entire AI to make a potentially infinite number of derived works from your work is 100% worthy of requiring a special agreement. This goes beyond simple payment to consent: a climate expert might not want their work in an AI that might severely mischaracterize the conclusions, or might want to require that certain queries are regularly checked by a human, etc.

    • bh11235@infosec.pub

      Well, fine, and I can’t fault new published material having a “no AI” clause in its terms of service. But that doesn’t mean we get to dream this clause into being retroactively for all the works ChatGPT was trained on. Even the most reasonable law in the world can’t be enforced on someone who broke it 6 months before it was legislated.

      Fortunately the “horses out the barn” effect here is maybe not so bad. Imagine the FOMO and user frustration when ToS & legislation catch up and now ChatGPT has no access to the latest books, music, news, research, everything. Just stuff from before authors knew to include the “hands off” clause - basically like the knowledge cutoff, but forever. It’s untenable, OpenAI will be forced to cave and pay up.

      • DandomRude@lemmy.world

        OpenAI and such being forced to pay a share seems far from the worst scenario I can imagine. I think it would be much worse if artists, writers, scientists, open source developers and so on were forced to stop making their works freely available because they don’t want their creations to be used by others for commercial purposes. That could really mean that large parts of humanity would be cut off from knowledge.

        I can well imagine copyleft gaining importance in this context. But this form of licencing seems pretty worthless to me if you don’t have the time or resources to sue for your rights - or even to deal with the various forms of licencing you need to know about to do so.

        • kklusz@lemmy.world

          I think it would be much worse if artists, writers, scientists, open source developers and so on were forced to stop making their works freely available because they don’t want their creations to be used by others for commercial purposes.

          None of them are forced to stop making their works freely available. If they want to voluntarily stop making their works freely available to prevent commercial interests from using them, that’s on them.

          Besides, that’s not so bad to me. The rest of us who want to share with humanity will keep sharing with humanity. The worst case imo is that artists, writers, scientists, and open source developers cannot take full advantage of the latest advancements in tech to make more and better art, writing, science, and software. We cannot let humanity’s creative potential be held hostage by anyone.

          That could really mean that large parts of humanity would be cut off from knowledge.

          On the contrary, AI is making knowledge more accessible than ever before to large parts of humanity. The only other comparable technologies that have done this in recent times are the internet and search engines. Thank goodness the internet enables piracy that allows anyone to download troves of ebooks for free. I look forward to AI doing the same on an even greater scale.

          • Flying Squid@lemmy.world

            Shouldn’t there be a way to freely share your works without having to expect an AI to train on them and then be able to spit them back out elsewhere without attribution?

            • kklusz@lemmy.world

              No, there shouldn’t because that would imply restricting what I can do with the information I have access to. I am in favor of maintaining the sort of unrestricted general computing that we already have access to.

          • CmdrShepard@lemmy.one

            The rest of us who want to share with humanity will keep sharing with humanity. The worst case imo is that artists, writers, scientists, and open source developers cannot take full advantage of the latest advancements in tech to make more and better art, writing, science, and software. We cannot let humanity’s creative potential be held hostage by anyone.

            You’re not talking about sharing it with humanity, you’re talking about feeding it into an AI. How is this holding back the creative potential of humanity? Again, you’re talking about feeding and training a computer with this material.

      • CmdrShepard@lemmy.one

        Even the most reasonable law in the world can’t be enforced on someone who broke it 6 months before it was legislated.

        Sure it can. Just because it is a new law doesn’t mean they get to continue benefiting from IP ‘theft’ forever into the future.

        Imagine the FOMO and user frustration when ToS & legislation catch up and now ChatGPT has no access to the latest books, music, news, research, everything. Just stuff from before authors knew to include the “hands off” clause

        How is this an issue for the IP holders? Just because you build something cool or useful doesn’t mean you get a pass to do what you want.

        basically like the knowledge cutoff, but forever. It’s untenable,

        Untenable for ChatGPT maybe, but it’s not as if it’s the end of ‘knowledge’ or the end of AI. It’s just a single company product.

    • cerevant@lemmy.world

      My point is that the restrictions can’t go on the input; they have to go on the output - and we already have laws that govern such derivative works (or reuse / rebroadcast).

    • bouncing@partizle.com

      The thing is, copyright isn’t really well-suited to the task, because copyright concerns itself with who gets to, well, make copies. Training an AI model isn’t really making a copy of that work. It’s transformative.

      Should there be some kind of new model of remuneration for creators? Probably. But it should be a compulsory licensing model.

      • jecxjo@midwest.social

        The slippery slope here is that we are currently considering humans and computers to be different because (something someone needs to actually define). If you say “AI read my book and output a similar story, you owe me money” then how is that different from “Joe read my book and wrote a similar story, you owe me money.” We have laws already that deal with this but honestly how many books and movies aren’t just remakes of Romeo and Juliet or Taming of the Shrew?!?

        • bouncing@partizle.com

          If you say “AI read my book and output a similar story, you owe me money” then how is that different from “Joe read my book and wrote a similar story, you owe me money.”

          You’re bounded by the limits of your flesh. AI is not. The $12 you spent buying a book at Barnes & Noble was based on the economy of scarcity that your human abilities constrain you to.

          It’s hard to say that the value proposition is the same for human vs AI.

          • jecxjo@midwest.social

            We are making an assumption that humans do “human things”. If I wrote a derivative work of your $12 book, does it matter that the way I wrote it was to use a pen and paper and create a statistical analysis of your work and find the “next best word” until I had a story? Sure, my book took 30 years to write, but if I followed the same math as an AI, would that matter?

            • BartsBigBugBag@lemmy.tf

              It’s not even looking for the next best word. It’s looking for the next best token. It doesn’t know what words are. It reads tokens.
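
              A minimal illustrative sketch of what “token” means here, using OpenAI’s open-source tiktoken tokenizer (the encoding name below is just an example choice, not something specified in this thread):

              # Sketch: show that an LLM's vocabulary is subword tokens, not whole words.
              import tiktoken  # pip install tiktoken

              # "cl100k_base" is one example byte-pair encoding; others exist.
              enc = tiktoken.get_encoding("cl100k_base")

              text = "Thousands of authors demand payment"
              token_ids = enc.encode(text)

              # Each id maps to a fragment, often a piece of a word or a word with its
              # leading space attached, rather than a whole dictionary word.
              for t in token_ids:
                  print(t, repr(enc.decode([t])))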

              • jecxjo@midwest.social

                Good point.

                I could easily see laws created that blanket-outlaw computer-generated output derived from human-created data sets, and suddenly medical and technical advancements stop because the laws were written by people who don’t understand what is going on.

            • bouncing@partizle.com

              It wouldn’t matter, because derivative works require permission. But I don’t think anyone’s really made a compelling case that OpenAI is actually making directly derivative work.

              The stronger argument is that LLMs are making transformative work, which is normally fair use, but should still require some form of compensation given the scale of it.

              • jecxjo@midwest.social

                But no one is complaining about publishing derived work. The issue is that “the robot brain has full copies of my text and anything it creates ‘cannot be transformative’”. This doesn’t make sense to me because my brain made a copy of your book too; it’s just really lossy.

                I think right now we have definitions for the types of works that only loosely fit human actions mostly because we make poor assumptions of how the human brain works. We often look at intent as a guide which doesn’t always work in an AI scenario.

                • bouncing@partizle.com

                  Yeah, that’s basically it.

                  But I think what’s getting overlooked in this conversation is that it probably doesn’t matter whether it’s AI or not. Either new content is derivative or it isn’t. That’s true whether you wrote it or an AI wrote it.

                  • jecxjo@midwest.social

                    I agree with that, but do politicians and judges who know absolutely nothing about the subject?

                    I had a professor in college who taught cyber security. He was renowned in his field and was asked by the RIAA to testify in some cases related to file sharing. I lost respect for him when he intentionally refrained from stating that it wasn’t possible for anyone outside of the home network to know what or who was actually downloading stuff. The technology was being ignored, and an invalid view was presented to a judge who couldn’t ELI5 how the internet worked, let alone actual networking protocols.

        • Square Singer@feddit.de

          Well, Shakespeare has been dead for a few years now; there’s no copyright to speak of.

          And if you make a book based on an existing one, then you totally need permission from the author. You can’t just e.g. make a Harry Potter 8.

          But AIs are more than happy to do exactly that. Or to even reproduce copyrighted works 1:1, or with only a few mistakes.

          • Phlogiston@lemmy.world

            If a person writes a fanfic Harry Potter 8, it isn’t a problem until they try to sell it or distribute it widely. I think where the legal issues get sticky here is who caused a particular AI-generated Harry Potter 8 to be written.

            If the AI model attempts to block this behavior with contract stipulations and guardrails, and if it isn’t advertised as “a Harry Potter generator” but instead as a general-purpose tool, then reasonably the legal liability might be on the user who decides to do this rather than on the tool that makes such behavior possible.

            Hypothetically, what if an AI was trained that never read Harry Potter, but it’s pretty darn capable, and I feed the entire Harry Potter novels into it as context in my prompt and then ask it to generate an eighth story: is the tool at fault, or am I?

            • Square Singer@feddit.de

              Fanfic can actually be a legal problem. It’s usually not prosecuted, because it harms the brand to do so, but if a company was doing that professionally, they’d get into serious hot water.

              Regarding your hypothetical scenario: If you train the AI with copyrighted works, so that you can make it reproduce HP8, then you are at fault.

              If the tool was trained with HP books and you just ask really nicely to circumvent the protections, I would guess the tool (=> its creators) would certainly be at fault (since it did train on copyrighted material and the protections were obviously not good enough), and at the latest when you reproduce the output, you are too.

          • jecxjo@midwest.social

            It seems like people are afraid that AI can do it when I can do it too. But their reason for freaking out is…??? It’s not like AI is calling up publishers trying to get Harry Potter 8 published. If I ask it to create Harry Potter 1 but change his name to Gary Trotter, it’s not the AI that is doing something bad; it’s me.

            That was my point. I can memorize text, and it’s only when I play it off as my own that it’s wrong. No one cares that I memorized the first chapter and can recite it if I’m not trying to steal it.

            • Square Singer@feddit.de

              That’s not correct. The issue is not whether you play it off as your own, but how much the damages are that you can be sued for. If you recite something that you memorized in front of a handful of friends, the damages are non-existent and hence there is no point in suing you.

              But if you give a large commercial concert and perform a cover song without permission, you will get sued, no matter if you say “This song is from <insert original artist> and not from me”, because it’s not about giving credit, it’s about money.

              And regarding getting something published: This is not so much about big name art like Harry Potter, but more about people doing smaller work. For example, voice actors (both for movie translations and smaller things like announcements in public transport) are now routinely replaced by AI that was trained on their own voices without their permission.

              Similar story with e.g. people who write texts for homepages and ad material. Stuff like that. And that has real-world consequences already now.

              • jecxjo@midwest.social

                The issue is not whether you play it off as your own, but how much the damages are that you can be sued for.

                I think that’s one and the same. I’m just not seeing the damages here, because the output of the AI doesn’t go any further than being AI output without a further human act. Authors are idiots if they claim “well, someone could ask ChatGPT to output my entire book and you could read it for free.” If you want to go after that type of crime, then have ChatGPT report the users asking for it. If your book is accessible via a library, I don’t see any difference between you asking ChatGPT to write in someone’s style and asking me to write in their style. If you ask ChatGPT for lines verbatim, I can recite them too. I don’t know what legitimate damages they are claiming.

                For example, voice actors

                I think this is a great example, but again I feel like the law is not only lacking but would need to outlaw other human acts not currently considered illegal.

                If you do impressions, you’re mimicking the tone, cadence, and selection of language someone else uses. You aren’t recording them and playing back the recording; you are using your own voice box to create a sound similar to the celebrity’s. An AI sound generator isn’t playing back a recording either. It measures tone, cadence, and language used and creates a new sound similar to the celebrity. The only difference here is that the AI would be more precise than a human’s ability to use their voice.

      • Fedizen@lemmy.world

        Challenge level impossible: try uploading something long to Amazon written by ChatGPT without triggering the plagiarism detector.