One prominent author responds to the revelation that his writing is being used to coach artificial intelligence.

By Stephen King

Non-paywalled link: https://archive.li/8QMmu

    • Storksforlegs@beehaw.org · 1 year ago

      You're right, but this is Stephen King; when is he not putting out a new book?

      Also, he's not struggling for attention; he's probably America's most famous author. He speaks out about stuff all the time, so I don't think it's fair to write off his opinion on this as solely a publicity stunt.

    • Echo Dot@feddit.uk · 1 year ago

      Just as long as you don’t in any way absorb the information into your brain it’s okay.

      I understand what authors are trying to say, but legally I don’t quite get how they can say that an AI is doing anything other than reading the books, which obviously they’re allowed to do.

      It is not as if they are making the books publicly available for free, and it’s not as if writing in the style of another author is illegal, so I’m not quite sure what law has been broken here.

      • RyanHeffronPhoto@kbin.social · 1 year ago

        It's baffling to me to see comments like this, as if the 'AI' were some natural intelligence just hanging out, reading books it's interested in for the hell of it. No. These are software companies illegally using artists' works (which require licensing for commercial use) to develop a commercial, profit-generating product. Whatever the potential outputs of the AI are is irrelevant when the sources used to train it were obtained illegally.

        • admiralteal@kbin.social · 1 year ago

          Yeah, and even if it WERE truly intelligent – which these SALAMIs are almost certainly not – it doesn’t even matter.

          A human and a robot are not the same. They have different needs and must be afforded different moral protections. Someone can buy a book, read it, learn from it, and incorporate things they learned from that experience into their own future work. They may transform it creatively, or they may plagiarize it, or the result may rest in some grey area in between where it isn't 100% clear whether it was novel or plagiarized. All this is also true for an LLM "AI". But whether or not this process is fundamentally the same isn't even a relevant question.

          Copyright law isn’t something that exists because it is a pure moral good to protect the creative output of a person from theft. It would be far more ethical to say that all the outputs of human intellect should be shared freely and widely for all people to use, unencumbered by such things. But if creativity is rewarded with only starvation, creativity will go away, so copyright exists as a compromise to try and ensure there is food in the bellies of artists. And with it, we have an understanding that there is a LOT of unclear border space where one artist may feed on the output of another to hopefully grow the pot for everyone.

          The only way to fit generative bots into the philosophical framework of copyright is to demand that the generative bots keep food in the bellies of the artists. Currently, they threaten it. It’s just that simple. People act like it’s somehow an important question whether they “learn” the same way people do, but the question doesn’t matter at all. Robots don’t get the same leeway and protection afforded to humans because robots do not need to eat.

          • Storksforlegs@beehaw.org · 1 year ago

            Robots don’t get the same leeway and protection afforded to humans because robots do not need to eat.

            Well said.

        • FaceDeer@kbin.social · 1 year ago

          These are software companies illegally using artists works

          There is nothing illegal about what they’re doing. You may want it to be illegal, but it’s not illegal until laws are actually passed to make it illegal. Things are not illegal by default.

          Copyright only prevents copying works, not analyzing them. The results of the analysis are not the same as the original work.

          • RyanHeffronPhoto@kbin.social · 1 year ago

            It is illegal. As an artist, if another individual or company wants to use my work for their own commercial purposes in any way, even if just to ‘analyze’ (since the analysis is part of their private commercial product), they still need to pay for a license to do so. Otherwise it’s an unauthorized use and theft. Copyright doesn’t even play into it at that point, and would be a separate issue.

            • FaceDeer@kbin.social · 1 year ago

              No, it’s not. Something that is merely in the style of something else is not a derivative work. If that were the case there’d be lawsuits everywhere.

              • anachronist@midwest.social · 1 year ago

                LLMs regurgitate their training set. This has been proven many times. In fact, from what I've seen, LLMs are either regurgitating or hallucinating.

                there’d be lawsuits everywhere

                Early days.

      • Phanatik@kbin.social · 1 year ago

        LLMs have been caught plagiarising works, by the simple nature of how they function: they predict the next word based on the context of the previous words. They're very good at constructing sentences, but the issue is often "where is it getting its information from?" Authors never consented to their works being fed into an optimisation algorithm, and neither did artists when DALL-E was created.

        For authors, you buy the book and thus the author is paid, but that's not what happened with ChatGPT.
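The next-word prediction described above can be sketched with a toy bigram model (illustrative Python only; real LLMs are neural networks over tokens, not word-count tables, so this is just a caricature of the idea):

```python
import random
from collections import defaultdict

def train(text):
    """Count which word follows which in the training text."""
    model = defaultdict(list)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def generate(model, start, length=10):
    """Repeatedly 'predict the next word' by sampling from the counts."""
    out = [start]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break  # no known continuation of this word
        out.append(random.choice(followers))
    return " ".join(out)

# Toy training corpus: the famous opening line of The Gunslinger.
corpus = "the man in black fled across the desert and the gunslinger followed"
model = train(corpus)
print(generate(model, "the"))
```

With such a tiny corpus the model can only regurgitate spans of its training text verbatim; at real scale the statistics blend across billions of documents, but the training data remains the only source of what the model can emit.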

        • Echo Dot@feddit.uk · 1 year ago

          Authors never consented to their works being fed into an optimisation algorithm

          Yeah, I know they didn't, but at worst the company owes them 30 bucks for the licence. I don't think copyright law gives authors the right to say who can buy their works, so at the absolute worst, the AI companies stole a book.

          To be clear, I'm not saying that this should be allowed; I'm just saying that under the current legal system I'm not sure they actually committed that much of a crime. Obviously it needs to be updated, but you do that through political reform (and good luck with that, because AI is big bucks), not through the courts.

          • Phanatik@kbin.social · 1 year ago

            Copyright Law doesn’t talk about who can consume the work. ChatGPT’s theft is no different to piracy and companies have gotten very pissy about their shit being pirated but when ChatGPT does it (because the piracy is hidden behind its training), it’s fine. The individual authors and artists get shafted in the end because their work has been weaponised against them.

            • FaceDeer@kbin.social · 1 year ago

              Copyright Law doesn’t talk about who can consume the work.

              What law does talk about it, then?

        • Duxon@feddit.de · 1 year ago

          LLMs have been caught plagiarising works

          Any source for this? I have never seen that.

          I'm highly skeptical that GPT-4 was directly trained on copyrighted material by Stephen King. Given the sheer amount of publicly available information about his works, including summaries, themes, characters, and critical analyses, a good LLM can appear to plagiarize those works without ever having ingested them. If I'm right, there is no leverage for creators to complain. Just accept that that's the world we're living in now. I don't see why this world would stop the sales of books or movie rights, etc.

    • TheDankHold@kbin.social · 1 year ago

      You have; LLMs don't read, because they aren't intelligent or alive. They aren't comparable to humans.