• walrusintraining@lemmy.world
    1 year ago

    I’ve coded LLMs; I was just simplifying, because at its base level it’s not that different, just much more convoluted, as I said. They’re essentially selling someone else’s work as their own — it’s just run through a computer program first.

    • PupBiru@kbin.social
      1 year ago

      it’s nothing like that at all… if someone bought a book and produced a big table of words and the likelihood that each word would be followed by each other word, that’s what we’re talking about: it’s abstract statistics

      actually, that’s not even what we’re talking about… we then take that word table and combine it with hundreds of thousands of other tables, until the result is so far removed as to be completely untraceable back to the original work
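      the “word table” described above can be sketched as a toy bigram model — a hypothetical, stripped-down illustration of the statistics being discussed, not how any production LLM actually works:

```python
import random
from collections import defaultdict

def build_bigram_table(text):
    # For each word, count how often each other word follows it.
    table = defaultdict(lambda: defaultdict(int))
    words = text.split()
    for cur, nxt in zip(words, words[1:]):
        table[cur][nxt] += 1
    return table

def sample_next(table, word, rng=random):
    # Pick a following word, weighted by observed frequency.
    followers = table.get(word)
    if not followers:
        return None
    choices, weights = zip(*followers.items())
    return rng.choices(choices, weights=weights, k=1)[0]

table = build_bigram_table("the cat sat on the mat and the cat slept")
print(sample_next(table, "sat"))  # always "on" in this tiny corpus
```

      the table stores only frequencies, not the text itself — though, as the reply below points out, with a tiny corpus the statistics pin down the source almost exactly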

      • walrusintraining@lemmy.world
        1 year ago

        If it were trained on a single book, the output would be the book. That’s the base level, without all the convolution, and that’s what we should be looking at. Do you also think someone should be able to train a model on your appearance and use it to sell images and videos, even though it’s technically not your likeness?