• davel · 10 months ago

    It’s already here and it’s called model collapse: where LLMs’ garbage output becomes LLMs’ new training input.
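    A toy sketch of that feedback loop (my own illustration, not code from any actual paper): repeatedly refit a distribution to its own most "typical" outputs. Because generation favours high-probability samples, the rare tails get dropped each round, and the spread collapses generation after generation.

```python
import random
import statistics

random.seed(42)

# "Real" data starts out with mean 0 and spread 1.
mean, std = 0.0, 1.0
stds = [std]

for generation in range(20):
    # The model generates from what it learned last round; decoding
    # favours typical text, so we keep only the central 80% of samples
    # and discard the rare "tail" outputs.
    sample = sorted(random.gauss(mean, std) for _ in range(1000))
    kept = sample[100:900]

    # The next model is trained on that truncated output.
    mean = statistics.mean(kept)
    std = statistics.stdev(kept)
    stds.append(std)

# Each generation's spread is smaller than the last: the distribution
# narrows toward a point, i.e. diversity is gone.
print(stds[0], "->", stds[-1])
```

    Obviously real LLM training is nothing like a 1-D Gaussian, but the mechanism is the same: once the tails stop appearing in the training data, no later generation can recover them.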

    • albigu · 10 months ago

      What will happen to GPT-{n} once LLMs contribute much of the language found online?

      Absolutely horrifying thought. Wasn’t PageRank literally invented to solve shit like this?

      This could create a funny circumstance where LLM companies have to devise methods for automatically detecting LLM text, but then normal people could just use those same methods to filter out all the LLM junk.