• davel · 8 months ago

    “No ChatGPT here—our em dashes are organic.”

      • Marat · 8 months ago

        Yeah, I’ve seen this quite a lot. You ask an AI not to do something and it does it more, and then if you omit the instruction, sometimes it doesn’t do it at all. I’m not a programmer, so take this with a grain of salt, but why not have a separate part of the pipeline that says “if asked for not x, write the response and then delete all x”? Obviously that wouldn’t work for more complex things, but for the simpler stuff maybe it would.
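
        A minimal sketch of that “generate first, scrub afterwards” idea, in Python just for illustration. The generate() function here is a stand-in, not any real model API, and the em-dash filter is only an example of the “simpler stuff” case:

            import re

            def generate(prompt: str) -> str:
                # Stand-in for whatever model call you actually use.
                return "Sure thing \u2014 here is an answer \u2014 with plenty of em dashes."

            def generate_without(prompt: str, forbidden: str, replacement: str = ", ") -> str:
                # Post-filter: write the response normally, then delete/replace the forbidden pattern.
                raw = generate(prompt)
                return re.sub(forbidden, replacement, raw)

            print(generate_without("Explain something", r"\s*\u2014\s*"))

        As you say, this only works for surface-level things like a character or phrase; it can’t enforce “don’t mention x” in any semantic sense.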

        • knfrmity · 8 months ago

          Negations are hard. It takes kids until they’re about five years old to start to understand the concept of not.

          • Marat · 7 months ago

            That’s fair. Like I said, I’m not a programmer at all, so I don’t have a say here.