“No ChatGPT here—our em dashes are organic.”
Yeah, I’ve seen this quite a lot. You ask an AI not to do something and it does it more; omit the instruction and sometimes it won’t do it at all. I’m not a programmer, so take this with a grain of salt, but why not have a separate part of the pipeline be “if asked for not x, write the response and then delete all x”? Obviously that wouldn’t work for more complex things, but for the simpler stuff maybe it would.
Negations are hard. It takes kids until they’re about five years old to start to understand the concept of not.
That’s fair, like I said I’m not a programmer at all so I don’t have a say here.
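For what it’s worth, a crude version of that “write the response, then delete all x” idea is just a plain post-processing pass over the model’s output, outside the model itself. This is only a sketch of the suggestion above (the function name and the em-dash default are my own illustration, not how any provider actually handles it):

```python
def strip_forbidden(text: str, forbidden: str = "—") -> str:
    # Post-process a model reply: delete every occurrence of the
    # thing the user asked the model to avoid, then tidy up spacing.
    cleaned = text.replace(forbidden, " ")
    return " ".join(cleaned.split())

print(strip_forbidden("Here is a thought—an organic one—about dashes."))
# → Here is a thought an organic one about dashes.
```

As the comment says, this only works for simple, literal “don’t use x” requests; anything that depends on meaning rather than exact characters would slip right past a filter like this.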