GPT finally gives in to some bullying…

*edited to sort images chronologically

  • DamarcusArt · 4 months ago

    You’re speaking to it like it’s a human. It isn’t. You’re typing instructions into a computer program. If you want it to make you something, you need to be very direct and specific. These chatbots get into weird loops where they’ll refuse requests because they’re “related” to other unacceptable requests, and they can’t understand concepts like context.

    Kinda seems like you’re getting mad at a vending machine because you were mashing buttons wildly and it wasn’t giving you anything tbh.