You’re speaking to it like it is a human. It isn’t. You’re typing instructions into a computer program. If you want it to make you something, you just need to be very direct and specific. These chatbots get into weird loops where they’ll refuse requests because they are “related” to other unacceptable requests, and they cannot understand concepts like context.
Kinda seems like you’re getting mad at a vending machine because you were mashing buttons wildly and it wasn’t giving you anything tbh.