• manitcor@lemmy.intai.tech · 1 year ago

    You need to override that overly helpful system message they put in ChatGPT:

    “be concise, do not explain, only provide the answer”
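
    (In the web UI that goes in the custom instructions; if you're hitting the API instead, the same idea looks roughly like the sketch below. This is a minimal illustration assuming the official openai Python package; the model name and example question are placeholders, not anything from the comment above.)

    ```python
    # Minimal sketch, assuming the official openai Python package (>= 1.0)
    # and an OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name; swap in whichever you use
        messages=[
            # The override: replace the default helpful persona with the
            # terse system message quoted above.
            {"role": "system", "content": "be concise, do not explain, only provide the answer"},
            {"role": "user", "content": "What is 2^10?"},
        ],
    )

    print(response.choices[0].message.content)
    ```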

  • flimsyberry@lemmy.world · 1 year ago

    ChatGPT is not good at these kinds of problems. Heck, any solution or answer it gives should be checked manually before you apply it anywhere. It may seem smart most of the time, but you’ve just hit one of its bigger weaknesses.

    • memmytesting1@lemm.ee · 1 year ago

      The worst part is the way it states these things as fact instead of hedging with something like “so you should be…”. That definitely tricks people. Obviously this one is easy to spot, but it presents things the same way whether it is right or wrong, and if you have no idea about a topic you’ll just believe what you’re reading is correct.

      I only ever use it for quick things that I have a pretty strong grasp on but still need a little push over the hill to solve a particular problem. The response is almost always only half correct at best, but there’s usually something inside it I can pull out and use to solve the issue I’m having.

      • flimsyberry@lemmy.world · 1 year ago

        I think that’s a pretty good approach. Everyone would benefit from seeing ChatGPT more as an approximation/guessing machine. It often hits the mark, or at least gets really close, but its bigger hallucinations are frequently hilarious.