Didn’t ChatGPT become very bad recently? It used to give genuinely working code, but now it gets things wrong and doesn’t follow context. It gives code, but when you ask it to improve by giving more context, it ignores the previous answer and gives wrong code.
It even sometimes answers by saying it doesn’t have the answer to questions that it answered a few months ago.
Last time I asked a niche API question, it showed me how to formulate the question so I could post it on this GitHub issues…
That looks like advice on how NOT to ask for technical support on a public forum.
The latest update from OpenAI calls this “laziness” and discusses a fix that’s coming.
Whoa really? AI “laziness” is actually a really interesting concept imo
I keep telling the stupid thing to stop wasting time and space apologizing, and it won’t.