Over just a few months, ChatGPT went from correctly answering a simple math problem 98% of the time to just 2%, study finds
finance.yahoo.com
The chatbot gave wildly different answers to the same math problem, with one version of ChatGPT even refusing to show how it came to its conclusion.

70 comments
Send_me_nude_girls @feddit.de
Must be because of all the censoring. The more they try to prevent DAN jailbreaking and controversial replies, the worse it gets.