ChatGPT would be so much more useful and trustworthy if it were able to admit when it doesn't know an answer.
Small rant: basically, the title. Instead of trying to answer every question, if it simply said when it doesn't know the answer, it would be far more trustworthy.
GBU_28 @lemm.ee Have it cite its source.
kromem @lemmy.world It will make up citations.
GBU_28 @lemm.ee You go and read the citations.
kromem @lemmy.world Even with early GPT-4, it would cite real sources that weren't actually about the topic. So you may end up doing a lot of work double-checking, as opposed to just looking into the answer yourself from the start.
GBU_28 @lemm.ee It's a tool, not a genie.
sploosh @lemmy.world I didn't mean to cause any confusion, but what I said before was utter bullshit.
jh34ghu43gu @lemmy.world If you use Kagi, its AI gives sources: https://kagi.com/fastgpt https://imgur.com/TYQErhC