I could see myself having conversations with an LLM, but I wouldn't want it to pretend it's anything other than a program assembling words together.
The way it clicks for me is that it's a juiced up auto-complete tool.
It's literally that.
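To make the "juiced up auto-complete" framing concrete, here is a toy sketch of next-word prediction. Real LLMs use deep neural networks over subword tokens, not word counts, and this corpus and the `complete` helper are made up for illustration, but the objective is the same in spirit: given the context, predict the most likely next token.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus; a real model trains on trillions of tokens.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def complete(word):
    """Return the most likely next word, or None if the word is unseen."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(complete("the"))  # 'cat' follows 'the' most often in this corpus
```

An LLM replaces the lookup table with a learned function over the entire preceding context, which is where the "juiced up" part comes in.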
Well that explains why that user thinks it completes them.
If LLMs are juiced-up auto-complete, then humans are juiced-up bacteria. Yeah, they both have the same end goal (guess the next word; survive and reproduce), but the methods they use to accomplish it are vastly more complex.
It's not pretending to be anything; that's just the function you described: assembling words together.