LLMs operate on tokens, not letters. This is expected behavior: a hammer is bad at operating a computer, and that's okay. The problem is the people telling you to use a hammer to operate a computer, not the hammer's inability to do so.
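To make the token-vs-letter point concrete, here is a toy sketch of greedy subword tokenization. The vocabulary and token IDs are made up for illustration; real LLM tokenizers (BPE and friends) are far more elaborate, but the upshot is the same: the model receives opaque token IDs, not a sequence of characters, so letter-level questions like "how many r's in strawberry" are asked of something that never saw the letters.

```python
# Toy greedy longest-match tokenizer with a hypothetical vocabulary.
# NOT a real LLM tokenizer; it only illustrates that the model's input
# is a list of token IDs, not individual letters.
VOCAB = {"straw": 101, "berry": 102, "st": 5, "raw": 6}

def tokenize(text, vocab):
    tokens = []
    i = 0
    while i < len(text):
        # try the longest substring starting at i that is in the vocabulary
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                tokens.append(vocab[piece])
                i = j
                break
        else:
            # fallback: emit a character-level token for unknown input
            tokens.append(ord(text[i]))
            i += 1
    return tokens

print(tokenize("strawberry", VOCAB))  # [101, 102]
```

With this toy vocabulary, "strawberry" becomes the two IDs `[101, 102]`; the letter 'r' never appears anywhere in the model's input, which is why counting letters is exactly the hammer-on-a-computer task.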
People who make fun of LLMs most often do get LLMs; they are trying to point out how LLMs tend to spew out factually incorrect information. That is a good thing, because many, many people out there do not, in fact, "get" LLMs (most are not even acquainted with the acronym, reaching for the catch-all term "AI" instead), and there is no better way to issue a warning about the inaccuracy of LLM output, however plausible it might sound, than to demonstrate it with examples of ridiculously wrong answers to simple questions.