[HN] Non-determinism in GPT-4 is caused by Sparse MoE
152334h.github.io
It’s well-known at this point that GPT-4/GPT-3.5-turbo is non-deterministic, even at temperature=0.0. This is odd behavior if you’re used to dense decoder-only models, where temp=0 should imply greedy sampling, which should imply full determinism, because the logits for the next token should be a ...
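A minimal sketch of the intuition being invoked here (hypothetical function, not OpenAI's actual sampler): for a fixed input, a dense model produces fixed logits, and at temperature 0 sampling collapses to argmax, so the output is fully determined.

```python
import numpy as np

def sample_next_token(logits, temperature=0.0, rng=None):
    """Sample a next-token id from raw logits.

    At temperature == 0 this reduces to greedy argmax, which is
    deterministic for a fixed logits vector.
    """
    if temperature == 0.0:
        # Greedy decoding: the same logits always yield the same token.
        return int(np.argmax(logits))
    rng = rng or np.random.default_rng()
    # Softmax with temperature, then sample stochastically.
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))

logits = np.array([1.0, 3.5, 0.2])
# temp=0 picks the max-logit token every time.
assert sample_next_token(logits) == sample_next_token(logits)
```

The surprise the post addresses is that GPT-4 breaks this expectation: identical prompts at temp=0 can still yield different outputs, which the author attributes to the sparse mixture-of-experts architecture rather than to sampling.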
[ comments | sourced from HackerNews ]