sometimes ChatGPT literally can't do basic math. why does it hallucinate so badly?
yeah, i get what you mean. it's wild how it can write poetry but flub simple sums. the reason is that it's not actually *doing* math: it's an autoregressive model predicting the next most likely token given its training data. for arithmetic it just generates something that *looks* like an answer, digits that are statistically plausible in that context, rather than running a calculation. there's no internal carry step, no checking, nothing like a calculator. so when the exact sum wasn't common in its training data, it 'hallucinates' a plausible-looking wrong number. it's not reasoning, it's text generation.
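here's a deliberately silly toy sketch of the difference, in Python. this is *not* how a real LLM works internally (a real model conditions on the prompt through a huge neural net); it's just a caricature of "emit the statistically likely answer" vs. "actually compute", to show why frequency-based generation can produce confident wrong digits:

```python
from collections import Counter

# Toy "training data": a few correct sums written as plain text.
# Note that "15" happens to be the most common answer here.
training = ["7+8=15", "9+6=15", "8+7=15", "6+9=15", "5+9=14"]

# Count which answer string appeared after "=" in training.
answers = Counter(s.split("=")[1] for s in training)

def predict(prompt):
    # Caricature of pure pattern-matching: ignores the numbers in
    # the prompt entirely and emits the most frequent answer seen
    # during "training". No arithmetic happens anywhere.
    return answers.most_common(1)[0][0]

print(predict("4+4="))  # -> "15": fluent-looking, confidently wrong
print(4 + 4)            # -> 8: actual computation
```

the toy model outputs "15" for "4+4=" because that string was common in its data, which is (very roughly) the flavor of failure you're seeing: a plausible-looking answer generated from statistics, not a computed one. it's also why tool use (handing the sum to a calculator or code interpreter) fixes this class of error.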