This is probably what’s going on with the hilarious ChatGPT faceplants making the rounds on social media.
People try to fool GPT with esoteric questions, but those are easy for it: if anybody anywhere on the web already answered the question, no problem — and making it esoteric just narrows the search space.
But give it a three-digit addition problem, and odds are nobody has posted that exact sum: with 900 × 900 = 810,000 possible pairs of three-digit numbers, most never appear verbatim on the web. And GPT can't turn the addition examples it has seen into a generalized theory of how to do addition.
8/