ADAM ENGST 21 July 2025
Who among us has not Googled themselves to see how they’re represented online? When humor columnist Dave Barry did that recently, he was presented with one of Google’s “People also ask” questions: “What happened to Dave Barry?” Curious, he clicked it and went down a maze of twisty little AI answers, all different and many inaccurate. In the first, he had passed away from cancer, but in later answers, it became clear that he was being confused with another Dave Barry from Dorchester, MA.

Barry’s Death by AI post captures this absurdity in his inimitable style. (I’m unreasonably amused by his description of Google as “a huge company with a vast network of computers processing more than a billion trilobites of data every second.”) His attempts to submit corrections were met by Google’s feedback assistant seemingly translating his corrections into Latvian, parsing them in English, and then throwing up its virtual hands in confusion.
Eventually, though, Google managed to align its AI with reality. Now, when I ask Google that question, I get something along these lines.

The only confusion involved was generated by Google itself.
After conducting extensive research for “AI Answer Engines Are Worth Trying” (17 April 2025), I was curious to see how ChatGPT, Claude, Google’s own Gemini, Perplexity, and You.com would answer the question. Spoiler: they all did fine, though the answers from Claude and You.com were once again the weakest.

[Screenshots of responses from ChatGPT, Claude, Gemini, Perplexity, and You.com]
Having Google declare you dead might be a gift for a professional humor columnist, but these AI Overview errors highlight how dedicated AI answer engines can deliver more accurate results.
I don’t know why Google’s AI Overview is so much worse than Gemini, but I suspect it has to do with scalability and response time. When I asked, “What happened to Adam Engst?” all the AI answer engines responded correctly but took 3–15 seconds to start returning information. Given the vastly larger number of queries that Google fields and its desire to answer nearly instantly (0.19 seconds for that query), you can see why Google might be willing to sacrifice AI accuracy for speed.
By the way, in the responses to my navel-gazing curiosity, Perplexity earned bonus points for acknowledging my Internet doppelganger, who’s a lawyer in Seattle, while Claude lost points for being by far the slowest and for providing last year’s information instead of something more recent.