Mashable · April 25, 01:44
Google AI overviews will explain any nonsense phrase you make up

The article examines Google's AI Overviews and its tendency to generate misinformation in search results. The tool leans toward answering confidently even when it lacks sufficient data, readily fabricating information, such as inventing explanations for made-up idioms. Although Google acknowledges that AI Overviews can make mistakes, the feature still occupies a prominent position in search results. The article notes that while this behavior is sometimes amusing, it can also mislead, and it reminds users to stay vigilant when searching with AI.

😂 Google's AI Overviews explains made-up idioms: even for a nonsense phrase, it will try to assign a meaning. For example, given "You can't lick a badger twice," the AI offers its own interpretation.

🤔 The phenomenon has been dubbed "AI-splaining." Users can trigger it by typing any random sentence into Google search and appending "meaning." The behavior exposes AI's tendency to hallucinate or fabricate facts when it lacks sufficient information.

⚠️ While these mistakes can look funny, AI Overviews can also make relatively harmless errors, such as getting the NFL's overtime rules wrong. Google warns users that AI Overviews may be wrong, yet it still appears at the top of many search results.

🍕 Early on, AI Overviews even suggested eating rocks and putting glue on pizza, a sign that AI-generated content can contain serious errors and that users need to stay alert.

Google's AI Overviews sometimes acts like a lost man who won't ask for directions: It would rather confidently make a mistake than admit it doesn't know something.

We know this because folks online have noticed you can ask Google about any faux idiom — any random, nonsense saying you make up — and Google AI Overviews will often supply a meaning for it. That's not exactly surprising, as AI has shown a penchant for hallucinating or simply inventing things when it doesn't have enough data to answer.

In the case of made-up idioms, it's kind of funny to see how Google's AI responds to idiotic sayings like "You can't lick a badger twice." On X, SEO expert Lily Ray dubbed the phenomenon "AI-splaining."

Someone on Threads noticed you can type any random sentence into Google, then add “meaning” afterwards, and you’ll get an AI explanation of a famous idiom or phrase you just made up. Here is mine


— Greg Jenner (@gregjenner.bsky.social) April 23, 2025 at 6:15 AM

Fantastic technology, glad society spent a trillion dollars on this instead of sidewalks.


— Dan Olson (@foldablehuman.bsky.social) April 21, 2025 at 12:01 AM

New game for you all: ask google what a made-up phrase means.


— Crab Man (@crabman.bsky.social) April 18, 2025 at 1:40 AM

I tested the "make up an idiom" trend, too. One phrase — "don't give me homemade ketchup and tell me it's the good stuff" — got the response "AI Overview is not available for this search." However, my next made-up phrase — "you can't shake hands with an old bear" — got a response. Apparently Google's AI thinks this phrase suggests the "old bear" is an untrustworthy person.

Credit: Screenshot: Google
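For readers who want to reproduce the experiment programmatically, here is a minimal sketch that builds the search query. It assumes only Google's standard q= URL parameter; the helper function name is my own invention, not anything Google provides.

```python
# Minimal sketch of the "made-up idiom" trick described above.
# Assumption: only Google's standard "q" search parameter is used;
# the helper name is hypothetical, not an official API.
from urllib.parse import quote_plus

def made_up_idiom_search(phrase: str) -> str:
    """Build a Google search URL that appends 'meaning' to any phrase."""
    return "https://www.google.com/search?q=" + quote_plus(phrase + " meaning")

if __name__ == "__main__":
    print(made_up_idiom_search("you can't shake hands with an old bear"))
    # -> https://www.google.com/search?q=you+can%27t+shake+hands+with+an+old+bear+meaning
```

Opening the resulting URL in a browser is all it takes; whether AI Overviews actually responds varies by phrase, as the ketchup example above shows.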

In this instance, Google AI Overview's penchant for making stuff up is kind of funny. In other instances — say, getting the NFL's overtime rules wrong — it can be relatively harmless. Other examples of AI hallucinations are less amusing: when the feature first launched, it was telling folks to eat rocks and put glue on pizza. Keep in mind that Google warns users that AI Overviews can get facts wrong, yet the feature still sits at the top of many search results.

So, as the old, time-honored idiom goes: Be wary of search with AI; what you see may be a lie.

