The Verge - Artificial Intelligence · December 17, 2024
Meta rolls out live AI, live translations, and Shazam to its smart glasses

Meta's Ray-Ban smart glasses are gaining new features: live AI and live translation for Early Access Program members, plus Shazam for all users in the US and Canada. Live AI enables natural conversation with the AI assistant, live translation supports multiple languages, and Shazam identifies songs. The article also notes the software updates required to get the features.

🎈 Meta's smart glasses add live AI, letting users converse naturally with the AI assistant and get suggestions.

💬 Live translation converts speech in real time between English and Spanish, French, or Italian.

🎵 Shazam lets users identify songs they hear through the glasses.

More AI features are rolling out to the Ray-Ban Meta Smart Glasses. | Photo by Amelia Holowaty Krales / The Verge

Meta just announced three new features are rolling out to its Ray-Ban smart glasses: live AI, live translations, and Shazam. Both live AI and live translation are limited to members of Meta’s Early Access Program, while Shazam support is available for all users in the US and Canada.

Both live AI and live translation were first teased at Meta Connect 2024 earlier this year. Live AI allows you to naturally converse with Meta’s AI assistant while it continuously views your surroundings. For example, if you’re perusing the produce section at a grocery store, you’ll theoretically be able to ask Meta’s AI to suggest some recipes based on the ingredients you’re looking at. Meta says users will be able to use the live AI feature for roughly 30 minutes at a time on a full charge.

Meanwhile, live translation allows the glasses to translate speech in real-time between English and Spanish, French, or Italian. You can choose to either hear translations through the glasses themselves, or view transcripts on your phone. You do have to download language pairs beforehand, as well as specify what language you speak versus what your conversation partner speaks.

Shazam support is a bit more straightforward. All you have to do is prompt the Meta AI when you hear a song, and it should be able to tell you what you're listening to. You can watch Meta CEO Mark Zuckerberg demo it in this Instagram reel.


If you don’t see the features yet, check to make sure your glasses are running the v11 software and that you’re also running v196 of the Meta View app. If you’re not already in the Early Access Program, you can apply via this website.

The updates come just as Big Tech is pushing AI assistants as the raison d'être for smart glasses. Just last week, Google announced Android XR, a new OS for smart glasses, and specifically positioned its Gemini AI assistant as the killer app. Meanwhile, Meta CTO Andrew Bosworth just posted a blog opining that "2024 was the year AI glasses hit their stride." In it, Bosworth also asserts that smart glasses may be the best possible form factor for a "truly AI-native device" and the first hardware category to be "completely defined by AI from the beginning."

