The Verge - Artificial Intelligence | August 1, 2024
A first look at Apple Intelligence and its (slightly) smarter Siri

Siri gets an upgrade in iOS 18, available to try on the iPhone 15 Pro and Pro Max. Siri improves in areas like language understanding and adds new ways to interact. Apple Intelligence also brings new features to apps such as Mail and Notes.

🦘 Siri gets an upgrade in iOS 18's latest developer preview. Once enabled, the edges of the phone glow to show that Siri is listening. This version brings meaningful improvements to language understanding, parses natural language better, and understands follow-up questions.

📱 Double-tapping the bottom of the screen lets you interact with Siri by text. There are also new features in the Mail app, such as a summarize button at the top of emails, and anywhere you can type and highlight text, writing tools offer AI proofreading, writing suggestions, and summaries.

🎙 In the Notes app, voice recordings now come with automatic transcription, and Apple Intelligence can turn a recording's transcript into a summary or a checklist. In the Photos app, the search tool uses AI to understand more complex requests.

Siri’s big upgrade starts now, but you’ll need the right iPhone to access it. | Photo by Vjeran Pavic / The Verge

In iOS 18’s latest developer preview, Siri gets a glow-up. Like, the whole phone actually glows around the edges when you invoke Siri.

A splash screen reintroduces you to the virtual assistant once you enable Apple Intelligence, an early version of which is now available on the iPhone 15 Pro and Pro Max in a developer beta. You’ll know Siri is listening when the edges of the screen glow, making it pretty obvious that something different is going on.

The big Siri AI update is still months away. This version comes with meaningful improvements to language understanding, but future updates will add features like awareness of what’s on your screen and the ability to take action on your behalf. Meanwhile, the rest of the Apple Intelligence feature set previewed in this update feels like a party waiting for the guest of honor.

That said, Siri’s improvements in this update are useful. Tapping the bottom of the screen twice will bring up a new way to interact with the assistant: through text. It’s also much better at parsing natural language, waiting more patiently through hesitations and “um”s as I stumble through questions. It also understands when I’m asking a follow-up question.

Double-tapping the bottom of the screen brings up a text box you can use to talk to Siri.
New Siri understands context in follow-up questions, like this one after I asked for the weather in Olympia.

Outside of Siri, it’s kind of an Easter egg hunt finding bits of Apple Intelligence sprinkled throughout the OS. They’re in the Mail app, with a summarize button at the top of each email now. And anywhere you can type and highlight text, you’ll find a new option called “writing tools” with AI proofreading, writing suggestions, and summaries.

“Help me write something” is pretty standard fare for generative AI these days, and Apple Intelligence does it as well as anyone else. You can have it make your text more friendly, professional, or concise. You can also create summaries of text or synthesize it into bulleted lists of key points or a table.
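
To make the mechanics concrete: a tone rewrite is usually just an instruction wrapped around the original text and handed to a language model. Apple hasn't published the interface behind its writing tools, so the sketch below is purely illustrative; the TextModel protocol and rewrite function are hypothetical stand-ins, not Apple APIs.

```swift
import Foundation

// Hypothetical stand-in for whatever text model handles rewrites;
// Apple has not published the interface behind its writing tools.
protocol TextModel {
    func complete(prompt: String) async throws -> String
}

enum Tone: String {
    case friendly, professional, concise
}

// A tone rewrite is typically just an instruction wrapped around the
// user's original text before it is sent to the model.
func rewrite(_ text: String, tone: Tone, using model: TextModel) async throws -> String {
    let prompt = """
    Rewrite the following text in a \(tone.rawValue) tone. \
    Keep the meaning and any names, dates, and numbers unchanged.

    \(text)
    """
    return try await model.complete(prompt: prompt)
}
```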

I’m finding these tools most useful in the Notes app, where you can now add voice recordings. In iOS 18, voice recordings finally come with automatic transcriptions, which is not an Apple Intelligence feature since it also works on my iPhone 13 Mini. But Apple Intelligence will let you turn a recording transcript into a summary or a checklist. This is helpful if you want to just free-associate while recording a memo and list a bunch of things you need to pack for an upcoming trip; Apple Intelligence turns it into a list that actually makes sense.
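
For readers curious how the transcription half could work in code: Apple's Speech framework has offered file transcription for years, which fits with the feature running on older phones like my iPhone 13 Mini. The sketch below uses the real SFSpeechRecognizer API, but only as an illustration of on-device transcription in general; Apple hasn't said Notes uses this exact path, and the Apple Intelligence summary/checklist step is a separate, undocumented layer on top.

```swift
import Speech

// Illustrative only: transcribe an audio file with Apple's Speech framework.
// Apple hasn't said the Notes app uses this exact API, and the
// Apple Intelligence summary/checklist step is a separate layer on top.
func transcribe(fileAt url: URL) async throws -> String {
    // Speech recognition requires explicit user authorization.
    let status = await withCheckedContinuation { continuation in
        SFSpeechRecognizer.requestAuthorization { continuation.resume(returning: $0) }
    }
    guard status == .authorized,
          let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else {
        throw NSError(domain: "Transcription", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "Speech recognition unavailable"])
    }

    let request = SFSpeechURLRecognitionRequest(url: url)
    request.requiresOnDeviceRecognition = true // keep audio on the device where supported

    // Bridge the callback-based API into async/await and return the final transcript.
    return try await withCheckedThrowingContinuation { continuation in
        recognizer.recognitionTask(with: request) { result, error in
            if let error {
                continuation.resume(throwing: error)
            } else if let result, result.isFinal {
                continuation.resume(returning: result.bestTranscription.formattedString)
            }
        }
    }
}
```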

Honestly, this transcript is pretty good.
Apple Intelligence turned my rambling list into a neat little table.

These writing tools are tucked out of the way, and if you weren’t looking for them, you might miss them entirely. The more obvious new AI features are in the Mail app. Apple Intelligence surfaces what it deems to be important emails in a card that sits above the rest of your inbox marked as priority. Below that, emails show a brief summary in place of the first line or two of text that you’d normally see.

There’s something charming about AI’s sincere attempt to summarize promotional emails, trying to helpfully pull out bits of detail like “Backpacks and lunch boxes ship FREE” and “Organic white nectarines are sweet and juicy, in season now.” But the descriptions in my inbox were accurate — helpful in a few instances and harmless at worst. And the emails it gave priority status to were genuinely important, which is promising.

The search tool in the Photos app now uses AI to understand more complicated requests. You can ask for pictures of a particular person wearing glasses or all the food you ate in Iceland, all in natural language.

It’s very good. Results come back fast and are generally reliable. It found the photo I had in mind of my kid wearing a pair of goofy glasses, though it also surfaced photos in which he appeared and someone else was wearing some glasses. Still, I think it’s bound to be a feature that people immediately get used to and don’t think twice about using — intuitive and obviously useful.
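
Apple hasn't documented how Photos interprets these queries, but natural-language media search is commonly built by embedding both the query and each image into a shared vector space and ranking by similarity. The sketch below is a toy version of that general idea; the Embedder protocol is a hypothetical stand-in, and none of this is Apple's actual implementation.

```swift
import Foundation

// Toy sketch of natural-language photo search: embed the query and every
// image into a shared vector space, then rank by cosine similarity.
// The Embedder protocol is a hypothetical stand-in, not an Apple API.
protocol Embedder {
    func embed(text: String) -> [Float]
    func embed(imageNamed name: String) -> [Float]
}

// Cosine similarity between two vectors of equal length.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).map(*).reduce(0, +)
    let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / max(normA * normB, .leastNonzeroMagnitude)
}

// Rank stored photos against a query like "kid wearing goofy glasses".
func search(query: String, photoNames: [String], embedder: Embedder, topK: Int = 5) -> [String] {
    let queryVector = embedder.embed(text: query)
    return photoNames
        .map { name in (name, cosineSimilarity(queryVector, embedder.embed(imageNamed: name))) }
        .sorted { $0.1 > $1.1 }
        .prefix(topK)
        .map { $0.0 }
}
```

A ranking like this retrieves images that are semantically close to the query rather than exact attribute matches, which would explain results like photos where someone other than my kid happens to be the one wearing glasses.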

But despite the light show, Siri is about the same as ever. It mostly remains a “let me Google that for you” machine. The most significant changes are still to come: future updates will give Siri awareness of what’s on your screen and the ability to take action in apps for you. In theory, you’ll be able to have Siri grab information from messages and turn them into calendar events or retrieve information from email without you having to go digging through your inbox.

That’s the stuff I’m most excited about, and all of the pieces of Apple Intelligence available so far could be the building blocks of a better Siri. Apple’s AI is capable of understanding the contents of an email or a photo. Likewise, Siri is better at understanding how humans talk. In order for Apple Intelligence to prove itself, Siri needs to connect the dots.
