TechCrunch News, November 19, 2024
Microsoft will soon let you clone your voice for Teams meetings

Microsoft plans to introduce a feature in Teams called Interpreter that lets users clone their own voices and have them translated into multiple languages during meetings. Launching in early 2025, the feature will support nine languages: English, French, German, Italian, Japanese, Korean, Portuguese, Mandarin Chinese, and Spanish. The technology is meant to deliver a more personal and engaging meeting experience, but it has also raised concerns about deepfakes and security risks. Microsoft says the tool will not store biometric data, will not add emotion beyond what is present in the voice itself, and can be disabled through Teams settings. Although the feature's scope is relatively narrow, it could still be abused, for example for fraud or spreading misinformation.

🤔 Microsoft Teams will soon introduce an Interpreter feature that allows users to clone their own voices and have them translated during meetings into up to nine languages: English, French, German, Italian, Japanese, Korean, Portuguese, Mandarin Chinese, and Spanish.

🗣️ Interpreter aims to provide a more personal and engaging meeting experience through real-time speech-to-speech translation; users can opt to have the system simulate their own voice when translating.

🛡️ Microsoft stresses that Interpreter does not store biometric data, does not add sentiment beyond what is naturally present in the voice, and can be disabled through Teams settings to protect user privacy and security.

⚠️ The feature nonetheless carries a risk of abuse: a malicious actor could use it to generate misinformation or commit fraud, so it should be used with care and backed by stronger safeguards.

💰 Market research suggests that natural language processing technologies, including translation, could be worth $35.1 billion by 2026, a sign of growing demand for AI-powered voice translation.

Microsoft plans to let Teams users clone their voices so they can have their sound-alikes speak to others in meetings in different languages.

At Microsoft Ignite 2024 on Tuesday, the company revealed Interpreter in Teams, a tool for Microsoft Teams that delivers “real-time, speech-to-speech” interpretation capabilities. Starting in early 2025, people using Teams for meetings will be able to use Interpreter to simulate their voices in up to nine languages: English, French, German, Italian, Japanese, Korean, Portuguese, Mandarin Chinese, and Spanish.

“Imagine being able to sound just like you in a different language,” Microsoft CMO Jared Spataro wrote in a blog post shared with TechCrunch. “Interpreter in Teams provides real-time speech-to-speech translation during meetings, and you can opt to have it simulate your speaking voice for a more personal and engaging experience.”

Microsoft gave few concrete details about the feature, which will only be available to Microsoft 365 subscribers. But it did say that the tool doesn’t store any biometric data, doesn’t add sentiments beyond what’s “naturally present” in a voice, and can be disabled through Teams settings.

“Interpreter is designed to replicate the speaker’s message as faithfully as possible without adding assumptions or extraneous information,” a Microsoft spokesperson told TechCrunch. “Voice simulation can only be enabled when users provide consent via a notification during the meeting or by enabling ‘Voice simulation consent’ in settings.”

A number of firms have developed tech to digitally mimic voices that sound reasonably natural. Meta recently said that it’s piloting a translation tool that can automatically translate voices in Instagram Reels, while ElevenLabs offers a robust platform for multilingual speech generation.

AI translations tend to be less lexically rich than those from human interpreters, and AI translators often struggle to accurately convey colloquialisms, analogies and cultural nuances. Yet, the cost savings are attractive enough to make the trade-off worth it for some. According to Markets and Markets, the sector for natural language processing technologies, including translation technologies, could be worth $35.1 billion by 2026.

AI clones also pose security challenges, however.

Deepfakes have spread like wildfire across social media, making it harder to distinguish truth from disinformation. So far this year, deepfakes featuring President Joe Biden, Taylor Swift, and Vice President Kamala Harris have racked up millions of views and reshares. Deepfakes have also been used to target individuals, for example by impersonating loved ones. Losses linked to impersonation scams topped $1 billion last year, per the FTC.

Just this year, a team of cybercriminals reportedly staged a Teams meeting with a company’s C-level staff that was so convincing that the target company wired $25 million to the criminals.

In part due to the risks (and optics), OpenAI earlier this year decided against releasing its voice cloning tech, Voice Engine.

From what’s been revealed so far, Interpreter in Teams is a relatively narrow application of voice cloning. Still, that doesn’t mean the tool will be safe from abuse. One can imagine a bad actor feeding Interpreter a misleading recording — for example, someone asking for bank account information — to get a translation in the language of their target.

Hopefully, we’ll get a better idea of the safeguards Microsoft will add around Interpreter in the months to come.
