TechCrunch News · February 25
UK’s internet watchdog toughens approach to deepfake porn

Ofcom, the UK’s internet safety regulator, has published new guidance intended to help companies meet their legal obligations under the Online Safety Act (OSA) to protect women and girls from threats such as online harassment, bullying, misogyny, and intimate image abuse. The guidance emphasizes a “safety by design” approach, recommending that tech companies build safety considerations into product design. Ofcom’s suggested measures include removing geolocation by default, conducting “abusability” testing, strengthening account security, designing user prompts, and providing accessible reporting tools. Although the OSA has faced criticism over slow enforcement and uncertain impact, Ofcom says it remains committed to pushing platforms to act, using transparency and information-gathering powers to raise user awareness and publicly calling out platforms that fail to keep women safe.

🛡️ The Online Safety Act (OSA) is meant to protect women and girls from threats such as online harassment, bullying, misogyny, and intimate image abuse. Certain forms of misogynistic abuse, such as sharing intimate images without consent or using AI tools to create deepfake porn targeting individuals, are explicitly set out as enforcement priorities.

💡 Ofcom recommends a “safety by design” approach, encouraging tech companies to build safety considerations into product design. Measures should be taken at the design stage to reduce the risk of tools being weaponized against women and girls; image-generating AI services, in particular, should proactively limit the risk of deepfake intimate image abuse.

🔒 Ofcom highlights examples of “good” industry practice, including removing geolocation by default to reduce privacy and stalking risks, conducting “abusability” testing to identify how a service could be weaponized or misused, taking steps to strengthen account security, designing user prompts that make posters think twice before publishing abusive content, and offering accessible reporting tools.

🔍 Ofcom plans to use the transparency and information-gathering powers granted by the OSA to reveal the impact of industry action, raise user awareness, and publicly name platforms that fail to keep women safe, thereby pushing platforms to act. Ofcom will also publish market reports showing the protections in place on different platforms, so that users can make informed choices about where they spend their time online.

Ofcom, the U.K.’s internet safety regulator, has published a new set of draft guidance as it continues to implement the Online Safety Act (OSA) — the latest recommendations aim to help in-scope firms meet their legal obligations to protect women and girls from online threats like harassment and bullying, misogyny, and intimate image abuse.

The government has said that protecting women and girls is a priority for its implementation of the OSA. Certain forms of (predominantly) misogynist abuse — such as sharing intimate images without consent or using AI tools to create deepfake porn that targets individuals — are explicitly set out in the law as enforcement priorities.

The online safety regulation, which was approved by the U.K. parliament back in September 2023, has faced criticism that it’s not up to the task of reforming platform giants, despite containing substantial penalties for non-compliance — up to 10% of global annual turnover.

Child safety campaigners have also expressed frustration over how long it’s taking to implement the law, as well as doubting whether it will have the desired effect.

In an interview with the BBC in January, even the technology minister Peter Kyle — who inherited the legislation from the previous government — called it “very uneven” and “unsatisfactory.” But the government is sticking with the approach. Part of the discontent around the OSA can be traced back to the long lead time ministers allowed for implementing the regime, which requires parliament to approve Ofcom compliance guidance.

However, enforcement is expected to start to kick in soon in relation to core requirements on tackling illegal content and child protection. Other aspects of OSA compliance will take longer to implement. And Ofcom concedes this latest package of practice recommendations won’t become fully enforceable until 2027 or later.

“The first duties of the Online Safety Act are coming into force next month,” Ofcom’s Jessica Smith, who led development of the female safety-focused guidance, told TechCrunch in an interview. “So we will be enforcing against some of the core duties of the Online Safety Act ahead of this guidance [itself becoming enforceable].”

The new draft guidance on keeping women and girls safe online is intended to supplement earlier broader Ofcom guidance on illegal content — which also, for example, provides recommendations for protecting minors from seeing adult content online.

In December, the regulator published its finalized guidance on how platforms and services should shrink risks related to illegal content, an area where child protection is a clear priority.

It has also previously produced a Children’s Safety Code, which recommends online services dial up age checks and content filtering to ensure kids are not exposed to inappropriate content such as pornography. And as it’s worked toward implementing the online safety regime, it’s also developed recommendations for age assurance technologies for adult content websites, with the aim of pushing porn sites to take effective steps preventing minors from accessing age-inappropriate content.

The latest set of guidance was developed with help from victims, survivors, women’s advocacy groups and safety experts, per Ofcom. It covers four major areas where the regulator says females are disproportionately affected by online harm — namely: online misogyny; pile-ons and online harassment; online domestic abuse; and intimate image abuse.

Ofcom’s top-line recommendation urges in-scope services and platforms to take a “safety by design” approach. Smith told us the regulator wants to encourage tech firms to “take a step back” and “think about their user experience in the round.” While she acknowledged some services have put in place some measures that are helpful in shrinking online risks in this area, she argued there’s still a lack of holistic thinking when it comes to prioritizing the safety of women and girls.

“What we’re really asking for is just a sort of step change in how the design processes work,” she told us, saying the goal is to ensure that safety considerations are baked into product design.

She highlighted the rise of image-generating AI services, which she noted has driven “massive” growth in deepfake intimate image abuse, as an example of where technologists could have taken proactive measures to crimp the risk of their tools being weaponized to target women and girls, yet did not.

“We think that there are sensible things that services could do at the design phase which would help to address the risk of some of those harms,” she suggested.

Examples of “good” industry practices Ofcom highlights in the guidance include online services taking actions such as:

  • Removing geolocation by default (to shrink privacy/stalking risks; one way this could look is sketched after this list);
  • Conducting ‘abusability’ testing to identify how a service could be weaponized/misused;
  • Taking steps to boost account security;
  • Designing in user prompts that are intended to make posters think twice before posting abusive content;
  • And offering accessible reporting tools that let users report issues.
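
The first of these measures lends itself to a concrete illustration. Below is a minimal sketch of how a service might strip GPS metadata from an uploaded photo before storing or sharing it. It assumes a Python backend using the Pillow library; the function and file names are hypothetical, and nothing here is prescribed by Ofcom’s guidance.

```python
# Minimal sketch: remove the GPS block from a photo's EXIF data before the
# image is stored or shared, one way to implement "geolocation off by default".
# Illustrative assumptions only; not taken from Ofcom's guidance.
from PIL import Image

GPS_IFD_TAG = 0x8825  # EXIF tag pointing to the GPS information block


def strip_gps(src_path: str, dst_path: str) -> None:
    """Re-save an image with any GPS metadata removed."""
    with Image.open(src_path) as img:
        exif = img.getexif()
        if GPS_IFD_TAG in exif:
            del exif[GPS_IFD_TAG]
        img.save(dst_path, exif=exif)


if __name__ == "__main__":
    # Hypothetical file names for a server-side upload pipeline.
    strip_gps("user_upload.jpg", "user_upload_clean.jpg")
```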

As is the case with all of Ofcom’s OSA guidance, not every measure will be relevant for every type or size of service — since the law applies to online services large and small, and cuts across arenas from social media to online dating, gaming, forums, and messaging apps, to name a few. So a big part of the work for in-scope companies will be understanding what compliance means in the context of their product.

When asked if Ofcom had identified any services currently meeting the guidance’s standards, Smith suggested they had not. “There’s still a lot of work to do across the industry,” she said.

She also tacitly acknowledged that there may be growing challenges given some of the retrograde steps taken vis-à-vis trust and safety by some major industry players. For example, since taking over Twitter and rebranding the social network as X, Elon Musk has gutted its trust and safety headcount — in favor of pursuing what he has framed as a maximalist approach to free speech.

In recent months, Meta — which owns Facebook and Instagram — appears to have taken steps in a similar direction, saying it’s ending third-party fact-checking contracts in favor of deploying an X-style “community notes” system of crowdsourced labelling on content disputes, for example.

Smith suggested that Ofcom’s response to such high-level shifts — where operators’ actions could risk dialling up, rather than damping down, online harms — will focus on using transparency and information-gathering powers it wields under the OSA to illustrate impacts and drive user awareness.

So, in short, the tactic here looks set to be ‘name and shame’ — at least in the first instance.

“Once we finalize the guidance, we will produce a [market] report … about who is using the guidance, who is following what steps, what kind of outcomes they’re achieving for their users who are women and girls, and really shine a light on what protections are in place on different platforms so that users can make informed choices about where they spend their time online,” she told us.

Smith suggested that companies wanting to avoid the risk of being publicly shamed for poor performance on women’s safety will be able to turn to Ofcom’s guidance for “practical steps” on how to improve the situation for their users, and address the risk of reputational harm too.

“Platforms that are operating in the UK will have to comply with the UK law,” she added in the context of the discussion on major platforms de-emphasizing trust and safety. “So that means complying with the illegal harms duties and the protection of children duties under the Online Safety Act.”

“I think this is where our transparency powers also come in — if the industry is changing direction and harms are increasing, this is where we will be able to shine a light and share relevant information with UK users, with media, with parliamentarians.”

One type of online harm where Ofcom is explicitly beefing up its recommendations even before it has actively started OSA enforcement is intimate image abuse — the latest draft guidance suggests the use of hash matching to detect and remove such abusive imagery, whereas earlier Ofcom recommendations did not go that far.

“We’ve included additional steps in this guidance that go beyond what we’ve already set out in our codes,” Smith noted, confirming Ofcom plans to update its earlier codes to incorporate this change “in the near future.”

“So this is a way of saying to platforms that you can get ahead of that enforceable requirement by following the steps that are set down in this guidance,” she added.

Ofcom recommended the use of hash matching technology to counter intimate image abuse due to a substantial increase in this risk, per Smith — especially in relation to AI-generated deepfake image abuse.

“There was more deepfake intimate image abuse reported in 2023 than in all previous years combined,” she noted, adding that Ofcom has also gathered more evidence on the effectiveness of hash matching to tackle this harm.
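
For readers unfamiliar with the technique, hash matching broadly means comparing a fingerprint of an uploaded image against fingerprints of previously identified abusive images, so that known material can be caught even when it is re-uploaded. The sketch below shows the basic idea using a perceptual hash; the open-source imagehash and Pillow libraries, the hash list, the distance threshold, and the function names are all illustrative assumptions rather than anything specified by Ofcom or in the article.

```python
# Minimal sketch of hash matching: flag an upload if its perceptual hash is
# close to the hash of a previously reported abusive image.
# All values here are illustrative assumptions, not part of Ofcom's guidance.
from PIL import Image
import imagehash

# Hypothetical store of perceptual hashes of previously reported images.
KNOWN_ABUSE_HASHES = {
    imagehash.hex_to_hash("ffd8e0c4b2a19078"),
}

MAX_HAMMING_DISTANCE = 5  # assumed tolerance for near-duplicate matches


def is_known_abusive(path: str) -> bool:
    """Return True if the upload is a near-duplicate of a known abusive image."""
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - known <= MAX_HAMMING_DISTANCE
               for known in KNOWN_ABUSE_HASHES)


if __name__ == "__main__":
    if is_known_abusive("upload.jpg"):
        print("Block the upload and route it to moderation review")
    else:
        print("Allow the upload")
```

In practice, services typically rely on shared, purpose-built hash databases and hardened matching pipelines rather than a simple in-memory lookup like this.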

The draft guidance as a whole will now undergo consultation — with Ofcom inviting feedback until May 23, 2025 — after which it will produce final guidance by the end of this year.

A full 18 months after that, Ofcom will then produce its first report reviewing industry practice in this area.

“We’re getting into 2027 before we’re producing our first report on who’s doing what [to protect women and girls online] — but there’s nothing to stop platforms acting now,” she added.

Responding to criticism that Ofcom is taking too long to implement the OSA, she said it’s right that the regulator consults on compliance measures. However, with enforcement of the first duties kicking in next month, she noted that Ofcom anticipates a shift in the conversation around the issue, too.

“[T]hat will really start to change the conversation with platforms, in particular,” she predicted, adding that it will also be in a position to start demonstrating progress on moving the needle when it comes to reducing online harms.
