Fortune | November 21, 2024
Renter scoring firm agrees to pay $2.2 million to settle case accusing its algorithm of discriminating on race and income

This article tells the story of Mary Louis, a Black woman whose rental application was denied by an algorithmic screening score and who took the matter to court. The algorithm, developed by SafeRent Solutions, was accused of discriminating on the basis of race and income in rental application screening. The lawsuit ended in a settlement: SafeRent agreed to pay $2.2 million and modify its algorithm, without admitting to any discrimination. The case has drawn attention to the potential for discrimination when AI algorithms are used in consequential social decisions, to the need for regulation, and it offers a new precedent for holding AI algorithms accountable.

🤔 **Algorithmic discrimination led to a denied rental application:** Mary Louis was denied a rental because of an algorithmic score, prompting a discrimination lawsuit against SafeRent Solutions and its algorithm, which was accused of race and income discrimination in rental application screening — for example, by ignoring the benefit of housing vouchers and relying too heavily on credit information.

🏢 **How algorithms are used in rental applications:** AI algorithms are widely used in rental applications to assess applicants' credit, income, and history, helping landlords and property managers screen tenants. These algorithms, however, are largely unregulated and can produce discriminatory outcomes.

⚖️ **Outcome and settlement terms:** The case ended in a settlement in which SafeRent agreed to pay $2.2 million and change its algorithm, including dropping its score feature in certain cases and having any newly developed score validated by a third party. SafeRent did not admit to any discrimination.

⚠️ **The regulatory gap for AI algorithms:** The case highlights the potential for discrimination when AI algorithms are used in consequential social decisions, and the risks of leaving them unregulated. Although some state lawmakers have proposed regulations, most of the proposals have failed to gain enough support.

✊ **Algorithmic accountability and fairness:** In the litigation, SafeRent argued that its algorithm only produces screening reports and that the final decision rests with landlords, but the plaintiffs and the Department of Justice countered that the algorithm plays a key role in access to housing and should bear responsibility. The court ultimately denied SafeRent's motion to dismiss.

Mary Louis’ excitement to move into an apartment in Massachusetts in the spring of 2021 turned to dismay when Louis, a Black woman, received an email saying that a “third-party service” had denied her tenancy.

That third-party service included an algorithm designed to score rental applicants, which became the subject of a class action lawsuit, with Louis at the helm, alleging that the algorithm discriminated on the basis of race and income.

A federal judge approved a settlement in the lawsuit, one of the first of its kind, on Wednesday, with the company behind the algorithm agreeing to pay over $2.2 million and roll back certain parts of its screening products that the lawsuit alleged were discriminatory.

The settlement does not include any admission of fault by the company, SafeRent Solutions, which said in a statement that while it “continues to believe the SRS Scores comply with all applicable laws, litigation is time-consuming and expensive.”

While such lawsuits might be relatively new, the use of algorithms or artificial intelligence programs to screen or score Americans isn’t. For years, AI has been quietly helping make consequential decisions for U.S. residents.

When a person submits a job application, applies for a home loan or even seeks certain medical care, there’s a chance that an AI system or algorithm is scoring or assessing them, as one did Louis. Those AI systems, however, are largely unregulated, even though some have been found to discriminate.

“Management companies and landlords need to know that they’re now on notice, that these systems that they are assuming are reliable and good are going to be challenged,” said Todd Kaplan, one of Louis’ attorneys.

The lawsuit alleged SafeRent’s algorithm didn’t take into account the benefits of housing vouchers, which the plaintiffs said was an important detail for a renter’s ability to pay the monthly bill, and that it therefore discriminated against low-income applicants who qualified for the aid.

The suit also accused SafeRent’s algorithm of relying too heavily on credit information. The plaintiffs argued that it fails to give a full picture of an applicant’s ability to pay rent on time and unfairly dings applicants with housing vouchers who are Black and Hispanic, in part because they have lower median credit scores, attributable to historical inequities.

Christine Webber, one of the plaintiffs’ attorneys, said that even if an algorithm or AI is not programmed to discriminate, the data it uses or weights could have “the same effect as if you told it to discriminate intentionally.”

When Louis’ application was denied, she tried appealing the decision, sending two landlords’ references to show she’d paid rent early or on time for 16 years, even though she didn’t have a strong credit history.

Louis, who had a housing voucher, was scrambling: she had already given notice to her previous landlord that she was moving out, and she was responsible for taking care of her granddaughter.

The response from the management company, which used SafeRent’s screening service, read, “We do not accept appeals and cannot override the outcome of the Tenant Screening.”

Louis felt defeated; the algorithm didn’t know her, she said.

“Everything is based on numbers. You don’t get the individual empathy from them,” said Louis. “There is no beating the system. The system is always going to beat us.”

While state lawmakers have proposed aggressive regulations for these types of AI systems, the proposals have largely failed to get enough support. That means lawsuits like Louis’ are starting to lay the groundwork for AI accountability.

SafeRent’s defense attorneys argued in a motion to dismiss that the company shouldn’t be held liable for discrimination because SafeRent wasn’t making the final decision on whether to accept or deny a tenant. The service would screen applicants, score them and submit a report, but leave it to landlords or management companies to accept or deny a tenant.

Louis’ attorneys, along with the U.S. Department of Justice, which submitted a statement of interest in the case, argued that SafeRent’s algorithm could be held accountable because it still plays a role in access to housing. The judge denied SafeRent’s motion to dismiss on those counts.

The settlement stipulates that SafeRent can’t include its score feature on its tenant screening reports in certain cases, including when the applicant is using a housing voucher. It also requires that if SafeRent develops another screening score it plans to use, it must be validated by a third party that the plaintiffs agree to.

Louis’ son found an affordable apartment for her on Facebook Marketplace that she has since moved into, though it was $200 more expensive and in a less desirable area.

“I’m not optimistic that I’m going to catch a break, but I have to keep on keeping, that’s it,” said Louis. “I have too many people who rely on me.”


Related tags

Algorithmic discrimination, artificial intelligence, renting, discrimination, algorithm regulation