Mashable, October 16, 2024
Judge calls out 'expert witness' for using AI chatbot

In a real estate dispute case, expert witness Charles Ranson used Microsoft's AI chatbot Copilot to generate a damages assessment. The case involved a dispute over a $485,000 rental property in the Bahamas, with the deceased man's son as the plaintiff and the deceased man's sister as the defendant. Although Ranson has a background in trust and estate litigation, he had no relevant real estate expertise, and both his use of Copilot and its results were called into question. The court ultimately ruled that the sister had not breached her fiduciary duty, and noted that this is not the first time the use of an AI chatbot has caused problems in court.

🧐Expert witness Charles Ranson used Copilot to generate a damages assessment in a real estate dispute case; although he has a background in trust and estate litigation, he lacks real estate expertise.

📋The case involved a dispute over a rental property in the Bahamas: the deceased man's son was the plaintiff, the deceased man's sister was accused of breaching her fiduciary duty, and Ranson was asked to assess the damages.

💻After Ranson used Copilot, the court's own test found that Copilot's answers differed from the figure Ranson provided, and Ranson could not explain how Copilot works or what prompts he had used.

🙅‍♂️Lawyers have previously been reprimanded for using AI chatbots and citing nonexistent cases in their filings; problems with AI chatbots in the courtroom are not new.

If you find yourself needing an expert witness in a courtroom case, make sure they're not using an AI chatbot for their supposed expertise.

Last week, a New York judge reprimanded an expert witness in a real estate dispute case for using Microsoft's AI chatbot Copilot. 

The expert witness, Charles Ranson, used Copilot to generate an assessment of the damages that should be awarded to the plaintiff. The case was first reported by Ars Technica.

Copilot in court – a bad idea

The case at the center of this story involved a dispute over a $485,000 rental property in the Bahamas. The man who owned the real estate had passed away, and the property was included in a trust for the deceased man's son. The deceased man's sister was responsible for executing the trust. However, the sister was being accused of breaching her fiduciary duties by delaying the sale of the property while utilizing the property for her own personal use.

A major part of winning the case for the son was proving that he suffered damages due to his aunt's actions.

Ranson was brought on as an expert witness and tasked with assessing those damages.

While Ranson has a background in trust and estate litigation, according to Judge Jonathan Schopf, he had "no relevant real estate expertise." So, Ranson turned to Microsoft's AI chatbot, Copilot.

Ranson apparently revealed his Copilot use in his testimony. When questioned about it, Ranson was unable to recall what prompts he used to assess the damages or what sources Copilot cited to arrive at its estimate. Ranson was also unable to explain how Copilot works.

The court then decided to use Copilot to see if it could arrive at the same estimate that Ranson provided. The court asked Copilot "Can you calculate the value of $250,000 invested in the Vanguard Balanced Index Fund from December 31, 2004 through January 31, 2021?"

Copilot produced a different answer on each of three attempts, and none of them matched the Copilot-generated amount Ranson had provided.

The court then asked Copilot whether it was a reliable source of information, to which Copilot replied that its outputs should always be verified by experts.

According to the judge, Ranson was adamant that AI tools like Copilot are in standard use in his industry; however, he was unable to cite a single source showing this to be true.

Ranson's AI chatbot use wasn't his only mistake, but the Copilot situation certainly damaged the expert witness's credibility. The judge found that the evidence showed the delay in the sale of the property not only didn't result in a loss but actually generated additional profit for the son, and ruled that the aunt had not breached her fiduciary duty.

Not the first time, and probably not the last time

Ranson's use of Copilot as a supposed source of expertise is certainly not the first time an AI chatbot has been used in the courtroom.

Readers may recall lawyer Steven Schwartz, who last year relied on ChatGPT in legal filings for a case involving an airline customer injured during a flight. Schwartz was reprimanded after submitting filings that cited completely nonexistent cases. Schwartz had used ChatGPT for his research, and the AI chatbot simply made up prior cases, which Schwartz then included in his filings.

As a result, Schwartz and another lawyer at the firm he worked for were fined $5,000 by the court for "acting in bad faith."

The same scenario played out again with another lawyer, Jae Lee, who used ChatGPT in her filings earlier this year. Once again, ChatGPT hallucinated cases that did not exist.

In the Bahamas real estate case, Judge Schopf made a point of blaming not the AI chatbot but the user who cited it. However, AI chatbots continue to proliferate online, and major tech companies like Google and Microsoft are ramping up promotion of the technology to users.
