Mashable · two days ago, 18:04
MyPillow CEO's lawyers file AI-generated legal brief riddled with errors

Lawyers for MyPillow CEO Mike Lindell are facing potential disciplinary action for using generative AI to write a legal brief riddled with basic errors. While the lawyers admitted to using AI, they claim the errors stemmed chiefly from human mistakes. Colorado district court judge Nina Wang noted nearly 30 citation defects in the brief they filed, including misquoted cases, misrepresented legal principles, and even citations of cases that do not exist. The court ordered the lawyers to explain why they should not face discipline, and pointed to a broader pattern of improper AI use in the legal profession.

🤔 The court found numerous errors in the legal brief filed by the MyPillow CEO's lawyers, including misquoted cases and citations of non-existent cases.

✍️ The lawyers admitted to using generative AI to draft the brief, but argued the errors resulted from human error rather than from the AI itself.

⚠️ The court ordered the lawyers to explain why they should not face disciplinary action, noting this is not the legal profession's first controversy over improper AI use.

🧐 The lawyers contend there is nothing wrong with using AI to draft legal filings when used properly, but admitted they did not verify the AI-generated citations.

Lawyers for MyPillow CEO and presidential election conspiracy theorist Mike Lindell are facing potential disciplinary action after using generative AI to write a legal brief, resulting in a document rife with fundamental errors. The lawyers did admit to using AI, but claim that this particular mistake was primarily human.

On Wednesday, an order by Colorado district court judge Nina Wang noted that the court had identified almost 30 defective citations in a brief filed by Lindell's lawyers on Feb. 25. Signed by attorneys Christopher Kachouroff and Jennifer DeMaster of law firm McSweeney Cynkar and Kachouroff, the filing was part of former Dominion Voting Systems employee Eric Coomer's defamation lawsuit against Lindell.

"These defects include but are not limited to misquotes of cited cases; misrepresentations of principles of law associated with cited cases, including discussions of legal principles that simply do not appear within such decisions; misstatements regarding whether case law originated from a binding authority such as the United States Court of Appeals for the Tenth Circuit; misattributions of case law to this District; and most egregiously, citation of cases that do not exist," read Wang's court order.

The court further noted that while the lawyers had been given the opportunity to explain this laundry list of errors, they were unable to adequately do so. Kachouroff confirmed that he'd used generative AI to prepare the brief once directly asked about it by the court, and upon further questioning admitted that he had not checked the resultant citations.

As such, the court ordered the lawyers to provide an explanation as to why Kachouroff and DeMaster should not be referred to disciplinary proceedings for violating professional conduct rules, as well as sanctioned alongside Lindell and their law firm.



Lawyers may face disciplinary action over use of AI

Responding to the order on Friday, the lawyers stated that they had been "unaware of any errors or issues" with their filing, so were "caught off-guard" and unprepared to explain themselves when initially questioned by the court. 

Having since had time to assess the situation, they now claim that the document in question was actually an earlier draft which DeMaster had filed by mistake. Submitting alternate versions of the brief in support of this argument, the lawyers also presented an email exchange between Kachouroff and DeMaster in which they discussed edits.

"At that time, counsel had no reason to believe that an AI-generated or unverified draft had been submitted," read their response. "After the hearing and having a subsequent opportunity to investigate [the brief], it was immediately clear that the document filed was not the correct version. It was a prior draft.

"It was inadvertent, an erroneous filing that was not done intentionally, and was filed mistakenly through human error."

The lawyers further contend in their filing that it is perfectly permissible to use AI to prepare a legal filing, arguing that "[t]here is nothing wrong with using AI when used properly." Kachouroff stated that he "routinely" analyzes legal arguments using AI tools such as Microsoft's Copilot, Google (presumably Gemini), and X (presumably Grok), though noted that he is the only person at his law firm to do so. He also stated that he had never heard the term "generative artificial intelligence" before.

The lawyers asked to be allowed to refile a corrected brief, and for the potential disciplinary action to be dismissed.

This incident is just the latest in a growing list of legal professionals inappropriately using AI in their work, some of them not even understanding the technology. In June 2023, two attorneys were fined for citing non-existent legal cases after they'd used ChatGPT to do their research. Later that year, a lawyer for disbarred former Trump attorney Michael Cohen was caught citing fake cases said client had generated with Google Bard. Then in February, yet another attorney appeared to cite cases fabricated by ChatGPT, prompting their law firm Morgan & Morgan to warn employees against blindly trusting AI.

Yet despite such cautionary tales, it seems that many lawyers still haven't gotten the message to steer clear.
