When retaining an expert witness for a court case, credibility is paramount, and that increasingly includes scrutiny of how the expert relies on AI technology.
A New York judge recently reprimanded an expert witness in a real estate dispute for using an AI chatbot to form his expert opinion.
The expert witness in question, Charles Ranson, used an AI tool to generate an assessment of the damages to be awarded to the plaintiff. The case centered on a $485,000 rental property in the Bahamas that was held in a trust for the owner's son after the owner's death. The deceased man's sister was accused of breaching her fiduciary duties by delaying the property's sale while using it for her own purposes.
Expert Testimony Questioned
Ranson’s role was to assess the damages the son suffered as a result of his aunt’s actions. Despite Ranson’s background in trust and estate litigation, the presiding judge noted that he lacked relevant real estate expertise. To fill that gap, Ranson turned to an AI chatbot, Microsoft’s Copilot, for his assessment.
During testimony, Ranson acknowledged relying on the AI tool but could not recall the prompts he had used or the sources behind the damages estimate, nor could he explain how the technology works.
In a remarkable turn, the court tested the tool itself, asking Copilot to calculate the value of an investment over a specified period. Across three attempts, the chatbot returned different estimates, none of which matched Ranson’s figure.
When the court asked Copilot about its own reliability, the chatbot replied that its outputs should be verified by qualified experts, underscoring the risk of using such technology without proper oversight.
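The court's spot-check highlights a basic contrast: the kind of future-value calculation it asked for is deterministic, so a conventional computation returns the same answer every time it is run, unlike the chatbot's shifting estimates. The sketch below is a minimal illustration of that point only; the principal, rate, and time horizon are hypothetical figures, not values from the case.

```python
# Minimal sketch of a deterministic future-value calculation.
# The principal, annual rate, and number of years are hypothetical,
# chosen only to show that rerunning the computation always yields
# the same result, in contrast to the varying chatbot outputs.

def future_value(principal: float, annual_rate: float, years: int) -> float:
    """Compound the principal once per year at a fixed annual rate."""
    return principal * (1 + annual_rate) ** years

if __name__ == "__main__":
    estimate = future_value(principal=250_000, annual_rate=0.07, years=16)
    print(f"Projected value after 16 years: ${estimate:,.2f}")
```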
The judge ultimately ruled that the evidence showed the delay in the property’s sale did not result in losses but instead led to increased profits for the son, dismissing claims of breach of fiduciary duty against the aunt.
Patterns of AI Misuse in Legal Proceedings
This incident is part of a growing pattern of AI chatbot misuse in courtroom settings. In one notable earlier case, a lawyer was sanctioned for relying on ChatGPT, which fabricated legal precedents that were then cited in court filings. Similar missteps by other legal professionals have led to significant repercussions.
While the judge clarified that the blame should not fall on the AI itself, the increasing integration of such technologies into legal proceedings raises critical questions about their reliability and the accountability of their users.