Wed 15 Nov 2023

Hallucinations in the Legal Landscape: 2023 Word of the Year Edition

In a world where language evolves alongside technology, the Cambridge Dictionary's choice for Word of the Year often captures the zeitgeist. This year, the Dictionary named "hallucinate" as its Word of the Year, a term that resonates deeply in a legal landscape being transformed by artificial intelligence (“AI”).

AI in Legal Services: A Paradigm Shift

The selection of "hallucinate" as the Word of the Year is a noteworthy indicator of the changing nature of reality, both linguistically and metaphorically. Traditionally associated with perceiving images, sounds, or sensations that are not really there, the word now carries an additional definition: “when an artificial intelligence (AI) hallucinates, it produces false information.” In the legal sector, it takes on a new significance as AI technologies reshape the very fabric of legal services.

The legal profession has historically been marked by painstaking research, vast document reviews, and complex case analyses. The advent of AI has introduced a paradigm shift, with the potential to streamline these processes and enhance the efficiency of legal practitioners. Natural language processing algorithms can now analyse and extract information from mountains of legal documents at speeds unimaginable for their human counterparts. It may well be that we are on the cusp of a new era of legal practice in which information is not just processed but understood with unprecedented speed and depth.

Legal Research and Analysis: Beyond Human Capabilities

AI's prowess in legal research and analysis is akin to having a virtual legal assistant with an encyclopaedic knowledge of both case law and statutes. Leveraging machine learning algorithms, legal professionals can now sift through vast databases to find relevant information, identify patterns, and even predict potential outcomes. The ability to process information at such a scale might not only expedite the provision of legal services but also minimise the risk of oversight, by ensuring a more thorough and accurate representation of legal contexts.

In fact, an English Court of Appeal judge admitted last month that he had used AI for the purposes of writing a judgment. Not only did Lord Justice Birss use ChatGPT, he also copied and pasted its output into his judgment, having found the answer provided by the AI tool to be “acceptable”. If, a year ago, someone had told the legal profession that judges would soon be using AI tools to generate court decisions, the prediction would likely have been met with a solid dose of scepticism. And yet the technology is now openly being used by the Judiciary, although, according to Sky News, a spokesperson for the Judicial Office for England & Wales did not confirm the extent of any such usage.

Considerations and Challenges

As AI becomes an integral part of legal services, ethical considerations and challenges have inevitably emerged. Creations are bound to be as flawed as their creators. Among the most serious challenges are the potential biases encoded in algorithms, the need for transparent decision-making processes, and concerns about the displacement of legal professionals from their jobs. Balancing the benefits of AI with the preservation of human-centric values in the legal system is central to achieving a harmonious integration of the new technology in the justice sector.

On a more practical level, a recent case in the US State of New York offered a cautionary tale about the limitations of the technology when it is applied to legal services. In Mata v Avianca, the plaintiff’s lawyer used ChatGPT for the purposes of preparing a legal note of argument. The note referred to a number of cases which had been supplied to him by the AI tool. When he asked ChatGPT to confirm that the cases were real, the tool admitted that it had hallucinated one of them but assured him that the others were “real and [could] be found in reputable legal databases.” The lawyer appears to have relied on this without further inquiry. The attorneys for Avianca were not so trusting and were unable to find the cases in question. Judge Castel, before whom the case called, was similarly unsuccessful.

Unfortunately, rather than being upfront about his use of the AI tool, the lawyer was not forthcoming, which incurred the wrath of the court and ultimately resulted in a fine of $5,000. The lawyer and his firm were also ordered to inform each judge falsely identified as the author of one of the hallucinated opinions of Judge Castel’s order. The judge further noted: “Technological advances are commonplace and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance…But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings.” Ouch!

Conclusion

The Cambridge Dictionary's Word of the Year, "hallucinate", serves as a timely reflection of the transformative impact of AI on legal services. As the legal landscape continues to evolve, embracing the advantages of AI while addressing its ethical and practical challenges is crucial. The integration of technology into legal practice cannot come at the cost of the basic principles of professionalism and competency. In using technology, a lawyer should not forgo their critical thinking – it is not something capable of delegation. The alternative would see hallucinations abound and confidence in the legal profession diminish.
