ChatGPT and the Law: AI's Negative Impact on the Administration of Justice
Posted August 19, 2023 in Uncategorized
“A LAWYER, AS A MEMBER OF THE LEGAL PROFESSION, IS A REPRESENTATIVE OF CLIENTS, AN OFFICER OF THE LEGAL SYSTEM AND A PUBLIC CITIZEN HAVING SPECIAL RESPONSIBILITY FOR THE QUALITY OF JUSTICE.”
“[A] LAWYER SHOULD SEEK IMPROVEMENT OF THE LAW, ACCESS TO THE LEGAL SYSTEM, THE ADMINISTRATION OF JUSTICE AND THE QUALITY OF SERVICE RENDERED BY THE LEGAL PROFESSION.”
“[A] LAWYER SHOULD FURTHER THE PUBLIC’S UNDERSTANDING OF AND CONFIDENCE IN THE RULE OF LAW AND THE JUSTICE SYSTEM BECAUSE LEGAL INSTITUTIONS IN A CONSTITUTIONAL DEMOCRACY DEPEND ON POPULAR PARTICIPATION AND SUPPORT TO MAINTAIN THEIR AUTHORITY.”
“A LAWYER SHALL PROVIDE COMPETENT REPRESENTATION TO A CLIENT. COMPETENT REPRESENTATION REQUIRES THE LEGAL KNOWLEDGE, SKILL, THOROUGHNESS AND PREPARATION REASONABLY NECESSARY FOR THE REPRESENTATION.”
Let these quotes, taken directly from the Colorado Rules of Professional Conduct, sink in for a minute.
I recently saw an article admonishing aspiring lawyers to forget about going to law school. Don’t go to law school, the article says, because artificial intelligence (AI) can do the job of a lawyer. Unsurprisingly, this advice is coming from legal tech investors. Consider your source and its motives.
Whether to take on six-figure law school debt for a career in law is up to you, and the decision should be weighed against the very real likelihood that you could earn the same or better salary in many other careers, even blue-collar ones. Every internet blogger has an opinion on career ROI. It doesn't matter. Pursue the career you want. But don't shrink from it because AI investors tell you AI will replace lawyers (so they can convince the world to buy their legal tech). Likewise, I'm not here to tell you that ChatGPT can't provide valuable, augmented legal solutions. Either way, the AI/ChatGPT legal tech snowball is growing, it creates potential for misuse and ethical pitfalls, and it has to be discussed.
In an effort to cut costs and time, perhaps even passing the savings on to the client, some attorneys and law firms around the globe are embracing the purported "all-knowingness" of the AI platform. But slow down: are there any dangers here? In this blog, we explore the woes of ChatGPT and the negative effects it has had on the legal profession over just a few short months.
CHATGPT IN THE UNITED STATES
Perhaps the most well-known misuse of ChatGPT within legal circles occurred in a New York federal court case in early May 2023. Two New York lawyers submitted a brief to the court that cited six nonexistent cases.[1] According to an affidavit signed by one of the attorneys after both opposing counsel and the presiding judge discovered the nonexistent cases, the attorney "initially believed the tool had surfaced authentic citations."[2] He had used ChatGPT to conduct legal research for the personal injury case he was working on and was "unaware of the possibility that its content could be false."[3]
To the District Judge, however, the attorney’s citations to “nonexistent cases” and “bogus quotes” were not harmless.[4] Both lawyers faced a June 8, 2023 hearing on potential sanctions and were hit with a hefty monetary fine.[5]
The judge indicated that while there was nothing “inherently improper” about the use of ChatGPT, the professional conduct rules governing lawyers “impose a gatekeeping role on attorneys to ensure the accuracy of their filings.”[6]
And when these antics caught the eye of a jurisdiction halfway across the country, the Northern District of Texas, U.S. District Judge Brantley Starr ordered attorneys "to attest that they will not use ChatGPT or other generative artificial intelligence technology to write legal briefs because the AI tool can invent facts."[7] The court will no longer accept any filing that is not accompanied by the sworn attestation.
“WHILE ATTORNEYS SWEAR AN OATH TO SET ASIDE THEIR PERSONAL PREJUDICES, BIASES, AND BELIEFS TO FAITHFULLY UPHOLD THE LAW AND REPRESENT THEIR CLIENTS, GENERATIVE ARTIFICIAL INTELLIGENCE IS THE PRODUCT OF PROGRAMMING DEVISED BY HUMANS WHO DID NOT HAVE TO SWEAR SUCH AN OATH.”
Judge Starr clarified in his ban that “these platforms are incredibly powerful and have many uses in the law: form divorces, discovery requests, suggested errors in documents, anticipated questions at oral argument. But legal briefing is not one of them.”[8] He stated further that “these platforms in their current states are prone to hallucinations and bias. On hallucinations, they make stuff up – even quotes and citations . . . another issue is reliability or bias. While attorneys swear an oath to set aside their personal prejudices, biases, and beliefs to faithfully uphold the law and represent their clients, generative artificial intelligence is the product of programming devised by humans who did not have to swear such an oath.”[9]
CHATGPT IN COLORADO
This is uncomfortable, and it's happening in Colorado as well. One recent article picked up a case out of Colorado Springs this summer in which a local attorney found himself in a pickle when the judge caught on that his motion contained fabricated legal authority.[10] Unlike the New York attorneys mentioned above, however, the Colorado attorney apparently caught the fabricated cases himself after filing the motion and owned up to the mistake. No doubt the local attorney learned the lesson of a lifetime here.
But these mistakes also affect the judiciary and make a judge's role more difficult. Judges should be able to rely on attorneys' legal briefing, not mistrust it. Apparently, the judge in this case felt it necessary to specify that his denial of the motion was not based on the fictitious case law.[11] No doubt, careless ChatGPT use will add to the already heavy burden judges carry. Now judicial staff must spend more of their incredibly limited time digging deeper to ensure fake case law isn't being propounded. It can be subtle. It would be a real danger to our common law system if AI-fabricated case law somehow worked its way through the courts unnoticed.
As a lawyer, I do everything I can to find the most relevant, trustworthy, and proper legal authority to support my client's position. This might even mean writing off my time on extra legal research and double- and triple-checking what we found in the case law. Call it paranoia if you want. My clients spend hard-earned money on us to litigate their cases competently and effectively, which necessarily includes maintaining candor and trust with the judge. Now we have to worry about trust eroding between the judiciary and the private bar due to the sloppy use of AI-fabricated authorities. Let's hope the profession takes heed and this blog becomes irrelevant in a year.
CHATGPT ABROAD
The ChatGPT craze is not confined to US borders; on the other side of the world, New Zealand has faced its own controversies over the AI tool.
Specifically, the New Zealand Law Society revealed in a March 2023 member newsletter that “its librarians had received requests from legal practitioners for information on fictitious cases – presumably from people who had been using tools like ChatGPT.”[12]
The NZ Law Society says “the cases appear real as ChatGPT has learnt what a case name and citation should look like. However, investigations by library staff have found that the cases requested were fictitious. It will fabricate facts and sources where it does not have access to sufficient data.”[13]
ETHICAL IMPLICATIONS OF USING CHATGPT
While the Colorado Rules of Professional Conduct do not currently account for artificial intelligence, several existing ethics rules likely apply.
DUTY OF COMPETENCE (RULE 1.1)
“A lawyer shall provide competent representation to a client. Competent representation requires the legal knowledge, skill, thoroughness and preparation reasonably necessary for the representation.” We cannot rely on AI to do this for us. Even when we do use AI, Rule 1.1 applies to the lawyer, not to the AI. AI does not have a license to practice law. The lawyer is responsible for double-checking and verifying the accuracy of information found on the internet rather than blindly applying it in a legal brief. Let’s not forget who the humans are in the room here.
COMMUNICATION (RULE 1.4)
“A lawyer shall: … (5) consult with the client about any relevant limitation on the lawyer’s conduct when the lawyer knows that the client expects assistance not permitted by the Rules of Professional Conduct or other law.”
If a client expects cheaper briefing and cost savings on contract drafting through the use of AI, the lawyer must communicate to the client whether the Rules of Professional Conduct limit that expected use. This would especially ring true if the client pushes the lawyer to turn a blind eye to fictitious citations or poor legal analysis to gain an advantage in the case.
DUTY OF CONFIDENTIALITY (RULE 1.6)
This rule requires lawyers to “make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.” The comments to this rule discuss implementing security measures to protect confidential information. With ChatGPT, lawyers may be willingly handing AI companies their clients’ data to train and improve their models. Consider very carefully that when you agree to “converse” with ChatGPT about a case to identify case law, you are giving your client’s case facts to the legal tech industry for its profit.
DUTY TO ADVISE (RULE 2.1)
“In representing a client, a lawyer shall exercise independent professional judgment and render candid advice. In rendering advice, a lawyer may refer not only to law but to other considerations such as moral, economic, social and political factors, that may be relevant to the client’s situation.”
We might want to consider that only a human being can fully comprehend things like bias, morality, and social factors, and exercise independent professional judgment. While AI might be useful in speeding up legal research, let’s hope it isn’t used as a guide for the human factors of advising a client in a complex transaction or litigation matter.
DUTY OF CANDOR WITH THE JUDICIARY (RULE 3.3)
“A lawyer shall not knowingly: make a false statement of material fact or law to a tribunal or fail to correct a false statement of material fact or law previously made to the tribunal by the lawyer…”
As discussed above, if a mistake is made in a citation to legal authority, we have a duty to correct that misapprehension of the law to the judge. Negligent misuse of, or over-reliance on, ChatGPT can continue to put lawyers in the uncomfortable position of owning up to these mistakes in court, to their clients, and to the public. One-off mistakes are one thing, but the continual degradation of the public’s and the judiciary’s trust in lawyers could seriously harm the administration of justice and the conception of law in this country.
CONCLUSION
“LAWYERS PLAY A VITAL ROLE IN THE PRESERVATION OF SOCIETY. THE FULFILLMENT OF THIS ROLE REQUIRES AN UNDERSTANDING BY LAWYERS OF THEIR RELATIONSHIP TO OUR LEGAL SYSTEM.” COLO. R.P.C. PREAMBLE.
Humans are imperfect. Sometimes we miss a case or statute; it happens. But isn’t it worse to let AI subtly create new social and legal concepts and made-up case authority? I’m sure AI will improve over time and become more reliable, but regardless, we cannot carelessly cede professional responsibility to AI in the hope that it isn’t lying to us. I guess I’m old-fashioned. Maybe I’ll download WordPerfect and go buy that fax machine I’ve always wanted.
For more information, give us a call at 303-268-2867 or complete a consultation request form.
DISCLAIMER
1) This post cites internet articles that appear critical of fellow lawyers. I did not have time to independently verify everything the legal reporters said by purchasing and reviewing the case files. We do not condone a guilty-by-internet approach to legal ethics investigations. Do your own research if interested in those cases.
2) The information contained on this website is provided for informational purposes only. It is not legal advice and should not be construed as providing legal advice on any subject matter. Laws frequently change and therefore this content is not necessarily up to date, nor comprehensive. Contact us or another attorney with any legal questions specific to your matter. You may request a consultation by completing our consultation request form.
[1] https://news.bloomberglaw.com/business-and-practice/lawyers-ai-blunder-shows-perils-of-chatgpt-in-early-days
[6] https://www.reuters.com/legal/new-york-lawyers-sanctioned-using-fake-chatgpt-cases-legal-brief-2023-06-22/
[7] https://www.cbsnews.com/news/texas-judge-bans-chatgpt-court-filing/
[10] https://www.lawweekcolorado.com/article/colorado-lawyer-cited-fake-cases-in-motion-written-with-chatgpt/
[12] https://www.1news.co.nz/2023/04/25/dangerous-some-nz-lawyers-hoodwinked-by-fictitious-chatgpt-cases/