My Two Cents: If you are going to use ChatGPT and its cousins to write a brief, Shepardize!!!

AI does not replace doing your homework! Shepardize!!!

An attorney in New York learned the hard way that ChatGPT is not a reliable source. The lawyer, representing a man in a lawsuit against an airline, used an artificial intelligence (AI) program, ChatGPT, to assist in preparing a court filing. However, the AI-generated content turned out to be entirely fabricated. The lawyer cited nonexistent court decisions and quotations in his brief, which neither the airline's lawyers nor the judge could find. The lawyer admitted to using ChatGPT for legal research and claimed he was unaware of the program's potential for providing false information. The judge ordered a hearing to discuss potential sanctions. The incident highlights the debate among lawyers regarding the use of AI software and the need to verify information provided by such programs.

ChatGPT has been known not only to be wrong at times but also to make things up!

I look at it this way: If your new clerk handed you their first draft, you would double-check the work and likely Shepardize the citations; I don't think I have to preach that Shepardizing cases before filing a brief is the rule of thumb. Rule 1.1[8] requires attorneys to maintain a reasonable understanding of the technology we use and how to use it. This inherently includes knowing technology's limitations and flaws — something the NY attorney conceded he did not do with his use of ChatGPT.

Know the ABA Model Rules and your state bar's rules of ethics!

Rule 1.1 [1, 4 & 5] requires an attorney to act with competence. In this case, I have a feeling Mr. Schwartz, the attorney in question, did not follow this rule: he did not check his case law. I have some empathy for Mr. Schwartz. But I also have a feeling the bar will not feel the same way.

Happy Lawyering!!!

MTC.