Word of the Week: Hallucinations (in the context of Artificial Intelligence, Machine Learning, and Natural Language Processing)

The term "hallucination" refers to a phenomenon where an AI model generates or interprets information not grounded in its input data. Simply put, the AI is making stuff up. This can occur in various forms across different AI applications:

Remember: just as you can't complain to the judge when your clerk makes a factual or legal error in your brief, you can't blame AI for its errors and hallucinations!

Text Generation: In NLP, hallucination is often observed in language models like ChatGPT. Here, the model might generate coherent and fluent text, but this text is factually incorrect or unrelated to the input prompt. For instance, if asked about historical events, the model might 'hallucinate' plausible but untrue details. Another example is when attorneys rely on ChatGPT to draft pleadings only to learn the hard way that its cited cases do not exist. (Remember, always check your work!)

Image and Speech Recognition: In these areas, AI hallucination can occur when a model recognizes objects, shapes, or words in data where they do not actually exist. For example, an image recognition system might incorrectly identify an object in a blurry image, or a speech recognition system might transcribe words that were not actually spoken.

I'll spare you a deep, complex discussion of the problems with AI in this context. But the three takeaways for attorneys are: 1. AI is not ready to write briefs for you without review, 2. Attorneys are not being replaced by AI (but attorneys who do not know how to use AI in their practice correctly will be replaced), and 3. Always check your work!

Happy Lawyering!

#73: Legal Research and More, with Sarah Glassmeyer

Our next guest is law librarian Sarah Glassmeyer. She has a career that includes academia, nonprofit tech, and even a fellowship at Harvard. Her numerous awards, including being named to the Fastcase 50 and as an ABA Legal Rebel, speak to her impact. Sarah's commitment to learning and growing and her passion for her mission ensure she'll never stop striving for positive change in the legal world.

Join Sarah and me as we discuss the following three questions and more!

  1. What are the top three tech tools utilized by larger law firms that solos and small law firms would be surprised are reasonably accessible to them?

  2. What are the top three ways ChatGPT falls short for attorneys?

  3. What are the top three directions that you see technology heading in that attorneys should keep an eye on?

In our conversation, we cover the following:

[01:08] Balancing Platforms: Sarah's Hybrid Tech Ecosystem

[10:13] Tech Tools for Smaller Firms to Rival the Big Players

[23:38] Three Ways ChatGPT Falls Short for Attorneys

[37:07] Key Technological Trends for Attorneys to Monitor

[45:12] Where to Connect with Sarah

Resources:

Connect with Sarah:

LinkedIn: linkedin.com/in/sglassmeyer
Website: sarahglassmeyer.com/
Substack: substack.com/@sarahglassmeyer

Hardware mentioned in the conversation:

ThinkPad: lenovo.com/us/en/c/laptops/thinkpad/

Software and Cloud Services mentioned in the conversation:

Substack: substack.com/@sarahglassmeyer
FatCow: bluehost.com/fatcow
Azure: azure.microsoft.com/en-us
AWS: aws.amazon.com/

My Two Cents: If you are going to use ChatGPT and its cousins to write a brief, Shepardize!!!

AI does not replace doing your homework! Shepardize!!!

An attorney in New York learned the hard way that ChatGPT is not a reliable source. A lawyer representing a man in a lawsuit against an airline used an artificial intelligence (AI) program, ChatGPT, to assist in preparing a court filing. However, the AI-generated content turned out to be entirely fabricated. The lawyer cited nonexistent court decisions and quotations in his brief, which neither the airline's lawyers nor the judge could find. The lawyer admitted to using ChatGPT for legal research and claimed he was unaware of the program's potential for providing false information. The judge ordered a hearing to discuss potential sanctions. The incident highlights the debate among lawyers regarding the use of AI software and the need to verify information provided by such programs.

ChatGPT has been known not only to be wrong at times but also to make things up!

I look at it this way: If your new clerk handed you their first draft, you would double-check the work and likely Shepardize the citations; I don't think I have to preach that Shepardizing cases before filing a brief is usually the rule of thumb. Rule 1.1[8] requires attorneys to keep a reasonable understanding of the technology we use and how to use it. This inherently includes knowing technology's limitations and flaws, something the NY attorney conceded he did not do with his use of ChatGPT.

Know the ABA Model Rules and your state bar's rules of ethics!

Rule 1.1 [1, 4 & 5] requires an attorney to act with competence. In this case, I have a feeling Mr. Schwartz did not follow this rule: he did not check his case law. I have some empathy for Mr. Schwartz, but I also have a feeling the bar will not feel the same way.

Happy Lawyering!!!

MTC.

My Two Cents: What is DALL·E 2 and How Can Lawyers Use It?

DALL·E 2 can help supplement the creative skills attorneys may lack when it comes to creating visual concepts. DALL·E 2 is an artificial intelligence model developed by OpenAI. It is a variation of the GPT-3 language model trained to generate images from textual descriptions. DALL·E 2 can generate original, high-quality images by interpreting and synthesizing textual prompts.

Lawyers, like professionals in various fields, can find several reasons to use DALL·E 2 in their work. Here are the top five reasons lawyers might consider using DALL·E 2:

  1. Visual Representation: DALL·E 2 can generate visual representations of legal concepts, scenarios, or evidence described in text. This can be particularly useful in courtroom presentations, client meetings, or legal documentation, where visual aids can enhance understanding and communication.

  2. Depicting Scenarios: Lawyers often need to convey specific situations or scenarios to clients, judges, or juries. DALL·E 2 can help in creating visual representations of these scenarios, making them more relatable and easier to comprehend.

  3. Conceptualizing Ideas: Sometimes, legal concepts can be complex and challenging to grasp. DALL·E 2 can assist lawyers in creating visual metaphors or illustrations to simplify abstract ideas and make them more accessible to others involved in a case.

  4. Creative Visual Content: Lawyers may require engaging visual content for marketing, presentations, or educational materials. DALL·E 2 can generate unique and customized images to create visually appealing and informative content, helping lawyers stand out and effectively convey their messages.

  5. Designing Infographics and Charts: Lawyers often use infographics and charts to present data, statistics, or comparisons. DALL·E 2 can aid in generating visually compelling infographics and charts, allowing lawyers to communicate information visually and improve the overall impact of their presentations.

It is important to note that while DALL·E 2 can provide valuable visual outputs, it is still an AI model and may not always accurately reflect real-world legal scenarios. (The photos on the carousel to the right are the results when I asked DALL·E 2 to generate a blog post picture discussing DALL·E. The results were disappointing.) It should be used as a tool to support legal work rather than a substitute for legal expertise and professional judgment.

MTC