
Have You Met My New Associate?
Trial & Litigation Section Chair: Jason Whittemore – Wagner McLaughlin Whittemore
Are you using ChatGPT to write your briefs yet?

Generative AI is computer software that creates something, such as drawing a picture, fabricating a photo, or drafting a legal brief. In most contexts today, we have access to generative AI in the form of chat interfaces, with ChatGPT currently the most well-known. These AI chat interfaces operate by predicting the most likely next word or sequence of words in response to a user-input prompt; note that “most likely” is not the same as “correct.” There are variations on the theme, but that’s the basic model. And, for example, ChatGPT will generate a legal brief if prompted to do so.
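For readers who want to see the idea in miniature, here is a toy sketch in Python of that word-prediction loop. It is emphatically not how ChatGPT actually works under the hood (real systems use large neural networks trained on enormous text collections, not a simple frequency table), and the tiny “training” sentence is invented for illustration:

    from collections import Counter, defaultdict

    # A toy "training" text, invented for illustration.
    corpus = ("the court held that the motion was denied and "
              "the court held that the appeal was dismissed").split()

    # Count which word tends to follow each word (a bigram table).
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def generate(start, length=8):
        """Repeatedly append the statistically most likely next word."""
        words = [start]
        for _ in range(length):
            options = following.get(words[-1])
            if not options:
                break
            words.append(options.most_common(1)[0][0])
        return " ".join(words)

    print(generate("the"))
    # Prints: "the court held that the court held that the"
    # Fluent-sounding, but the program has no idea whether it is true.

Scale that loop up by a few hundred billion parameters and you have the basic shape of a modern chatbot: fluent, confident, and with no built-in notion of truth.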

Generative AI in the form of chat interfaces is a powerful tool that can “dramatically improve the efficiency of a lawyer’s practice.” However, while our faith in technology is generally rewarded, a little skepticism is advisable when adopting a new technology as radical as the latest version of artificial intelligence in the practice of law.
AI chatbots can hallucinate, if you didn’t know. In AI jargon, hallucinating means the chatbot is fabricating things from whole cloth. This isn’t so bad if it’s writing a children’s fairy tale. But when you’ve asked ChatGPT to draft your brief to the 6th DCA, hallucinations can get you in a lot of trouble. As several lawyers and litigants around the world have already learned, using computer software to automatically write your legal briefs may not be so efficient when you include the time to respond to show cause and sanctions orders.
Disgraced former lawyer Michael Cohen got his lawyer in hot water when he helped draft the legal briefs requesting early termination of his supervised release. Apparently, Cohen sought the assistance of Google’s version of AI chat, called Bard, to ease his effort. Bard, however, hallucinated at least three cases into existence, and those fake cases made their way into the brief. Cohen dutifully passed the brief along to his lawyer, who didn’t check the citations. U.S. District Judge Jesse M. Furman did check, and he entered an Order to Show Cause stating, “As far as the Court can tell, none of these cases exist.” This left Cohen and his lawyer scrambling to explain themselves. They’re not the only ones.
In another case, Mata v. Avianca, Inc., a New York state-court personal injury attorney was out of his depth arguing a federal bankruptcy and statute of limitations tolling issue.
For research and drafting help, he turned to ChatGPT and only edited for “flow.” He later testified he knew so little about ChatGPT that he was unaware it could ever be wrong. U.S. District Judge Kevin Castel had a different view, finding that the attorneys at the firm “abandoned their responsibilities when they submitted non-existent judicial opinions with fake quotes and citations....” The lawyers were sanctioned $5,000, which seems altogether mild. There are a handful of other published orders in the last year addressing similar conduct by attorneys who filed briefs citing AI hallucinations.
The Florida Bar recently issued a non-binding ethics advisory opinion (24-1) on the use of generative AI, which provides a good start for analysis.1 And if you decide to give ChatGPT a try, just don’t forget to Shepardize.
1 https://www.floridabar.org/etopinions/opinion-24-1/.
Author: Morgan W. Streetman – Streetman Law
Join the Trial & Litigation Section at hillsbar.com.