
by Andrew Flake

You may have read that, at the close of the year, artificial intelligence (AI) hit the mainstream. OpenAI released a tool, ChatGPT, with some remarkable capabilities. Among other things, it can generate poetry and essays, conduct research, and manipulate vast amounts of data, transforming millions upon millions of examples of human creation into new forms.

What do ChatGPT, and similar applications that will undoubtedly follow, portend for the practice of law? What does it portend for litigation? For ADR?

It’s too early to say, although looking at disruptive technologies of the relatively recent past, we can safely predict a mixed bag, with the extreme outcomes at either pole less likely. As with earlier technologies in the law, I don’t see the most dire predictions, those in which clients will no longer need lawyers, coming to pass.

Instead, I would expect the practice to evolve, like the rotary phone to the cellular phone, and not to disappear, like the VHS cassettes and DVDs that preceded streaming. Keep your bar membership active. Some fundamental aspects of what we do that make me confident in that prediction include:

  1. We are experiential strategists, drawing on the experience of others and our own experience to make complex judgment calls, doing so in the context of a hundred variables in every case, from interactions with other counsel to the emotions of the client.
  2. We are empathetic listeners, building bonds with clients even as we assimilate information.
  3. We are creators, drawing together information in surprising and unexpected ways, a process that is infinitely complex and sometimes, a bit mysterious.

Situational judgment and strategy, empathy, and creativity are all fundamentally human, and do not translate into algorithmic or emulative expression.

Will we be relying more on software in the coming years for increasingly sophisticated case analysis and even prediction? Yes, and those changes have been coming for some time. They may not be as dramatic as the pandemic-induced adoption of virtual proceedings, but they do feel inevitable.

Take, as an example, the process of document review: I smile remembering, almost 25 years ago, sitting in a conference room for five days with a fellow associate and a pot of coffee, poring through dozens of banker’s boxes of documents. Today, that same process can be accomplished in a few hours. In the same way, where the task involves the manipulation, review, or assessment of vast amounts of existing data, automation and AI simply make more sense.

Still, it takes judgment to determine how and when to use those tools, to understand their limitations, and to curb their excesses. “Code is law,” as Lawrence Lessig famously said, meaning that the rules that structure legal technology can become the rules that determine legal outcomes. It is the lawyer as human advisor, imbued with a sense of professionalism and fairness, of due process and of ethics, who can best assess those rules. The protection of our freedoms and legal values is rightly reserved to us.

It takes good listening to build the trust with clients that lets us understand their worldview, their problems, and how they are best resolved. And creativity — well, I can already tell, after experimenting with some of the AI-generated text for a time, that there is a certain sameness to much of it. Putting aside the question of authenticity — whether a poem crafted by AI about a mountain has the same truth as a poem written by a hiker who has climbed it — the AI text is missing a certain spark and spontaneity.

As impressive as their capabilities are and will undoubtedly become, ChatGPT and like applications are still vehicles requiring drivers. Setting strategy, listening and relationship-building, and creativity are fundamentally not programmable. Nor, I’d submit, do we want them to be. Happy New Year!