(l-r) Steve Harmon, vice president and deputy general counsel of Cisco, Mary O’Carroll, head of legal operations at Google, and Sylvie Stulic, manager of legal operations and litigation at Electronic Arts. (Photo: Zach Warren/ALM)


The opening keynote of the Corporate Legal Operations Consortium’s (CLOC) 2017 institute showed a live demo of various technologies using Amazon.com Inc.’s Alexa. But while the automated file and contract searches elicited “oohs” and “ahhs” from the assembled crowd, one question remained: How exactly does artificial intelligence (AI) technology apply practically in the legal department?

One of the very first sessions of the conference, “Practical Applications of AI in Today’s Law Department,” attempted to answer that question. The panelists were three corporate legal operations experts giving their own experiences: Mary Shen O’Carroll, head of legal operations, technology and strategy, Google; Steve Harmon, vice president and deputy general counsel, Cisco; and Sylvie Stulic, manager of legal operations and litigation, Electronic Arts Inc. Paul Lippe, former GC of Synopsys Inc. and currently with Elevate, moderated the panel.

What’s the Problem?

It starts with the fact that AI isn’t particularly well-defined, Lippe noted in his opening remarks. For many, there are three main descriptions of AI in legal departments:

  • An ill-defined term for some future technology that will do something, someday.
  • A general term for using computing power and data to solve specific legal problems.
  • Technology that defines the boundary between what humans and machines are doing.

And what’s the correct answer? According to Harmon, all of them.

“AI is subject to a number of different opinions at this point in time, and all of them are correct … until we get some standards,” he explained.

While those standards are coming sooner rather than later, especially as lawyers begin to see AI’s practical uses, there remains the issue of actually getting people to adopt that technology.

“Part of this is entry strategy. Part of what we need to do as operations professionals is get people to adopt this technology, and that technology may be disruptive to their jobs,” he explained.

Shen O’Carroll added that many of the people she talked with at CLOC 2016 left with the impression that AI technology wouldn’t be relevant for another 10 to 20 years, but in her view, the time to adopt is now.

“I don’t think you need to be super smart in AI to know the practical applications of how to make it happen,” she said.

How’s It Being Fixed?

While some think AI could completely displace lawyers, the panelists all agreed that practical AI requires three pieces to work: people, process and technology. In this case, the three pieces are experts who define the solution and help oversee its implementation; the service model that delivers the work; and AI that delivers information tracking, workflow and reports to improve efficiency.

But making that work may require a different way of looking at law. “We write [contracts] like Jane Austen novels, but we need to interpret them like they’re data,” Lippe said.

To make that change, Harmon said, “You have to address the adoption challenge first, and the way you can do that is make lawyers think they’re more efficient than when they first started.”

For one practical example, Shen O’Carroll noted a way Google recently put AI to use: a self-help expert system that cuts down on emails to and from legal’s help desk. The system was disruptive for attorneys who wanted ownership over their clients and questions, but once they realized how much time they were saving by not answering simple emails, they quickly bought in.

“Clients are getting responses faster, emails have been cut down, and satisfaction is way higher,” she said.

That doesn’t mean that everything will go off without a hitch. Stulic told the story of a technology system automatically sending a save-the-date calendar invitation from her desk, without any prompting from her or anyone she worked with. She still doesn’t know how it happened, but speculated the email was triggered by a conversation the technology overheard in her office.

“The mystery of AI looms, because the technology is still in its infancy,” she said. As a result, legal departments need to anticipate and work through those inevitable hiccups.

So Where Do We Go From Here?

The panel said that any legal department looking at AI technology needs to ask three main questions:

  • How is the way we are doing work aligned with what really matters to the internal client and other stakeholders?
  • If we use AI, what aspects of the work can the AI improve, and what do we risk making worse? How do we measure the difference?
  • How can we learn from every stage of our use of a new tool to continue to improve our work?

Stulic said to look at where the department can become more efficient before taking any action.

“[Attorneys] don’t have the time to process NDAs, or answer those emails,” she said. “It’s a matter of looking and seeing what you can cut first.”

Harmon added that part of the process of answering these questions is adopting and molding emerging technologies to what the legal department requires. In analyzing contracts in years past, Cisco’s legal department would “point our e-discovery tool at our contracts and ask it questions. Pretend like we were suing ourselves.” Now, though, his department has done a thorough assessment of what technologies actually make sense for the department’s attorneys. As he put it, “We can all buy Rory McIlroy’s golf clubs, we can’t buy his game.”

And the time to do this sort of assessment is now, the panel all agreed. “We don’t have the resources waiting for the next fire to happen,” Harmon added, saying that CLOC members have started working together to standardize data sets to better prepare for those fires. “We’re from technology companies; let’s solve the problem for our vendors.”

Contact Zach Warren at ZWarren@almg.alm.com.