Couldn’t help myself. I encountered a tweet about a “robot lawyer” and took the bait. I’m a moron.
An unwise decision. Silliness promptly followed. To preview, Robot Lawyer LISA is just another document assembly tool with a single mediocre form (an NDA). For what it is—consumer-facing doc assembly—the concept and content are fine relative to what else is available. The claims to be something more—an AI robot lawyer—are absurd. The hyperbole, however, is effective (here I am writing about it like a sucker) and unsurprising given that we (hopefully) just passed the peak of another hype cycle.
As document assembly tools go, the Robot Lawyer LISA’s UI and UX are reasonably slick if slightly buggy.* It’s built on Neota Logic, a platform I like. With respect to the content, I outsource the analysis to the incomparable Mr. Contract, Ken Adams (see my previous post on Ken and online legal forms):
I had a look at the fruits of your dalliance with LISA the AI Lawyer. The best I can say is that someone who fraternizes with LISA might well end up with something more suitable than if they had grabbed an NDA at random from the great online junkyard.
The questionnaire offered is basic. The annotations offered are rudimentary. The guidance mostly comes in the form of an AI-free PDF. But that’s probably OK—LISA is aiming for the unsophisticated end of the market.
The language in the output document is Clunky Traditional, English Division. I could write a book about its shortcomings. In fact, I already have. That said, it would be delusional of me to fault the language because it doesn’t comply with my guidelines, given that my guidelines are still, uh, pioneering, particularly in England.
In the two minutes—really—I devoted to substance, I spotted two issues. A recital refers to information that might be “confidential or proprietary in nature.” The word proprietary doesn’t make sense in this context, as I discussed in this 2010 blog post. It’s an unnecessary mistake for LISA to make, given that the word doesn’t occur in the body of the contract. But it doesn’t bode particularly well.
And I noticed this sentence in the PDF: “The main reason and benefit of using a deed rather than a simple agreement is that confidential documents or information provided BEFORE the NDA is signed will be covered by the deed.” That strikes me as debatable: if as part of getting more confidential information I agree to keep confidential any information disclosed previously, my promise is supported by consideration without my having to resort to the magic-words contrivance of describing the contract as a deed.
Further rooting around would likely raise further issues. That said, the substance is likely treated no worse than it is in the mass of stuff out there.
I realize I’m setting the bar low, but we’re dealing with business contracts, where dysfunction is the norm, so you can set the bar low and conceivably still be useful. But don’t expect me to applaud. Given the brave-new-tech-world trappings, I would have expected something a bit more ambitious, in terms of technology and content, from LISA the AI Lawyer.
I didn’t share Ken’s expectations. Vain hope, sure. But no room for genuine disappointment. Grandiose claims about “robot lawyers” put my BS detector on high alert. “Artificially intelligent”, “robot”, and “lawyer” are vague terms that continue to be stripped of meaning by overuse. Robot Lawyer LISA takes this vacuousness to new heights.
AI is a broad field. I’m comfortable with expert system platforms like Neota Logic being considered a form of AI. I don’t have the chops to argue otherwise. Still, I doubt it conforms to what most people today think of when they hear the term “artificial intelligence.” We constantly move those goalposts: “It’s only AI when you don’t know how it works; once it works, it’s just software.”
At a recent conference, I presented with co-Geek Ryan McClead, a VP at Neota, who recounted many debates about whether his product qualifies as AI. His killer rejoinder (paraphrasing) is that it doesn’t matter whether the technology conforms to someone’s subjective definition of AI; what matters is whether it solves a real problem better than what is currently available. Hear, hear!
I had to try Robot Lawyer LISA for myself because I could not elicit a coherent answer on what made it superior to the available alternatives. Like Ken, I found it, at best, comparable to what has been around for years. If Robot Lawyer LISA is AI, so are all the other consumer-facing dynamic document assembly platforms. Which is to say the AI label is not a useful distinguishing factor.
As the person behind the account surely knows, the definitions of “robot” are broad. Most people probably picture a mechanical humanoid.
But there are software robots. So I guess, technically, we can take the broadest definition and call any form of software automation a “robot”, just as we can call it all “AI.” This, however, makes “AI robot” both redundant and virtually meaningless as a descriptor.
There seems to be no statutory definition of “lawyer” in the UK (happy to be corrected on that). Yet Robot Lawyer LISA does not satisfy any of the plausible candidates I located:
From The Law Society:
Lawyer – a member of one of the following professions, entitled to practise as such:
- the profession of solicitor, barrister or advocate of the UK
- a profession whose members are authorised to carry on legal activities by an approved regulator other than the Solicitors Regulation Authority (SRA)
- an Establishment Directive profession other than a UK profession
- a legal profession which has been approved by the SRA for the purpose of recognised bodies in England and Wales, and
- any other regulated legal profession specified by the SRA for the purpose of this definition.
From Slater and Gordon:
The term Lawyer is a generic term used to describe anyone who is a Licensed Legal Practitioner qualified to give legal advice in one or more areas of law.
From Oxford Dictionaries:
A person who practises or studies law, especially (in the UK) a solicitor or a barrister or (in the US) an attorney.
Best I can tell, Robot Lawyer LISA is not a member of any profession, not entitled to practise, not licensed, not qualified to give legal advice, and not a person, let alone a person who practises law. Indeed, the site delivers the disclaimers you would expect from an ordinary online document assembly service:
this App is made available to you strictly on an ‘as-is’ basis and we give you no warranty, guarantee or assurance of any kind about this App.
In particular, the information provided may be incorrect or out of date, and may not constitute a definitive or complete statement of the law or practice in any area and the output of the App may not be suited to your particular purpose.
The information provided is not intended as, and does not constitute, legal advice in respect of any specific situation or for any particular purpose. You should take your own legal advice in respect of specific situations and conduct your own research into the suitability of lawyers before appointing them.
So Robot Lawyer LISA does not give legal advice. Instead, it counsels lay consumers to take their own legal advice (huh?) and do their own research before appointing a lawyer (and here I thought I already had a robot lawyer in LISA). Weak sauce.
I am sure someone from the Robot Lawyer LISA team can point me to a nebulous definition of “lawyer” that encompasses their app. But that will only prove that words have no meaning, and we are all living in the fever dream of a stoned college sophomore who is encountering Wittgenstein for the first time.
Whether there is some tortured, technical sense in which LISA can be called an AI Robot Lawyer is irrelevant (to me). What matters (to me) is that labeling LISA an AI Robot Lawyer does not convey any useful information to the consuming public.
This prompts two questions that share an answer:
1. Why do damage to the English language in order to call LISA an AI Robot Lawyer?
2. Why do I care?
Because it works. At least in the short term. These days, it would be hard to garner press coverage for launching yet another doc assembly app for basic contracts. You won’t be invited to keynote any conferences for providing lay consumers a single mediocre form to fill out online. But put “AI” or “robot” in the press release, and the near-term coverage will be considerable. So you should probably use both. And it isn’t just chumps like me who read everything. It is the headline skimmers in positions of power.
I run into too many legal operations folks in corporations and firms suffering from hype fatigue. They never bought into the hype themselves. But their superiors see so many articles about AI and robot lawyers that orders come down from on high to investigate this promising new frontier. The superiors are expecting robot magic. The operations folks come back with a smattering of point solutions, most of which are useful, but none of which live up to the hype. This exercise in chasing shiny objects wastes everyone’s time, including the providers, who actually have worthwhile, if narrow, products to offer.
Likewise, I’ve endured too many godawful keynotes where people run through some back issues of Wired and then conclude with “and it’s coming to law”, as in:
Watson won Jeopardy! And now he’s not only curing cancer but also making gourmet meals. Yada yada yada. Moore’s law. Yada yada yada. Alexa. Siri. Yada yada yada. Augmented reality. People are controlling drones with their brains. Yada yada yada. Blockchain. IoT. 3D printing. Quantified Self. Chatbots. Self-driving cars. Machine learning. Quantum computing. Cold fusion. Red mercury. Yada yada yada. And it’s coming to law.
Despite my deep annoyance at Robot Lawyer LISA and my even deeper disappointment in myself for taking the bait, this is where I have to give the usual caveat that it is not all fluff. Document assembly and automation are still, decades later, underutilized in the legal space. Consumer-facing forms fill a genuine void. Neota Logic is a great platform that underpins all sorts of interesting offerings (e.g., the Akerman Data Center). There are other solid companies out there using various forms of AI to introduce needed tools to the legal market. And some of the best keynotes I’ve ever had the pleasure of attending have AI as a core theme (e.g., I witnessed Dan Katz’s phenomenal ILTA keynote live and then watched it again online).
There is real innovation happening in legal that is well worth paying attention to. But this ain’t it. This just adds to the deafening cacophony of hype-driven noise. Yet I can’t blame the folks running Robot Lawyer LISA. Start-ups are hard. The legal market is an especially tough nut to crack. They found a way to get noticed. That their marketing annoys a curmudgeon like me is, for them, probably just an added bonus. Ultimately, I have to score this round for Robot Lawyer LISA because I just wrote a long post about a vanilla doc assembly offering with only one form.
*Robot Lawyer LISA’s UI and UX are solid. But, despite selecting the United States as my jurisdiction, I could not move forward without a “Company Number.” I’m assuming this refers to a CRN (Company Registration Number), a UK identifier with no meaningful U.S. analogue (I’m not going to put my EIN in an NDA). My text response—”don’t have one”—made it into the assembled document.
In addition, I was required to provide a backup email for me and my counterparty. It is rare that I have a second email address for someone I am just starting to do business with.
I also did not see any esignature functionality, which, to me, is a key feature in the contract space.
Finally, I should probably mention that Robot Lawyer LISA’s other differentiator is supposed to be impartiality. Instead of guiding only the author through the document assembly process, the counterparty can opt to walk through the same guided process. I’m unmoved. I don’t know if it’s novel in this space. I’ve definitely encountered bilateral contract collaboration platforms on the corporate side. But maybe it really is some sort of game changer that I am too jaded to appreciate. If it ever gets to the place where it can help creatively resolve disagreements about contractual content (e.g., combining Ken’s insights on contract language with, say, the choiceboxing techniques of Marc Lauritsen, who also happens to be the godfather of legal document assembly), then I will revise my opinion and apologize for my rank cynicism.