In the second of a special series of interviews from Legal Week 2024, co-hosts Greg Lambert and Marlene Gebauer welcomed Mollie Nichols, CEO, and Mark Noel, Chief Information and Technology Officer, of Redgrave Data. Nichols and Noel discussed Redgrave Data's mission to cut through the hype surrounding legal tech innovations, particularly generative AI. Nichols emphasized the company's focus on delivering custom solutions that meet clients' actual needs and highlighted the importance of educating the legal community on effectively integrating new technologies into their practices.
Noel described the strategic addition of data scientists to the team, which enables Redgrave Data to both develop and advise on cutting-edge technologies. He stressed the importance of applying generative AI judiciously, pointing out its limitations and the potential for misuse if it is not properly vetted. Noel and Nichols shared insights on navigating the legal tech landscape, with a focus on efficiency, data management, and the careful evaluation of tech solutions.
Looking forward, Noel predicted a recalibration of expectations for generative AI in the legal industry, suggesting a period of disillusionment might follow the initial hype. Conversely, Nichols expressed optimism about the industry’s ability to thoughtfully incorporate new technologies, enhancing legal practices through careful testing and integration. Their discussion underscored the evolving nature of legal tech and the critical role of strategic implementation in leveraging its benefits.
Twitter: @gebauerm, or @glambert
Threads: @glambertpod or @gebauerm66
Music: Jerry David DeCicca
Marlene Gebauer 0:07
Welcome to The Geek in Review. The podcast focused on innovative and creative ideas in the legal profession. I’m Marlene Gebauer,
Greg Lambert 0:14
And I'm Greg Lambert. Well, this is take two, because we had a party break out as we were recording the…
Marlene Gebauer 0:21
first because it was a flash mob
Greg Lambert 0:26
And we're still here at Legal Week 2024. And when the music started, it was "get it started in here"… but yeah, I won't sing that. But we have a returning guest, Mollie Nichols from Redgrave, and she brought a guest with her today, Mark Noel, also from Redgrave. So Mollie and Mark, thank you very much for waiting. And Mark, I just found out that you're a former cop. It showed when you went out there and had a word with the DJ.
Mark Noel 1:03
It wasn't the DJ's fault. But we did have a chat with the ALM staff about putting, you know, a DJ right next to the press room.
Mollie Nichols 1:12
Yeah, it made it a little difficult to start this.
Marlene Gebauer 1:17
You know, it’s all good. We’ll look back at this and laugh. It’ll be a good story.
Greg Lambert 1:24
So, Mollie, we had you on about a year ago, well, a little over a year ago, and we'll talk about that. But we're here in New York at Legal Week. What brings you and Redgrave to Legal Week? What are you expecting to see, and what do people expect from you while you're here?
Mollie Nichols 1:44
Well, we just finished our second year, so we've…
Marlene Gebauer 1:47
Mollie Nichols 1:48
Thank you. Thank you. Yeah, we launched in January of 2022, so we've just finished our second year in existence. And so we are continuing to get our message out about who we are and what we do. It's important that we're part of the conversation, and that we understand what technology is being talked about, so that we can perhaps bring a little different message. A lot of what you see at Legal Week is the shiny objects, and you hear a lot of hype. We have a bunch of scientists and technologists on our staff who really focus on custom solutions: what our clients actually need, and what will work within their particular environment, their systems, and their processes. And right now the big buzz is generative AI: how can they implement generative AI into their workflows, into their day to day, into their legal matters? And do they even want to, or need to? So we're having those honest discussions with clients and potential clients, because a lot of what's coming up is hype. There's a lot to really look at and consider, questions to ask; it's an education opportunity. So what we've done here at Legal Week is take a room in the Hilton on Concourse G, where we're holding education sessions. Earlier today, and again tomorrow, we had one on GenAI use in TAR workflows: understanding how you can implement them, what the pitfalls are, how much it actually costs, and whether there's a cost savings. And I think we're hearing that there's probably not, at least at this point in time.
Mark Noel 4:07
A lot of folks haven't done the math on some of these things.
Mollie Nichols 4:11
And the timing. It's a process, and it takes a good amount of time. So anyway, we are going through that education piece, and that's why we're here now.
Greg Lambert 4:23
Yeah, well, let me just kind of back up. Mark, give us kind of a 30,000-foot view of why clients hire Redgrave.
Mark Noel 4:36
So, for background, we added two more scientists to our applied science team. We had Dave Lewis, who was formerly the chief scientist at Brainspace. And we added Jeremy Pickens, who was formerly at OpenText and Catalyst, and then Lindora Gray, who was one of the AI Innovator award winners at Relativity's dinner last night. A lot of what we've been doing is gearing up to be able to both build and advise on these sorts of new technologies. That's also included doing some research, including for a vendor, which I think I mentioned earlier, that hired us to help do the experimental design for their GenAI review and classification offering. So we've had a lot of stuff going on like that in the past year, in addition to clients who come to us because we will tackle problems that nobody else does, including, well, we write code in anger in the middle of a case.
Mollie Nichols 5:45
Mark, I haven't heard that one. We normally talk about it as, you know, "in the heat of battle," or "on the fly," you know, building software.
Greg Lambert 5:54
I was thinking, is Anger a new programming language? Like Rust?
Mollie Nichols 6:03
I like that.
Mark Noel 6:06
Oh, but seriously, you know, one of the things that some of our early large clients learned is that they could throw random problems at us that no one else was addressing, and often we would be able to solve them overnight. That's one of our niches: we tell people, bring us the most complicated, ugly case that you have. Because the reason we staffed up with data scientists and programmers who also overlap as former attorneys and so forth is that we've got a team that can handle a lot of problems that other folks just aren't geared for. Right? That's our sweet spot.
Marlene Gebauer 6:45
Yeah, that seems to be a theme: that quick turnaround, that kind of concierge, white-glove type of service is really what clients are looking for.
Greg Lambert 6:57
You mentioned a company reaching out and hiring you to do some testing with generative AI. And this is just a question that popped off the top of my head, so take it for that. Is there an expectation of things that people think generative AI can do, and then when you actually go to test it out, it doesn't quite work that way? And on the other side, are there things it does do well that people may not even think about? And you don't have to give away the secret sauce.
Mark Noel 7:31
Oh, yeah, no, no. But these are the kinds of questions that clients come to us with. And back to your earlier question about what we look to get out of Legal Week here: one of the things we do is walk the floor, looking at all the vendors' offerings and claims, and our applied science team has been asking them questions about how they've tested and evaluated this stuff.
Marlene Gebauer 7:59
What are you running into? Are they telling you…
Greg Lambert 8:03
All right, we'll leave that part out.
Mark Noel 8:06
Well, I mean, we will also, you know, do the proper trust-but-verify; we'll check it out ourselves. But one of the things we want to understand is what claims are being thrown at our clients, and then how we advise them on what questions they should be asking, or what evaluations they should be performing, before they adopt any of these things. And yes, there are. We may have talked about some of the GenAI pitfalls last year. But remember, these are large language models. They're not large legal-reasoning models, they're not large factual models, and they don't do multi-step reasoning. Their job is to generate plausible-looking language, and because it looks plausible, we tend to assume that there is a consciousness like ours behind it trying to communicate with us. But that's not how it's working at all. And that's one of the reasons hallucinations are a problem in all of the transformer models, and sometimes, if you need something to be factually correct, it's going to take more effort to fix the AI draft than it would for an expert to start from scratch.
Marlene Gebauer 9:33
Oh, yeah. So, um, you know, following on the education idea: last time you were here, we talked about GenAI and how it was going to hit the industry. And you were kind of afraid that people, judges, lawyers, wouldn't accept it when it came to these large language models. Now, clearly, they are embracing it. But I'm wondering where you think they are in their understanding of it. I think we've seen, on personal and professional levels, how GenAI works, and that's sort of where the buy-in comes in. But how much do they really understand at this point?
Mollie Nichols 10:18
I don't know that there's a very good understanding whatsoever. We see overreactions to it on the negative side: judges entering orders saying that you can't use artificial intelligence, or that you have to notify the court, or opposing counsel, when you are using artificial intelligence. And you just have to breathe deep when you see something like that. What did they mean by artificial intelligence? It could mean almost anything; everything we use today has artificial intelligence in it.
Marlene Gebauer 11:01
We've been using artificial intelligence for years and years, long before we had GenAI. Yes.
Mollie Nichols 11:07
And so we received some outside counsel guidelines recently that said no artificial intelligence, and it's like, wait, that's what we're in the business of doing. So those are the types of things where you see the overreaction, the negative reaction, to the use of generative AI. And then on the positive side, it's, oh, this can do my job, let me have it draft everything. And then the problems that result from that, when you're seeing briefs filed with cases that don't exist, and things like that. So we have to look at that balance and have that education piece so that lawyers understand how they can fit it into their practice, because it will be a valuable piece of their practice. But it's really drilling in to understand how it fits, doing the appropriate testing, and then doing the validation of what they're doing in order to understand how it will work. I know we're doing a lot of testing putting GenAI in a TAR workflow. And if you just change your workflow to all GenAI, you're going to have some huge problems. But if you use your standard workflow and start picking certain pieces of it to use GenAI on, then it's reasonable. You can make tweaks to the process without having a disaster, without being that party that gets written up in, you know, the next edition of…
Mark Noel 13:00
Well, the result may not even be a disaster, except for your checkbook. Right? We were talking about this earlier today. Say you had a corpus of 500,000 documents and you wanted to use a generative AI engine to classify those documents. You're going to be charged by the token, which is about three-quarters of a word, so figure on paying about 10 cents per doc in API charges to do one pass, one ranking or classification. That's $50,000 for your half million documents, not including what you're going to pay the lawyers to come up with the prompts, do QC, and anything else like that. With a standard TAR process, you'd be looking at $10,000 or $12,000 of attorney time for the review. And the older TAR engines will rank in five minutes, while the GenAI will classify about 20 docs per minute, so it will take two and a half weeks to get through your half-million-doc corpus. So a lot of times, just using the stuff blindly does not make sense from a time perspective or a cost perspective. But as Mollie said, there are ways we can use it more cleverly, say, to find initial training documents for a more traditional TAR engine that will then classify the bulk of your documents a lot more quickly, but without as much human review up front for the training.
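Mark's back-of-the-envelope numbers are easy to reproduce. The sketch below uses only the figures he quotes on air (10 cents per document, 20 documents per minute, $10,000 to $12,000 of attorney time); they are his stated assumptions, not measured benchmarks:

```python
# Back-of-the-envelope comparison of a one-pass GenAI classification of a
# review corpus vs. a standard TAR workflow, using the figures Mark cites.

DOCS = 500_000                        # size of the hypothetical corpus

# Generative AI pass: token-metered API pricing works out to ~10 cents/doc.
genai_api_cost = DOCS * 0.10          # one ranking/classification pass
genai_minutes = DOCS / 20             # throughput: ~20 docs per minute
genai_weeks = genai_minutes / (60 * 24 * 7)

# Standard TAR: the ranking itself takes minutes; attorney review time
# dominates, at the upper end of the $10-12k estimate.
tar_attorney_cost = 12_000

print(f"GenAI API charges: ${genai_api_cost:,.0f}")   # $50,000
print(f"GenAI wall clock:  {genai_weeks:.1f} weeks")  # ~2.5 weeks
print(f"TAR attorney time: ${tar_attorney_cost:,}")   # $12,000
```

At roughly $50,000 and two and a half weeks per full-corpus pass, the GenAI route costs about four times the attorney time of the standard TAR review before any prompt-writing or QC is counted.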
Marlene Gebauer 14:41
So these discussions are kind of an opportunity to refocus clients on other technology that will actually solve the problem.
Mollie Nichols 14:50
Correct, correct. And when you talk about the cost in particular, you look at tools in their toolkit that they already have, where it won't cost them anything, whereas if they're using, you know, GenAI, it's going to cost quite a bit of money.
Greg Lambert 15:13
We've kind of talked about this before, that there's this layering of existing technology and AI, generative and extractive. So are you coaching your clients to look at the whole spectrum of tools they have? It's not just one tool for all problems, but this blending of tools. And are you finding them receptive to that? What's kind of the feedback you're getting from clients?
Mollie Nichols 15:50
Yes, we're finding they want to use the tools they have. And so they like the message that we're not trying to sell them a shiny object, something new to bring in as that sort of silver bullet, but instead to look at whatever AI they want to bring in, see what would fit within what they have, fill that gap, help with whatever automation they want to have, and make sure it integrates with the rest of their technology stack.
Mark Noel 16:22
And a lot of times the best solution is not going to involve generative AI. We're going to be running automation, writing some glue code to help tie some of their existing systems together, coming up with some better workflows and things like that. There are going to be things that generative AI is good at, and we're going to find uses, like in TAR, and for summarization, investigations, and so forth. But for writing legal documents and other things like that, it tends to not be cost-effective still.
Greg Lambert 17:00
What about generative AI and, I know, one of the early use cases that I remember hearing about, gosh, probably back in March, when it came to ediscovery. And we had a guest on earlier who talked about this. One of the issues over the past few years is that people have become a little bit more savvy about not just saying everything, you know, writing everything down and sending it out, but being very creative about when it's time to take the conversation offline. And so one of the early use cases I heard was that people thought generative AI would help identify when there are changes in the conversation, in the flow of the information going back and forth. Are you seeing any of that?
Mark Noel 18:00
Well, that is a testable question, right? And one of the things we like to do with testable questions is, obviously, test them. But what you're describing is also what has been built into what you've sometimes heard called diversity algorithms, which look for novelty, or do outlier or anomaly detection, in the language. And as with TAR for classification, using some of these algorithms for anomaly detection may be as good, or almost as good, but 20 times cheaper than generative AI. This is one of the areas where there's a lot of academic research going on that we're following. But again, we run into the same issue there as we do with TAR and classification: sometimes the 20-year-old techniques are way faster, way less expensive, and good enough.
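As a toy illustration of the sort of decades-old technique Mark means, and emphatically not Redgrave Data's actual tooling, a message whose vocabulary looks like nothing else in the corpus can be flagged with plain bag-of-words vectors and cosine similarity, no generative model required:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def novelty_scores(messages):
    """Score each message by 1 - (max similarity to any other message).

    High scores mark language unlike the rest of the corpus: the cheap,
    classic anomaly-detection idea, not generative AI."""
    bags = [Counter(m.lower().split()) for m in messages]
    scores = []
    for i, bag in enumerate(bags):
        best = max((cosine(bag, other) for j, other in enumerate(bags) if j != i),
                   default=0.0)
        scores.append(1.0 - best)
    return scores

messages = [
    "load 42 delivered on time to the warehouse",
    "load 87 delivered on time to the depot",
    "load 19 delayed by weather near the depot",
    "call me on my personal cell about this",   # shift away from routine traffic
]
scores = novelty_scores(messages)
flagged = messages[scores.index(max(scores))]
```

Here the off-channel message ("call me on my personal cell…") gets the highest novelty score, since it shares almost no vocabulary with the routine dispatch traffic; production systems use far more robust features, but the core idea is this cheap.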
Marlene Gebauer 19:09
I'm wondering about other changes in the industry that you've needed to respond to. Is there anything else besides generative AI?
Mollie Nichols 19:20
Where do we start? Top three? Well, you know, one of the things we're finding is that clients want insight into efficiencies, and they want to know how to measure things. And that's speaking our language. It's a lot of analytics, but it's understanding what data needs to be collected in order to measure it, and then how you analyze it once you've collected and measured it. So we build data visualizations to help our clients: we help them identify what needs to be measured, and then give them a mechanism for that visualization so it can help them with decision making. Many times that's in the discovery space, but it can be in legal operations and other areas as well. We have one client who has us doing this with the five different ediscovery providers they use. They want to be able to compare them, to look at apples to apples, and to look at the efficiencies of each one of their ediscovery vendors. And so that's something we're assisting them with. It's important, because as we all know, every service provider presents their services in a different way: the units, the way they host data, the way they package things, and the way they price it. So it's not always easy to just overlay and compare. And it's not just the pricing we look at when we're helping our clients; we have a list of, what, 75 different KPIs that we help our clients look at and measure. Do you have anything else, Mark?
Mark Noel 21:42
Well, I mean, one of the tricks there, as Mollie was alluding to, is figuring out what metrics or what data will actually answer the question. And that's one of the areas where both our experienced practitioners and our applied science team come in, because there are a lot of ways you could measure these results that would not necessarily be valid or give you good answers.
Mollie Nichols 22:13
We're also finding that the volume of data, you know, continues to grow, and some of the review tools that exist today are just too expensive to use. We were approached by a trucking client that had two billion short messages from a homegrown system, actually two homegrown systems, and it also had geolocation data. So it was communications from the truckers back to dispatch, and all of that had to be put into…
Mark Noel 23:01
Well, it needed to be full-text searchable, but we also needed to be able to tie it together with stuff in similar geographic locations and at similar times. The other fun part is that this data was so large that the producing party could only host three months' worth in the systems that generated it, but our client had 12 years' worth produced, and it all needed to be indexed and available for search in the same corpus.
Greg Lambert 23:34
Are they backing this up in a cloud, or was it a physical drive somewhere else?

Mark Noel 23:39
That's another story, because the productions were all over the place. And we had to identify several that were corrupted, or that came from the wrong sources, or that had an unusual deficit, like a third of the expected message traffic for several months compared to the rest of the years, and then provide affidavits for a motion to compel.
Mollie Nichols 24:07
You know, and so we ended up building… I wanted to talk about the build, but Mark, you're the one who did it, so if I start in, it won't come out quite right. Well…
Mark Noel 24:22
So, as Mollie was saying earlier, with a lot of traditional ediscovery platforms it would be cost-prohibitive to put that much data in. We used AWS CloudSearch, a few custom scripts, and a search interface, so that we could load all of that data but then spin it down when the client didn't need to search it. And once it's spun down, it only costs, you know, $160 to $170 a month to keep it ready. Then when they say, hey, we need to start searching again for depositions or motion practice, we can spin it up in a day or two, they use it for as long as they need, and then it goes back down. So it's a lot more cost-effective than a traditional ediscovery tool hosting that many terabytes of data.
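The economics Mark describes reduce to a simple model: pay a small keep-warm charge while the index is spun down, pay full cluster cost only in active months, and compare that to continuously hosting the corpus on a per-gigabyte review platform. Only the roughly $165-a-month idle figure comes from the conversation; the active-month cost, corpus size, and per-GB hosting rate below are invented purely for illustration:

```python
# Illustrative cost model for the spin-up/spin-down search archive Mark
# describes vs. continuous hosting on a traditional ediscovery platform.
# Only IDLE_COST reflects a figure quoted in the conversation; the other
# rates are assumptions made up for this sketch.

IDLE_COST = 165.0       # $/month to keep the spun-down index ready ($160-170 quoted)
ACTIVE_COST = 3_000.0   # assumed $/month while the search cluster is spun up
HOSTED_RATE = 10.0      # assumed $/GB/month on a per-gigabyte review platform
CORPUS_GB = 4_000       # assumed size of ~2 billion short messages plus geodata

def spin_model_cost(months_total: int, months_active: int) -> float:
    """Total cost when the index is live only during active months."""
    idle_months = months_total - months_active
    return months_active * ACTIVE_COST + idle_months * IDLE_COST

def hosted_cost(months_total: int) -> float:
    """Total cost of keeping the corpus continuously hosted."""
    return months_total * CORPUS_GB * HOSTED_RATE

# One year with two months of deposition/motion-practice activity:
spin = spin_model_cost(12, 2)    # 2 * $3,000 + 10 * $165 = $7,650
hosted = hosted_cost(12)         # 12 * 4,000 GB * $10    = $480,000
```

Even with these made-up rates skewed generously toward the traditional platform, the spin-down approach comes out more than an order of magnitude cheaper, which is the point Mark is making.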
Greg Lambert 25:22
Well, that’s very creative. I like how you did that. So.
Marlene Gebauer 25:27
So, Mollie, you've been here before, so you know about the crystal ball question. We're going to ask both of you: what do you see as the biggest challenge the legal industry is facing over the next two to four years, in your space or in general?
Greg Lambert 25:52
Yes, yeah.

Marlene Gebauer 25:53
He's a newbie, he gets to go first.
Mark Noel 25:56
Fair enough. Well, in that case, I may offer you more than one. I think it's Gartner that has the hype cycle: first the peak of inflated expectations, then the trough of disillusionment. And I think we're about to hit the trough in 2024, because as people start to deploy these things, they'll realize that GenAI isn't the magic bullet, or it wasn't the right technology choice for that particular task, or it's too expensive. As we said, there are a lot of things it would be suitable for, but you have to be a lot more careful with the choosing and the evaluation. So I think we're going to hit that trough in 2024, and there's going to be a bit of a correction. But the other challenge I see is also something Mollie mentioned earlier, and that is clients, but not just clients, also courts and government agencies and regulators, overreacting and saying no AI for anything, without even defining what AI means, or whether they're restricting only generative AI or also including 30-year-old machine learning techniques that we're using for everything else. Dealing with those hurdles, not just from clients but regulatory and legal hurdles, I think is going to be another area where we're going to have a lot of work to do.
Greg Lambert 27:30
All right. So he went first. So you get to go next.
Mollie Nichols 27:33
So I'm going to be a little more optimistic. I think we've learned a lot in the past decade-plus with regard to TAR. We resisted it, we did everything we could to say it was a black box, we had courts push it away. I think with generative AI, we're actually going to look at it and say: how can this help us, and in what part of our process can this actually make a difference for our clients? We're going to look at it, we're going to educate ourselves, we're going to test it. And we're not going to just wholesale adopt it and throw it in; we're going to try different things, and then we're going to enhance our process. I think it's going to be this iterative change, because we have such a thoughtful industry now. Before, it was just the wild, wild west. And there's still some of that, don't get me wrong. But we have enough very smart people who have been in this industry long enough to know: don't just throw it at the problem. Let's go through a process, let's start testing it, and do this the right way.
Greg Lambert 29:01
Excellent. Well, Mollie Nichols and Mark Noel, thank you very much for taking the time to talk with us here at Legal Week. And Mark, thanks for going out and dealing with the DJ for us.
Marlene Gebauer 29:14
Thank you both. Indeed, it's our pleasure. And of course, thanks to all of you, our listeners, for taking the time to listen to The Geek in Review podcast. If you enjoy the show, share it with a colleague. We'd love to hear from you, so reach out to us on social media. I can be found on LinkedIn, on X at @gebauerm, and on Threads at @mgebauer66.
Greg Lambert 29:35
And I can be reached on LinkedIn, at @glambert on X, or at @glambertpod on Threads. Mollie, if someone wants to learn more about Redgrave, where do they need to go?
Mollie Nichols 29:47
Sure, on our website at Redgravedata.com
Greg Lambert 29:51
Alright, thank you very much.
Marlene Gebauer 29:53
And as always, the music you hear is from Jerry David DeCicca. Thank you, Jerry.
Greg Lambert 29:59
You can also hear us on the Legal Talk Network.
Marlene Gebauer 30:03
They’re a rowdy group