Ironclad's Chief Community Officer, Mary O'Carroll, has spent the past two decades bringing business acumen to the legal industry. In an industry run by lawyers, most of whom have little to no business training, Mary points out that it makes sense for legal ops teams to serve as the right-hand people who help lawyers navigate the business side of practice. Her experience at Orrick, Google, CLOC, and now Ironclad has one common thread: the need to drive change. Mary says it is simply part of her personality to be laser-focused on efficiency and to find ways to clean up the messes she uncovers in the legal industry.
It is that desire to drive change through the legal community that led her to join Ironclad and the hot field of Contract Lifecycle Management (CLM). Mary points out that the industry has improved efficiency in many areas, but when it comes to contracts, we continue to do business as usual. A digital contracting system will help the industry scale and let us leverage the data that has always been trapped in contracts. It will also give the legal department new ways to drive the overall success of the business, so it is no longer seen as the place where ideas and innovation go to die.
Our own Casey Flaherty advises us to stop trying to be a hero and to learn to say no when it comes to spreading resources too thin. Check out his latest article, “Maybe, Don’t Be MacGyver – The Value of Value Storytelling.”
Singapore is launching a couple of Dalek-looking robots to monitor “undesirable behavior” among its citizens. Is this a logical use of technology or a slippery slope toward technology overreach?
O’Melveny & Myers is the first law firm to join Peloton’s Corporate Wellness Program.
The next time you go through a drive-thru, you may hear the crisp, clear voice of an AI program taking your order. Will the robots take more and more of the service jobs away, and will there be a shift in the way the government taxes those robot workers who replace humans?
If you like what you hear, please share the podcast with a friend or colleague.
As always, the great music you hear on the podcast is from Jerry David DeCicca who has a new album coming out in October!
Brad Blickstein, Principal at the Blickstein Group, a research and advisory firm for both in-house legal departments and outside law firms, joins us to talk about legal operations and his recent experiences at the 2019 CLOC Institute in Las Vegas. As with many great conferences, the programming between 9 AM and 5 PM is good… but the conversations from 5 PM to 9 PM (or 5 AM, this was Vegas) are what make the gathering really special. We’re calling it #CLOCAfterDark.
There’s a lot going on in Legal Operations, and the Blickstein Group has put out a Law Department Operations survey for over a decade. Brad gives some great insights into the relationship between in-house counsel and outside law firms. While there’s a big difference between the business operations of a company and those of a law firm, the attorneys tend to be cut from the same cloth. Groups like CLOC are positioned perfectly to help lawyers understand the roles they need to play to protect their organizations. Blickstein stresses that Legal Operations is a broad topic, and that CLOC is part of that movement, but is not all there is within the movement. There’s a lot going on, and the opportunities are pretty expansive these days. (12:00 mark)
Copyright is not something to LOL about. The Houston Independent School District was hit with a $9.2 million copyright verdict for copying study guides, even though the district had cleverly blocked out the warning on the guides that “copying of these materials is strictly prohibited.” Be careful out there when it comes to thinking it’s okay to copy and distribute materials that have copyright protection. It can cost you millions. (2:50 mark)
AI Sharecroppers. We all know that data is king these days, but not all data can be gathered automatically, at least not effectively. There is an underclass of laborers out there doing the work of gathering and identifying the data needed to power AI programs, a task known as “human labeling.” As the name “sharecropper” implies, they do a lot of work… but don’t make a lot of money. (5:25 mark)
Algorithm Problems Create Human Liabilities. We rely upon automation, AI, machine learning, and other technology to advance our society, but when those systems fail, it’s not the automation that takes the blame. It’s usually the human who happens to be around at the time. MIT Technology Review talks about how we have a 21st-century tech problem that’s being adjudicated under 20th-century morals and laws. (7:05 mark)
When you think of algorithmic governance, you may go right to things like predictive law enforcement, or risk assessments used to set bail or prison sentences in the criminal justice system. However, algorithms have a much broader application in the legal system, far beyond criminal justice. Drexel law professor Hannah Bloch-Wehba walks us through a number of examples of other areas in which algorithmic governance is being used: broad areas she labels as “typical poverty law settings,” such as welfare, Medicaid, and child protective services, and those areas continue to expand. Court systems, administrative law departments, and other government agencies are relying upon algorithms to handle larger and larger caseloads.
Algorithms, in and of themselves, are not inherently bad. In fact, they can be very helpful in streamlining processes and alleviating the burden on government agencies handling these issues. But are they fairer than what we have now? We don’t have a good way of demonstrating that. Professor Bloch-Wehba sees the overall effect of algorithms as creating a new playing field that is bumpy in different ways than the old one. There’s still a human element in algorithms, not just in their creation, but also in the acceptance of algorithmic outcomes by those tasked with applying them. Add to this the “black box” in which some algorithms live, and the fact that governments rely upon private industry to create these processes, leaving the government unable to explain how they work. Can governments give up their duty to be transparent in the name of algorithmic efficiency? How much will a democratic society tolerate from algorithms it may not fully understand, or trust?
We cover all of these questions and discuss Professor Bloch-Wehba’s upcoming Fordham Law Review article, “Access to Algorithms,” which will be published later this year. (10:35 mark)
Archive and Delete are not the same. Garry Vander Voort of LexBlog writes about a disturbing trend he is seeing in apps where you might think you are archiving a magazine or a podcast, but in reality, you’re deleting it. He has a few suggestions on how developers can use better descriptors, including some good ol’ library terms. (2:27 mark)