Brad Blickstein, Principal at the Blickstein Group, a research and advisory firm serving both in-house law departments and outside law firms, joins us to talk about legal operations and his recent experiences at the 2019 CLOC Institute in Las Vegas. As with many great conferences, the programming between 9 AM and 5 PM is good… but the conversations from 5 PM to 9 PM (or 5 AM, this was Vegas) are what make the gathering really special. We’re calling it #CLOCAfterDark.
There’s a lot going on in Legal Operations, and the Blickstein Group has published its Law Department Operations Survey for over a decade. Blickstein offers some great insights on the relationship between in-house counsel and outside law firms. While business operations look very different inside a company than inside a law firm, the attorneys tend to be cut from the same cloth. Groups like CLOC are positioned perfectly to help lawyers understand the roles they need to play to protect their organizations. Blickstein stresses that Legal Operations is a broad movement, and that CLOC is part of it, but not the whole of it. There’s a lot going on, and the opportunities are pretty expansive these days. (12:00 mark)

Listen on mobile platforms: Apple Podcasts | Overcast | Spotify

Information Inspirations
Copyright is not something to LOL about. The Houston Independent School District was hit with a $9.2 million verdict for copyright infringement after copying study guides, even though the guides carried a warning that “copying of these materials is strictly prohibited” — which the district cleverly blocked out. Be careful out there when it comes to thinking it’s okay to copy and distribute copyrighted materials. It can cost you millions. (2:50 mark)
AI Sharecroppers. We all know that data is king these days, but not all data can be gathered automatically, at least not effectively. There is an underclass of laborers out there doing the work of gathering and identifying the data needed to power AI programs, a task known as “human labeling.” As the name “sharecropper” might imply, they do a lot of work… but don’t make a lot of money. (5:25 mark)
Algorithm Problems Create Human Liabilities. We rely upon automation, AI, machine learning, and other technology to advance our society, but when those systems fail, it’s not the automation that takes the blame. It’s usually the human who happens to be around at the time. MIT Technology Review talks about how we have a 21st-century tech problem being adjudicated under 20th-century morals and laws. (7:05 mark)



When you think of algorithmic governance, you may go right to things like predictive law enforcement, or risk assessments used to set bail or prison sentences in the criminal justice system. However, algorithms have a much broader application in the legal system, far beyond criminal justice. Drexel law professor Hannah Bloch-Wehba walks us through a number of other areas in which algorithmic governance is being used: broad areas she labels “typical poverty law settings,” such as welfare, Medicaid, and child protective services, and those areas are continuing to expand. Court systems, administrative law departments, and other government agencies are relying upon algorithms to help handle larger and larger caseloads.
Algorithms, in and of themselves, are not inherently bad. In fact, they can be very helpful in streamlining processes and alleviating the burden on government agencies handling these issues. But are they fairer than what we have now? We don’t have a good way of demonstrating that. Professor Bloch-Wehba sees the overall effect of algorithms as creating a new playing field that is bumpy in different ways than the old one. There is still a human element in algorithms, not just in their creation, but also in the acceptance of algorithmic outcomes by those tasked with applying them. Add to this the “black box” in which some algorithms live, and the fact that governments rely upon private industry to build these processes, leaving them unable to explain how the systems work. Can governments give up their duty to be transparent in the name of algorithmic efficiency? How much will a democratic society tolerate from algorithms it may not fully understand or trust?
We cover all of these questions and discuss Professor Bloch-Wehba’s upcoming Fordham Law Review article, “Access to Algorithms,” which will be published later this year. (10:35 mark)

Listen on mobile platforms: Apple Podcasts | Overcast | Spotify

Information Inspirations
Archive and Delete are not the same. Garry Vander Voort of LexBlog writes about a disturbing trend in apps where you might think you are archiving a magazine or a podcast, but in reality, you’re deleting it. He has a few suggestions for how developers can use better descriptors, including some good ol’ library terms. (2:27 mark)

