Brad Blickstein, Principal at the Blickstein Group, a research and advisory firm serving both in-house law departments and outside law firms, joins us to talk about legal operations and his recent experiences at the 2019 CLOC Institute in Las Vegas. As with many great conferences, the programming between 9 AM and 5 PM is good… but the conversations from 5 PM to 9 PM (or 5 AM; this was Vegas) are what make the gathering really special. We’re calling it #CLOCAfterDark.
There’s a lot going on in Legal Operations, and the Blickstein Group has published its Law Department Operations survey for over a decade. Blickstein offers some great insights into the relationship between in-house counsel and outside law firms. While there’s a big difference between the business operations of a company and those of a law firm, the attorneys tend to be cut from the same cloth. Groups like CLOC are perfectly positioned to help lawyers understand the roles they need to play to protect their organizations. Blickstein stresses that Legal Operations is a broad topic, and that CLOC is part of the movement but not all of it. There’s a lot going on, and the opportunities are pretty expansive these days. (12:00 mark)
Copyright is not something to LOL about. The Houston Independent School District was hit with a $9.2 million judgment for copyright infringement after copying study guides, even though the district cleverly blocked out the warning on the guides that “copying of these materials is strictly prohibited.” Be careful out there when it comes to thinking it’s okay to copy and distribute copyrighted materials. It can cost you millions. (2:50 mark)
AI Sharecroppers. We all know that data is king these days, but not all data can be gathered automatically, at least not effectively. There is an underclass of laborers out there being used to gather and identify the data needed to power AI programs, doing work known as “human labeling.” As the name “sharecropper” might imply, they do a lot of work… but don’t make a lot of money. (5:25 mark)
Algorithm Problems Create Human Liabilities. We rely upon automation, AI, machine learning, and other technology to advance our society, but when those systems fail, it’s not the automation that takes the blame; it’s usually the human who happens to be around at the time. MIT Technology Review discusses how we have a 21st-century tech problem that’s being adjudicated under 20th-century morals and laws. (7:05 mark)