algorithmic accountability

When you think of algorithmic governance, you may go right to things like predictive law enforcement, or risk assessments used to set bail or prison sentences in the criminal justice system. However, algorithms have a much broader application in the legal system, far beyond those criminal justice contexts. Drexel law professor Hannah Bloch-Wehba walks us through a number of examples of other areas in which algorithmic governance is being used: broad areas she labels as “typical poverty law settings,” such as welfare, Medicaid, and child protective services, and those areas continue to expand. Court systems, administrative law departments, and other government agencies are relying upon algorithms to help with larger and larger caseloads.
Algorithms, in and of themselves, are not inherently bad. In fact, they can be very helpful in streamlining processes and alleviating the burden on the government agencies that handle these issues. But are they fairer than what we have now? We don’t have a good way of demonstrating that. Professor Bloch-Wehba sees the overall effect of algorithms as creating a new playing field, one that is bumpy in different ways than the old one. There is still a human element in algorithms, not just in their creation, but also in the acceptance of algorithmic outcomes by those who are tasked with applying them. Add to this the “black box” in which some algorithms live, governments’ reliance on private industry to create these processes, and the resulting inability of those governments to explain how they work. Can governments give up their duty to be transparent in the name of algorithmic efficiency? How much will a democratic society tolerate from algorithms it may not fully understand, or trust?
We cover all of these questions and discuss Professor Bloch-Wehba’s upcoming Fordham Law Review article, “Access to Algorithms,” which will be published later this year. (10:35 mark)

Listen on mobile platforms: Apple Podcasts | Overcast | Spotify

Information Inspirations
Archive and Delete are not the same. Garry Vander Voort of LexBlog writes about a disturbing trend he is seeing on apps where you might think you are archiving a magazine or a podcast, but in reality, you’re deleting it. He has a few suggestions on how developers can use better descriptors, including some good ol’ library terms. (2:27 mark)

Continue Reading Ep. 39 – Hannah Bloch-Wehba on Who is Governing the Algorithms?

[Ed. Note: Please welcome guest blogger, Casandra M. Laskowski from the FirebrandLib blog. Cas is a Reference Librarian and Lecturing Fellow at Duke University School of Law, and a total geek – so she fits in well here! I was happy that she reached out to talk about how UX design facilitates discrimination and inhibits legal tech from achieving ethical goals. – GL]

In 2015, Google faced a scandal over the image-tagging feature in its photo app. Despite Google’s promise to “work on long-term fixes,” Wired revealed earlier this year that the “fix” was simply to remove gorillas from the image recognition system, and the system remains blind to them to this day. This chain of events seems almost cliché at this point: software is released with a predictably offensive or harmful flaw, the company expresses shock and shame, and a hasty patch is made that fails to address the root cause of the issue.

Continue Reading Legal Tech Needs to Abandon UX