Transcript
Marlene Gebauer 0:05
Welcome to The Geek in Review, the podcast focused on innovative and creative ideas in the legal industry. I'm Marlene Gebauer.
Greg Lambert 0:12
And I'm Greg Lambert. So this week we are finally, finally getting to a topic that we started to cover back in, I think, early 2019, and that is the right to be forgotten. We had a number of events that kept our guest, Anne Klinefelter from the University of North Carolina School of Law, from being here. We promise we didn't forget about her.
Marlene Gebauer 0:36
Yeah, I'm glad we were finally able to make this interview work. We had a great discussion about the current state of the internet and the regulations around the globe, or lack thereof, that protect users. I really loved her answer to our crystal ball question of where we'll be in five years when it comes to data and privacy on the internet.
Greg Lambert 0:56
So stick around for that. But first up, we brought back Molly Huie from Bloomberg Law to talk about the 2022 Bloomberg DEI Framework and some of the changes that they put in the survey this year. So let’s go ahead and listen to Molly.
Greg Lambert 1:15
We'd like to welcome back Molly Huie, Team Lead for Data Analysis and Surveys at Bloomberg Law. Molly, welcome back to The Geek in Review.
Molly Huie 1:23
Thank you.
Marlene Gebauer 1:24
So Molly, we wanted to have you drop in and give us an update on what you're doing with the DEI Framework for 2022, now that it's live. But first, tell everyone what the DEI Framework is, for those who may not have listened the last time you were here.
Molly Huie 1:37
Fantastic, thank you. Our DEI Framework is a listing of law firms that meet or exceed a standard of diversity. We're hoping to standardize diversity reporting among law firms. And because we're publishing a list of firms that make the cut, it makes this usually opaque area much more transparent.
Greg Lambert 1:57
All right, so we brought you in for one particular reason, and that was to tell us what's different this year in the 2022 survey, and why you're adding or changing some of the things that you've been tracking.
Molly Huie 2:10
So it's still a beast of a survey; we are collecting everything. But among the new additions this year, we're starting to ask about neurodiversity. And I know we all talked about that a year or so ago, when the first version came out. We're not asking for numbers this year, but we're starting to ask those checkbox questions: is it something you're measuring? Do you have an affinity group for it? It's sprinkled in through a lot of different places, so we can start to see what firms are doing in this space.
Greg Lambert 2:35
Are you defining neurodiversity in a particular way?
Molly Huie 2:39
We are. We're basically saying, kind of broadly, that it's brains that function differently, with examples like dyslexia, autism spectrum, that sort of thing. So we've added that. We've also added a couple of specific questions about origination credit and about partnership tracking. Those should hopefully be really interesting and get us some really good data.
Marlene Gebauer 3:02
So Molly, remind everyone how firms can participate in the survey.
Molly Huie 3:07
So we have a project page, it’s at PRO.BLOOMBERGLAW.COM/DEI, which I think you all put in the show notes as well.
Marlene Gebauer 3:14
It will be in the show notes. Yes.
Molly Huie 3:15
And right at the top banner of that page, there's a button to submit your data. When you click on that button, you go to a contact form where you just put your name, your firm, and your contact details. And then I'll reach out and send a specific access link so that your firm can put all their data in our secure portal.
Greg Lambert 3:32
Do you have a goal for the number of firms that you would like to see participate this year?
Molly Huie 3:35
Last year, we had 28 firms make the list, and we had a handful more than that put in their data. I'm hoping we have 75 firms participate this year. That's a big stretch goal, and I'm putting it out there in the world. So, all right, people, come on.
Greg Lambert 3:49
All right, well, make sure that if you are responsible for that, or know someone at your firm who is, pass this along to them. Molly Huie from Bloomberg Law, thanks for dropping in and giving us an update on the DEI Framework. And hopefully we can bring you back, say, in October, when the results are out.
Molly Huie 4:10
Yeah, the more firms that submit data, the better data we're gonna have to share with everyone. All right, thanks.
Marlene Gebauer 4:15
Thank you.
Molly Huie 4:16
Thanks so much.
Marlene Gebauer 4:20
When the internet began back in the 90s, many of us thought this was going to be the great equalizer for the world and become a utopia of information sharing and learning. Well, it hasn’t quite worked out that way.
Greg Lambert 4:33
No, it hasn't.
Marlene Gebauer 4:34
This week's guest is an expert in data privacy across the globe, and she comes in to share what she's learned in her time teaching about it, both in Helsinki, Finland, under the EU laws, and here in the US.
Marlene Gebauer 4:49
We'd like to welcome Anne Klinefelter, Henry P. Brandis Distinguished Professor of Law and Director of the Law Library at the University of North Carolina School of Law. Anne, welcome to The Geek in Review.
Anne Klinefelter 5:01
Thanks for having me.
Greg Lambert 5:03
Well, Anne, we reached out to you, I think, way back in 2019 to try to get you on the podcast. And it's kind of funny, because we were talking about the topic of the right to be forgotten, and it feels like we got forgotten along the way here.
Marlene Gebauer 5:18
Almost forgot.
Greg Lambert 5:19
Almost forgotten.
Anne Klinefelter 5:19
A few things have happened between then and now.
Marlene Gebauer 5:22
A little bit.
Greg Lambert 5:23
That's what I hear. Well, one of the exciting things was, you know, that you were in Europe at the time, and then we had the pandemic of 2020. So I think we're okay to finally pick this back up a couple of years later.
Anne Klinefelter 5:36
That’s great.
Greg Lambert 5:38
So speaking of which, you know, you went to Finland, where you were able to teach at the University of Helsinki, how was that?
Anne Klinefelter 5:46
It was amazing. It was amazing, especially for a person who grew up in Alabama to go to a place that cold. That was a big adventure. So I know that you’re not interviewing me to tell you about sauna, but I’m happy to get into that later if you want to.
Greg Lambert 6:01
Absolutely.
Anne Klinefelter 6:03
I was in Finland for the fall of 2019. It was a Fulbright visit, co-sponsored by Nokia, the Finnish technology company. I taught US privacy law at the University of Helsinki law school, and I conducted research, interviewing policy makers for Helsinki libraries to find out how they comply with the European Union's Right to be Forgotten, or as they now call it, the Right to Erasure.
Greg Lambert 6:33
Erasure? That sounds more permanent.
Anne Klinefelter 6:36
It does, doesn't it? Yeah. It sounds less human, like it's a tool. There are many different theories, believe it or not, about forgetting versus erasing, but they're really the same thing under the law.
Greg Lambert 6:48
Okay, just out of curiosity, I’m assuming since you’re a person from Alabama, you did not speak Finnish.
Anne Klinefelter 6:57
Good point. Yeah. I was actually surprised by how great a fit this trip was, because it was not on my radar. One of my colleagues in privacy law, Stuart Brotman, who's a professor at Tennessee, recommended it to me; he had held the position before. And I kept learning more and more. First of all, to your question, they speak English; almost everyone in Helsinki speaks English. And Finnish is a most unusual language for those of us who come from an English background; it is unlike a lot of the languages we might be more familiar with. I also found out that the Finns love technology. This is the home of Angry Birds and of Nokia phones, which, you know, to their detriment were indestructible, so you didn't ever need to replace them. But now the company is doing all kinds of exciting things with smart cities. Anyway, it's a thriving Finnish company, emblematic of their fascination with innovation and design, and a big startup culture. But they also really love libraries. As you can imagine, part of the year it's cold, so it's really great to go inside a library, where there are lights and books. They're a very introspective people. And so you might be surprised that, in honor of the 100-year anniversary of the founding of their country, what they chose to do in Helsinki was build a grand, modern public library. It is stunning. And it faces parliament; it's a really beautiful place to be. And their education system is top notch. Plus, I kept learning about the University of Helsinki and found that they had a lot of scholars who do work in areas that I'm interested in, data protection and privacy. So it was a great fit for me. And I was lucky: my husband works for a company that has an office in Helsinki.
Greg Lambert 9:13
You didn't need to leave him behind? That's good.
Marlene Gebauer 9:18
All right, well, let's turn our sights to the internet for a second. Back in 1994, I think many of us looked at the internet as the great equalizer. It was, you know, sort of a potential utopia of thoughts and ideas and communication. Now, in 2022, it's kind of a hot mess, right? We need it, we use it, it's part of our everyday life, and we sure couldn't have been as efficient during the pandemic as we have been if we didn't have it. But it's definitely not utopia. So let's start off with the biggest question that I can think of: where did we go wrong?
Greg Lambert 9:59
Let’s start off with an easy one.
Anne Klinefelter 10:00
Not an easy question. First, you know, a lot of the internet is still pretty great; there's so much content. But the economic model is surveillance capitalism, and that's not pretty. It's just true that mostly, on the internet, you pay with privacy. Sometimes you pay with money too. And we might have continued with advertisements that just match the content of what you were looking at and engaging with, but instead we have advertisements that specifically target the individual. We're now overwhelmed by the invisibility of what's happening to us. As we are tracked, data is collected across platforms and devices, and algorithms are applied to create profiles that are used in so many different ways we don't even understand. It may limit what we see; it may provoke or manipulate us; and it's all pretty much happening without our understanding of how it comes to be. So that's it, it's the economic model. You know, people defend the privacy payment system as egalitarianism, because poor people could not pay for internet services while the wealthy could. That's the argument. But if everybody were served contextual ads, that would be egalitarianism too. And I would say that the profits are not distributed in an egalitarian way. So it's hard to say that the way we've set it up is to be cherished because of its egalitarian qualities. People also defend the payment-in-privacy system as a way to respect consumers who can make their own choices. That's how we've set up a lot of privacy law as well: you chose to send that information out into the ether, and yes, it got used in ways you could never imagine, but that was your choice, and you are an adult, and we're not going to interfere with that. There are a lot of studies, though, that suggest that this is,
Marlene Gebauer 12:24
except you can’t get what you want without giving it up.
Anne Klinefelter 12:28
Yeah, you don't. It's hard to say you're really making a choice.
Marlene Gebauer 12:32
I have a follow-up question, and it's somewhat related. When you see the cookie notices come up now, and you go in there and you change all of the permissions, how effective is that, really?
Anne Klinefelter 12:46
Well, that remains to be seen. We have to assume that that is exercising our choice and that we are actually making an impact. But there are so many other ways besides cookies to track people; that's just one technology. This is what happens in privacy regulation: we end up with these very specific provisions and protections, and people get excited and think they have control, and then it reduces the chilling effect on the use of all of these tools. And then, you know, it's like a game of leapfrog. There will be a new tracking technology, whether it's some kind of pixel tracking or any number of ways to take what remains besides the cookie information and pair it with other data that might create a rich profile of you. So it's not nothing. I am not one of those people who says, well, get over it, you don't have any privacy anyway. I don't think it's a dichotomy. There are so many metaphors, the horse out of the barn. Sometimes you can bring that horse back, you know, or maybe there's something else left in the barn.
Greg Lambert 14:04
So I still want to know how they know I need some kind of kitchen utensil that I've just talked to my wife about, and all of a sudden it shows up on my Facebook page?
Marlene Gebauer 14:12
Or, I just think about it.
Anne Klinefelter 14:15
That is the way people walk around, wondering: I don't understand, how is this happening? And, you know, there are so many collectors of data about us, depending on how many kinds of devices you have. A smart home kind of device gets a lot of information; depending on what you've enabled on your cell phone, there are all these different things. Now, some companies don't want to share, so it's not as if those pieces of information are aggregated all the time, but a lot of it is. There's not a huge amount of transparency, though. So we do rely on a lot of scholars and technologists who work to sort of reverse engineer, explore, and test, to tell us. Journalists are doing great work to try to help us understand this. Even Consumer Reports is in on it now; they're evaluating products through a privacy lens.
Greg Lambert 15:15
Yeah, it's amazing. Every once in a while you find a company that's gotten caught doing something that they said they weren't doing, or that they were hiding. So it's really kind of interesting.
Anne Klinefelter 15:27
That's a really good point, Greg. I mean, that would actually trigger consumer protection law, because deception is one of the authorities given to the Federal Trade Commission. They can investigate and bring actions against companies engaged in trade who have deceived us, who promised something and failed to live up to that promise. That is really the cornerstone of modern privacy law in the United States.
Greg Lambert 15:59
Okay, well, speaking of data privacy laws: governments, through bodies such as consumer protection agencies, have tried for years to step in with regulations on data privacy. But one thing you have to deal with is that the internet itself is vast and global. Can you walk us through some of the attempts at regulating privacy by different governments across the world, and how effective some of those have been?
Anne Klinefelter 16:27
Well, I can speak to some. You're absolutely right that there are some challenges, because while law works country by country, or even in large jurisdictions like the European Union, or the United States, the internet is ambient. It just does not respect jurisdictions very well. There are some ways, but I would say it's not as if we've done a great job of trying to regulate it in the United States. You do hear a lot about data privacy, but it certainly has not resulted in any kind of broad, uniform federal law in the United States, although that is being debated right now. The European Union, and some other parts of the world, have taken steps to enact broad data privacy regulation. And over here, we have states doing a lot of innovation in data protection law. California, Virginia, and Colorado have passed laws limiting the collection, use, and retention of personal information. Illinois has a biometric data protection law. And we have a lot of narrow statutes, because in the United States, you know, we don't want to pass a law until we absolutely have to. We are not a nation of people who love laws, right?
Greg Lambert 17:58
Yeah. We don’t want to strangle business.
Anne Klinefelter 18:01
Bingo!
Marlene Gebauer 18:01
Not unless we absolutely have to.
Anne Klinefelter 18:03
That's right.
Marlene Gebauer 18:03
Look how much money they’re making, we don’t want to interrupt that.
Anne Klinefelter 18:06
So we do things like the Video Privacy Protection Act, right? Or, larger than that, HIPAA. But even HIPAA is really fairly narrow, if you think about it: it only applies to health care providers and insurers, and the clearinghouses that support them. If you have a health app on your phone, or something like that on a device, it may not be covered at all by HIPAA. So we have a lot of gaps in our system. But you're absolutely right about the internet eluding a lot of these attempts, even when we do make them. And that's exactly what I was writing about when I got back from Finland; I got to do some writing with a PhD candidate from the University of Helsinki. We were looking at a case decided while I was there by the European Court of Justice. France was asserting that the right to be forgotten must be enforced around the world, and Google said, no, you can't require that. And the court agreed with Google, actually, because Google was doing a kind of good-enough, bring-the-horse-back-in-the-barn geolocation limitation, restricting the delisting in a particular case from France to just people in France.
Greg Lambert 19:30
Yeah, I guess that makes sense, in that it's the law of a certain country. I guess Google's argument was, as long as we're following it on our google.fr.
Anne Klinefelter 19:41
That's right. That's right. So they started with that, and then they decided to check things like IP addresses, to see where people are coming from, because the problem was, of course, that a lot of people in Europe can speak multiple languages and would be happy to use other versions of the search engine. And that was thwarting the right of the person who was seeking not to have things about them, things protected under the European law, come up in a search on their personal name. So you're absolutely right. But the truth is, the court said the delisting could have applied throughout Europe, but it didn't. And that's because, frankly, privacy has to compete with other interests, not only things like all the fun on the internet, but rights like freedom of expression. In Europe, they think of privacy and data protection as human rights on par with expression, and so each country might strike a different balance between those.
Greg Lambert 20:45
Yeah. That would be hard to regulate, and hard to implement, I would think.
Anne Klinefelter 20:50
Yes, you got it
Marlene Gebauer 20:51
Do you think that the US will ever get to a point where they do something like the Right to Erasure?
Anne Klinefelter 20:57
Wow, that's a great question. We have a few laws already that have a component of the right to erasure, but in that very narrow sense of how we do most of our privacy law. California, for example, has a law for minors: if they posted something on a platform when they were a minor and regret it later, they can contact the platform to have it removed. And you can imagine how that would be insufficient if the information has traveled elsewhere, or maybe the platform is gone. There are all kinds of issues, but it's not nothing. And we have bankruptcy; that information does expire from credit reports after a certain number of years, so that's like a right to be forgotten. And then we have a tort claim that sometimes is used to prevent publication of information that might be stale, we'll call it, because the notion is that people need a way to rehabilitate themselves or move on. We have a tradition in our country, going back a long way, of the fresh start.
Greg Lambert 22:11
Yeah. Well, one of the things that caused me to reach out to you is a friend of mine, who is not a librarian, who brought up a question. I think she kind of confused archivists and librarians sometimes. But, you know, librarians are kind of known for being able to collect information, to store information, to share information. And so when governments come in and say, we want to be able to essentially erase certain pieces of information, even when they have the best of intentions, does that cause the hair on the back of your neck to raise up sometimes, and make you dive in and find out: is this actually legitimate? Are there going to be some unintended consequences? What's been your reaction?
Anne Klinefelter 23:04
Well, that's actually the basis of my application to go to Finland: to explore that very conflict. If you think about it, libraries, many of them, do archival work, and archives especially are committed to exactly what you describe; they've been described as memory institutions. So, the right to be forgotten versus memory institutions, coming to a theater near you. A big conflict, you can see. So yes, it makes the hair on the back of my neck stand up. And this is the struggle that I have as someone who teaches privacy law and is also a librarian; I'm sort of torn. But trying to balance those things and find a way forward is really important. The way this evolved in archives, for years before we had these rights to be forgotten, was the notion that the right to be forgotten, which is really describing when information gets stale, might spring into action after a certain period of time, right? That's the forgetting part. But the librarians and archivists will say, well, that's fine, but you've got to look forward 100 years from now; that's when we want it. That's when the right to be remembered comes back into action, 100 years later. So protect it, and keep it under lock and key, which is really hard. And then we want to have it, because it's history. It is the story of all of us, because we really aren't just individuals; we are also social creatures, as we've been learning in the pandemic, missing each other. So that is a big issue. In Europe, you'll be glad to know, the librarians and archivists did mobilize, in the way that you would expect a good set of librarians to mobilize. And there is an exception for archival purposes under the right to erasure. There's also an exception for balancing freedom of expression. So, as you mentioned, hard to implement, but those things are in there. So most of the libraries there were not worried about this.
Greg Lambert 25:26
Yeah, I've got a button somewhere around here from Donald Rumsfeld calling us "radical militant librarians." One of my favorite quotes.
Anne Klinefelter 25:36
Yeah. Yeah, we are quiet, but we can put together a big organizational effort, can't we? So I think that would be true here, should something come. And another thing that happens in the US is that even when there's not something written into law, sometimes courts will say, hey, this violates the First Amendment, so we're going to have to accommodate that. That's the way we got fair use in copyright: it came from a court originally, as a way to accommodate the First Amendment.
Marlene Gebauer 26:08
All right, so I'm going to move away from the courts, and I'm going to go into the metaverse. Oh, boy.
Anne Klinefelter 26:13
I'm not sure I'm in the metaverse, I don't think.
Marlene Gebauer 26:17
Yet. You know, how do new features, new platforms like the metaverse, create new problems when it comes to an individual's privacy rights on the internet?
Anne Klinefelter 26:30
Well, that is where all the fun comes in in privacy law: you have to take laws that were created, a lot of them, in the 1970s and apply them to new technologies and contexts, like the metaverse. But some of the laws are more recent. I would say the Illinois law about biometrics is likely to be a barrier; it requires consent for a lot of the face recognition and biometrics that underlie a lot of the AI that is part of the metaverse. And then we also have the Federal Trade Commission's consumer protection authority to look and see whether some secondary uses, as I like to say, meaning something you didn't expect is happening to the data that you allowed to be collected, might be unfair. They might even be deceptive. And we're going to be looking at how the AI might provoke concerns like that, because with all the data that's collected, it's astounding how many conclusions you can draw about people from things that we think are innocuous. The tools can derive some very sensitive conclusions about your health, your religion, all kinds of things you would think, oh, no, that's not obvious. But it's derived. There's also the problem of discrimination, because we've seen some of these face recognition algorithms result in discrimination. That's actually a live topic right now. So I think there will be some struggles and some pushback; there will be some folks pressing back as the metaverse moves forward. We'll see.
Greg Lambert 28:32
Yeah, I've heard, and this is all anecdotal, about things like being able to track somebody's mouse movements, or imagine, with an eye visor, an Oculus kind of thing, that you can track eye movements or head movements and get actual, real health information on these folks. And I've heard people warn that there are people looking into how to do that tracking without people actually knowing they're being tracked. So I would say the more you connect yourself, and especially when you physically connect yourself, the more private information you're going to give out, with or without knowing. Just my anecdotal talk there.
Anne Klinefelter 29:18
That's right. You know, they're tracking those eye movements now in test-taking software, too.
Marlene Gebauer 29:25
Well, this kind of touches on my final question. We've been talking about sort of what's out there, so this is what we call our crystal ball question. We ask guests to look into their crystal ball and peer into the future for us. What kinds of changes or challenges do you see over the next five or so years when it comes to privacy, and whatever the internet turns into at that point?
Anne Klinefelter 29:52
Well, since you suggested that in '94 we were supposed to think there was utopia on the internet, I'm going to go with utopia. And here's how we're going to get to utopia: first, things are going to get worse, right? It might be necessary for things to get worse, and obviously worse, in order for utopia to emerge. People will start to see, say, the applicant screening tool that discriminated against them when they were trying to get a job, and make some noise. Or: I did not give you permission to use my image for your biometrics database, and I'm going to make some noise. And sadly, weak privacy or data security will cause more infrastructure disruptions, financial harm, and even more violence, as angry people can, you know, get home addresses for people and do damage, and maybe even be stirred up into more damaging activity. A lot of these things are already happening; I expect more. And then there will be a sense of a bounce, I think, for privacy. All this time, I think we're already starting to see, and we'll see more of, privacy enhancing technologies, we like to call them PETs, and policies. So people will adopt more PETs. And it's not just DuckDuckGo and VPNs; that's just the tip of the iceberg. There'll be AI tools to anonymize video, differential privacy techniques, all kinds of ways to gain utility from datasets without the risk of identifying individuals, and services for collecting the data that businesses hold about you. I mean, these are things you could do now, but they're very time consuming and cumbersome. I think there will be a lot more of this coming into play. And there will also be companies who care about this; they think it's part of their business model and their reputation, and so they'll be investing in it too.
And then my third factor in the emergence of utopia is that somebody, and I don't know who, but somebody who's made obscene amounts of money from this surveillance capitalism economy, is going to choose to create privacy hubs at local libraries across the country. Just like Andrew Carnegie, who built libraries across the US distributing his railroad and steel wealth, we're going to see some people who are modern beneficiaries of these new industries actually take up privacy and promote it through the local library.
Greg Lambert 32:52
Like Jeff Bezos terminals?
Anne Klinefelter 32:57
Well, you know, libraries can be privacy intermediaries for people, who can use the library's IP address and passwords. And librarians can give guidance to help people get access to the information they need for jobs, education, and just fun. So I think that's how we're going to get there. It's going to be utopia. It's going to get bad first.
Greg Lambert 33:22
I hope to see you on the other side of this dystopia, in the utopia. Yeah, we're gonna...
Anne Klinefelter 33:30
Talk again in five years.
Greg Lambert 33:31
Yes. Anne Klinefelter from the University of North Carolina School of Law, thank you very much. I hated that it took us two years to get you on, but I'm so happy that we got you.
Anne Klinefelter 33:43
It was a pleasure. Thanks.
Greg Lambert 33:47
Well, it was good to finally, finally get Anne Klinefelter on the show. I mean, if you're going to forget about something, this is probably the one to do it with. But again, we promise we didn't forget. I really have to point out what she said there at the end about the internet, and her view that it will get better, that it may actually become the utopia we hoped for, but that it's going to get worse first. Which I think is probably spot on. And while I appreciate her enthusiasm for the next group of Andrew Carnegies out there to come in and use the public library system as these kinds of safety zones for internet privacy, I don't know that I'm holding my breath for the bank of Jeff Bezos terminals at the local public library. But, you know, fingers crossed. Maybe that will happen. So thanks again to Anne Klinefelter from the University of North Carolina School of Law for coming in and finally talking to us.
Marlene Gebauer 34:59
Thank you, and glad we could make it work. So thanks to all of you for taking the time to listen to The Geek in Review podcast. If you enjoyed the show, share it with a colleague. We'd love to hear from you, so reach out to us on social media. I can be found at @gebauerm on Twitter,
Greg Lambert 35:13
And I can be reached at @glambert on Twitter,
Marlene Gebauer 35:17
Or leave us a voicemail on The Geek in Review Hotline at 713-487-7270. And as always, the music you hear is from Jerry David DeCicca. Thank you, Jerry.
Greg Lambert 35:26
Thanks, Jerry. Alright, Marlene, I’m gonna go find those safety terminals at the public library.
Marlene Gebauer 35:33
Go for it!