© 2026 Milwaukee Public Media is a service of UW-Milwaukee's College of Letters & Science
Tech policy expert weighs in on Milwaukee facial recognition tech debate

Facial recognition technology tracks faces on a busy sidewalk full of people. (Leszek Lata / Stock Adobe)
The Milwaukee Police Department has stopped using facial recognition technology for now. Katie Kinsey of NYU's Policing Project weighs in.

There’s been a lot of talk lately about the Milwaukee Police Department's use of facial recognition technology. It all came to a head at a Feb. 5 Fire and Police Commission meeting, when MPD confirmed they had been using the technology without a policy or the public’s knowledge.

Less than 24 hours later, Milwaukee Police Chief Jeffrey Norman issued a moratorium on further use of the technology. The ban will continue at least until MPD develops a formal facial recognition technology (FRT) policy with the Common Council and Mayor's office.

"Despite our belief that this is useful technology to assist in generating leads for apprehending violent criminals, we recognize that the public trust is far more valuable," Norman said in an email to city officials.

Katie Kinsey presented at the Fire and Police Commission meeting on Feb. 5. She’s chief of staff and tech policy counsel at the Policing Project at NYU, and she spoke with WUWM’s Jimmy Gutierrez about concerns surrounding FRT.

Listen to the full conversation with Katie Kinsey here.

This interview has been edited for length and clarity.

Jimmy Gutierrez: What are the concerns around facial recognition technology and what are you all finding?

Katie Kinsey: I think sometimes people don't know that there are law enforcement agencies that have been using FRT for over 20 years. Certainly, the technology has advanced in that time. The facial recognition of 20 years ago might not look exactly the same as today. But it's an AI-powered technology that agencies have used for a while.

I think our biggest concerns are that it can be hard to even make a recommendation — even as an expert who's concerned with the civil liberties and civil rights impact of this technology. It can be hard to tell communities that the benefits of this technology outweigh the costs, because there has been so little transparency around how law enforcement has used it.

We don't have good records to say, "Here's the evidence of the number of cases in which it's been used and the outcome of those cases — whether more arrests were made or not, whether it affected certain demographics more or not." We have anecdotes, thanks in part to a lot of good investigative reporting that will tell us things like, "Well, we found false arrests in a certain number of cases because of the way law enforcement used the tool in this particular application."

What we would want in a democratic society are laws that require transparency and a kind of cost-benefit analysis around use, so that we can actually make good decisions about what technologies keep us safe, which ones don't and what kind of guardrails we want to have in place to protect our democratic values. And I think right now we are often operating on a pretty empty record. That's just not a good way to make policy, and it's not a good way to make people safe.

As you said, FRT is something that's been in use for 20-plus years — which people may not even know about. How should police departments engage with communities around this technology?

There's not one answer to that question. I think it depends on the relationship that an agency has with the community. But I do think you can start with police being more closely connected to the communities that they serve and even asking community members where they're seeing major problems and public safety concerns.

It's important that police departments are listening to those folks more than they are to tech vendors or even agencies in other jurisdictions with stories about how useful the tool was in their neighborhood. That neighborhood might not have the same needs as your community.

It's important to take into account the local community's expertise regarding what needs to be kept safe, where they're feeling there are gaps, and what they are concerned about. That's a good starting place.

These are sophisticated technologies, and certainly we think they can have public safety benefits in certain situations. But unfortunately, the kinds of stakeholders that are involved in these conversations tend to be tech vendors themselves. They aren't subject to the sort of regulation that would make them as transparent as they should be, I would say, regarding situations in which their products do or don't work well.

I think engaging with academic institutions and experts in those communities can help provide even just the right questions to ask vendors to prove that these tools work. Make sure they're telling you what data they're using to train their models. Make sure they're showing you why the model that they're developing is going to work on the data that the police department is actually using.

Then, I think it's about auditing and reporting on these tools and using things like pilots. You can say, "We're going to deploy this tool in a limited area for a limited amount of time; we're going to track how it's used; we're going to see whether we're getting a return on our investment; we're going to see how the community feels about it after we've used it for a certain amount of time." And then you can use that limited, time-bound pilot to make sure it's working for both law enforcement and the community.

Infamously, Milwaukee is one of the most deeply segregated cities in the nation. While many community members don't have experience with FRT, they may have experience with MPD, and perhaps with being racially profiled. So, for folks who aren't as familiar with FRT but who are community members who would be affected, what's something they should know if they're not following this issue as closely?

I think one thing to know is that, if you're in a medium to large-sized city in this country, your policing agency is probably using some sort of surveillance technology. And I think what you should know is that the lawmakers in your community — your city council, your state legislators — they have the ability to make sure that those law enforcement agencies are more transparent with the public.

There's been a bit of a movement for agencies to have to develop use-case inventories. The federal government actually requires this of federal agencies, not just law enforcement, to say, "Here are the tools we're using, here's what we're using them for, and we've got a list for you up on our website." That's a good place to start.

You can do that, in most cases, under state law and under city law and start to get that transparency. Those democratic lawmakers are responsive to you as the public. Those democratic lawmakers have the ability in most cases to regulate the policing agencies in their jurisdiction. There is actually power in advocacy and collective action to demand that lawmakers — I would say — do their job and regulate policing agencies like the public agencies that they are.

I know a lot of people are concerned about this data and information being shared with ICE at the moment. What do we know about the porousness of this kind of community information making its way into the hands of the federal government? 

Yeah, I think that's a concern that's certainly becoming more concrete with the way we're seeing FRT expose people to enforcement by ICE in these really difficult situations. I wish I had something more heartening to say about it. I think, you know, there certainly are data protection best practices, data governance best practices around things like deletion and minimization around collection. These are all things that can help make sure there's less data for other entities like ICE to harvest.

Unfortunately, I think our data governance picture in this country is really behind the curve, so it's hard to say. A local community that's working with a national vendor might say, "We're just creating this database for our community, and we have a policy where we're not going to share it with this federal agency." Well, that national vendor might have something in their contract terms that says something very different — and ICE might get it that way.

So, I think it's a matter of really having to be careful about the data that you're giving up to the vendors that you're working with and the data you're keeping on-hand so that it's accessible. I think that collection and storage and the way in which data can cross boundaries so quickly among jurisdictions is one of our biggest concerns, just institutionally. It's something that we think doesn't get quite as much attention as the particular tools themselves. But as we're seeing now, data can be the lifeblood of some of these really concerning enforcement actions taken by regimes that might not be expressing the democratic values that we hold dear.


Graham Thomas is a WUWM digital producer.