Ethics of AI
Angela Glover Blackwell in conversation with Rediet Abebe and Terrence Wilkerson
Artificial intelligence and algorithms are increasingly used to make life-changing decisions in policing, lending, hiring, renting, health care, and many other realms. The technology has come under fire for encoding and intensifying racial bias. But what if AI could be transformed into a tool for fighting discrimination and inequality? Host Angela Glover Blackwell discusses this intriguing possibility with Black in AI co-founder, activist, and computer scientist Rediet Abebe. We also hear the story of Terrence Wilkerson, who was unjustly trapped in the criminal legal system by questionable AI technology.
Rediet Abebe is an Assistant Professor of Computer Science at the University of California, Berkeley and a Junior Fellow at the Harvard Society of Fellows. Abebe holds a Ph.D. in computer science from Cornell University and graduate degrees in mathematics from Harvard University and the University of Cambridge. Her research is broadly in algorithms and artificial intelligence, with a focus on equity and distributive justice concerns. As part of this research agenda, Abebe co-founded and co-organizes the Mechanism Design for Social Good (MD4SG) initiative and is serving as a Program Co-chair for the inaugural ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (EAAMO '21). Her dissertation received the 2020 ACM SIGKDD Dissertation Award and an honorable mention for the ACM SIGecom Dissertation Award for offering the foundations of this emerging research area. Abebe's work has informed policy and practice at the National Institutes of Health (NIH), the Ethiopian Ministry of Education, and the United Nations Food Systems Summit. Abebe also co-founded Black in AI, a non-profit organization tackling equity issues in AI. Her work is influenced by her upbringing in her hometown of Addis Ababa, Ethiopia.
Terrence Wilkerson is a family man and the proud father of four daughters. Born and raised in the Bronx, Terrence first encountered the criminal legal system at a young age. He was twice wrongfully accused of crimes he did not commit: once at age 19 and again at age 40. He now shares his story with the goal of changing the system that harmed him and his family.
*Terrence is pictured with his attorney.
Angela Glover Blackwell: (00:06)
Welcome to the Radical Imagination podcast, where we dive into the stories and solutions that are fueling change. I'm your host, Angela Glover Blackwell. Every single day, artificial intelligence and algorithms are used to make life-changing decisions on behalf of people. Yet these formulas face little oversight. From discriminatory hiring systems to predictive policing, criminal sentencing, and access to loans, algorithms often exacerbate existing biases and inequality in our society. In today's episode, we look into the impact that AI is having on people's lives, and the brave efforts led by Black women computer scientists and activists who are working to transform algorithms into tools that help fight inequality and inequity instead. For more on this, we're joined by Rediet Abebe. She's an assistant professor of computer science at the University of California, Berkeley and co-founder of Black in AI. Rediet, welcome to Radical Imagination.
Rediet Abebe: (01:08)
Thank you so much for having me. It's great to be here.
Angela Glover Blackwell: (01:11)
Give us some examples of how algorithms can impact people's lives.
Rediet Abebe: (01:15)
You know, in the public sphere, which is the space where I spend a lot of time, education is obviously big, and the assignment of students to schools is one example. But it impacts people's lives in housing, too. If you're applying to rent an apartment, say in New York City, your landlord could be using an algorithm to decide whether you're going to get that apartment or not. It's used in credit scores. Of course it's used in job applications; we know that people are using algorithms for resume screening. It's used in so many different ways in criminal justice. It's being used to investigate. It's being used to police; there's predictive policing that happens. There's risk assessment tools to decide whether someone's going to be let out on bail. It's being used to investigate, like facial recognition. It's also being used to prosecute.
Rediet Abebe: (02:05)
There are algorithms that are being used to actually decide whether someone is guilty or not. And they're being used outside of their original scope, and so they are sometimes getting things wrong. As we speak, there are people in prison who did not commit the crimes that the algorithm, you know, falsely directed us into thinking that they did. And so you're in this situation where people could end up being wrongfully convicted for crimes that they didn't commit because some algorithm, some software, is being used outside of the original scope for which it was planned. To me, that's the scariest thing that could happen. And of course, you know, our lives are now online, and so search engines use algorithms, obviously. But really, in anything that you're doing in the public space, which is where I do a lot of my work, a lot of decision making that used to be done by humans is now being moved to being automated in some form.
Angela Glover Blackwell: (02:59)
And one of the things that you didn't mention, that I think about a lot, is that it's used to determine whether or not you can get a loan, which is so important for where we don't show up as people of color. Because of the racial wealth gap, we don't have a lot of history, don't own a lot of things.
Rediet Abebe: (03:13)
And people use different proxies. People use zip codes to make a decision, right? So it's not even what your credit history is or whatever it is that you're bringing to the table. They can also use your zip code, and if you live in a Black neighborhood, it automatically holds you back. And so there's a great deal of poorly conceived uses of algorithms out there that are doing a great deal of injustice.
Angela Glover Blackwell: (03:35)
A lot of these things do sound scary, because where you hope that some person, a compassionate person, is taking your particulars into account, to find out that this is all just happening with algorithms is unsettling. But go a little deeper and talk about how they disproportionately impact communities of color.
Rediet Abebe: (03:56)
Well, I'm one of the co-founders of Black in AI with my colleague, Timnit Gebru, and as part of Black in AI, we have this technical workshop series. I think it was two years ago, we had a keynote by this man, a Black guy; his name is Terrence Wilkerson. He had been wrongfully convicted for crimes that he didn't commit, and in some situations there was sort of like an algorithm that was involved. And we talked to him. He gave this talk and did a wonderful Q&A, and something that he said really stuck with me. He said, it's not even about what's right and wrong. It's about, you know, my humanity, essentially, right? I want to be judged as a human by a human. Because what he was saying was, look, you know, I'm this person. I have kids. I am an entrepreneur.
Rediet Abebe: (04:39)
He is a person; he's bringing his whole being to the stand when he's in front of a judge. And to have that be replaced by this not-really-accountable algorithm that just doesn't look at you as a human being, but rather looks at you as a bunch of data points, is an extremely dehumanizing process. It was really interesting to me how he emphasized that he wasn't even talking about the outcome. He was talking about the process, right? He was not happy about the process itself. And I think this very closely mirrors this country in which we live, right? This is the dehumanization of Black people, the injustice that Black people have had to suffer. It's not news in this country. And so to see that injustice being automated, building these tools without input from communities that are getting the shorter end of that stick, [inaudible]
Rediet Abebe: (05:30)
Building these tools by people who are never going to be negatively impacted by them is unfortunately unsurprising, but we are entering a really scary phase of what's to come. [inaudible]
Angela Glover Blackwell: (05:45)
Before we continue the conversation with Rediet, we're going to introduce you to Terrence Wilkerson, one example of the algorithmic bias and injustice she brings up. Terrence got caught up in the criminal legal system and was later flagged as high risk by a bail-related automated system, years after having been falsely accused of robbery. I sat down with Terrence for a phone interview where he shared more about his journey through the criminal legal system. For years, Terrence successfully fought against accusations that could have left him behind bars much longer than they did. Terrence, welcome to Radical Imagination.
Terrence Wilkerson: (06:23)
Thanks for having me.
Angela Glover Blackwell: (06:24)
Where did you grow up? Paint us a picture of what it was like to be in New York in those years.
Terrence Wilkerson: (06:29)
I'm gonna have to say exciting and entertaining, 'cause I grew up around the Harlem area, right around the corner from the Apollo. So there was a lot of things going on, and scary things at the same time, but you know, nothing too drastic. Then I got to the Bronx. You got to prepare yourself, and I wasn't prepared. So when I got to the Bronx, I noticed there was a lot of junkies and bad-looking buildings, and a lot of robberies and murders going on then.
Angela Glover Blackwell: (07:01)
What do you remember about the day when you were first arrested, in 1998?
Terrence Wilkerson: (07:07)
I went out with a friend. We was, um, doing a little marijuana joint at the time. So the cops had kind of pulled up. So I plucked the marijuana to the ground and they wound up finding it. So we both got arrested. So we were sitting in a cell and I was thinking like, this ain't about nothing, we'll be out. It's just a marijuana joint. So when I went to go see my lawyer, my lawyer, he says, oh yeah, you getting out for this. I said, I know. He said, yeah, but you got a problem. There's two detectives waiting for you. So I said why? And he said, 'armed robbery.' That's when things went haywire for me. From a marijuana joint, I wound up being charged with two different armed robberies. I went in front of the judge, and they said, you're released, but you gotta be detained by the two detectives. They going to take you out and ask you some questions.
Terrence Wilkerson: (08:11)
Next thing you know, I'm sitting in front of a lineup. I was new at the time; I didn't know what this was about. So I guess I was called in as a fill-in with people who were suspected of robbing someone, and I guess somebody pointed at me and said I was the one who did it. Next thing you know, I had to go in front of a judge, and I believe the judge gave me a bail, which was like $35,000. I wound up being sent to Rikers Island to fight my case. It seemed like I was in a losing battle. So I wound up pleading to something I didn't do, to receive one to three years.
Angela Glover Blackwell: (08:46)
How did you end up coming to the conclusion that this was the only way that you could get your freedom back: to plead guilty?
Terrence Wilkerson: (08:55)
They woke me up that morning and said I had court, which I didn't even know I had court. I was in that courtroom, and I never been in a courtroom. I know it was 7:30, 8 o'clock in the morning; I ain't even know the courtroom was open. But I was called into the courtroom with the judge and my lawyer. I didn't have no family members in the audience at the time because it wasn't time for court. So the judge says, the jury is right outside this door here. You can either plead guilty to the one to three, or you take your chances at trial, which could've brought, I believe, 25 years or something. Not sure. I was pretty young at the time, not even like 18, 20, not too sure. Once I looked back and I seen nobody was there, I didn't know what to do. So I just kinda bust out crying, uh, poked my little chest up, and said, I'll take the one to three. And I wound up going back upstate for another year and a half for a crime I never committed.
Angela Glover Blackwell: (10:03)
Terrence. How did this experience change your life?
Terrence Wilkerson: (10:08)
For me, it kept me isolated from people. I don't trust no one. It's hard to trust anyone.
Angela Glover Blackwell: (10:17)
Now I understand that that's the beginning of your involvement with the criminal justice system. What happened the second time that you were arrested?
Terrence Wilkerson: (10:26)
Now, this time it's different, because now I'm actually not on the street. I'm in my little sister's home, cleaning. So I figured right now is the time I can go outside; the house is clean, and now I would like a cigar. So I go right outside my building; right on the side of my building is a grocery store. So I went to the window, but it felt like, you know, there was a lot of people behind me. So I turned around and it's a crowd. I see, like, police, but I ain't paying it no mind; nothing to do with me. I got the stuff and went right back in the building. Three minutes later, not a knock. All you hear is, "boom, boom." They kick the door in, walk in, look at me and say, what's your name? Now, right there, out the gate, I'm already prepared for this.
Terrence Wilkerson: (11:26)
Like, first of all, I'm not going to tell you nothing. And first of all, what are you even doing in here? That was my exact words. I'm sitting in my chair; him and his partner walked right to me, stood me up, took me out of the house, and stood me in front of the stoop I just finished walking off of, and told me to look straight. Next thing you know, I'm being placed inside a police car. Next thing you know, I'm being driven to the precinct. Next thing you know, I'm in a cell. I still don't know what's going on. Finally the officer walks by, so I say, I'm like, somebody's got to tell me what I'm in here for. He don't tell me; he passes me the paper. Here we go again: armed robbery. I just robbed somebody for $600 and a cell phone. Right from there, once he told me that in that cell, the crying is over.
Terrence Wilkerson: (12:22)
I stopped crying in 98. So now I'm just pissed. But when I went in front of that judge, the judge say, okay, what evidence do you have against him? The DA says, none, your honor, we don't have nothing. At the time, I was supposed to have a cell phone and money; nothing in my possession. They say they don't have nothing at the time. And the judge said, okay, bail set at such and such. The same thing. Okay, back in the cell I go. So now I'm crying. I'm being shipped back to Rikers Island to fight the case again.
Angela Glover Blackwell: (13:01)
And here you are again, but it was $25,000 bail. In a previous conversation with your lawyer, they explained that this time you were being considered as high risk of not showing up for your court date or of committing another crime. That high risk label was put on you by what is known as an algorithmic risk assessment tool and that tool or software used your false criminal history to make this determination. How does that make you feel?
Terrence Wilkerson: (13:31)
Oh, it put me in a hospital. I take [unintelligible medicine] now from messing with that second case. That's the bottom line to that. I just went through it, and I haven't been in one single bit of trouble since 98, until that problem. And I'm still being judged. I'm still getting a bail set. I'm still considered not coming to court. How am I not coming to court if I don't have a court case to go to? But you still got me as a high risk of not coming in. Okay, I'm confused. You have a computer contemplating how much of a risk I am? That don't even make sense.
Angela Glover Blackwell: (14:10)
Why doesn't it make sense?
Terrence Wilkerson: (14:11)
I'm saying, if you, um, go off of the first time I got arrested in my whole entire life, and say I didn't get arrested in, say, ten years after that, or whatever the case, that thing must still be considering me a high risk. That's not fair.
Angela Glover Blackwell: (14:27)
You had kids, you had a job. You were interacting with your sister and your family and your brother. You had been through something that wasn't right in the first place. And you had made adjustments. Does the computer know any of that?
Terrence Wilkerson: (14:41)
And I'm still being judged the wrong way. Back to that algorithm thing that was judging: how are you going to see a person you can't even see? Who's programming these things? How is this being programmed? I don't get it. I stays in the house, inside. I don't walk the streets alone, and when I do, I make sure I'm protected by cameras. Everything is a calculated step now. I've stayed close to my family. Anybody will tell you, they're telling me: where you going, right? Make sure you go straight to the house. Call me when you get there. These are the things I got to go through every single day now, just to keep me safe.
Angela Glover Blackwell: (15:30)
Terrence, thank you for speaking with us.
Angela Glover Blackwell: (15:34)
Terrence Wilkerson is a father and a survivor of the criminal legal system. Terrence and his lawyers fought his case, and he was acquitted of the robbery charges at a jury trial. Coming up on Radical Imagination, we continue the conversation with computer scientist, professor, and co-founder of Black in AI, Rediet Abebe.
Angela Glover Blackwell: (15:56)
Stay with us, more when we come back.
Angela Glover Blackwell: (16:19)
And we're back.
Angela Glover Blackwell: (16:20)
When we left our conversation with computer science professor Rediet Abebe, she talked about the impact of faulty algorithms on the lives of Black and Brown people. Now let's shift our attention to the radical work she and others are doing to transform the use of AI for social good.
Angela Glover Blackwell: (16:38)
So you're getting right into why it is so important that we really aggressively pursue more diversity in the field. Talk about that diversity: how we get there, and what the impacts are when we have it.
Rediet Abebe: (16:50)
Right, yeah. So I think a lot of times we think of diversity as a sort of afterthought, right? We've already decided what our criteria are for admitting undergraduate students, graduate students, faculty, whatever it is; we've already come up with a bunch of things. And then we look at the outcome and we say, oh, there's not that many Black people, I guess we should probably think about diversity, right? It becomes an afterthought. And that's just been detrimental to everyone. It really, really has been. I really think that every single person is losing as a result of this. And so I think it's no surprise that the area where I've had the most success, in terms of just being able to articulate a vision that I think is compelling and that others might also agree with, is also the area that's impacted me and my communities directly, right?
Rediet Abebe: (17:39)
Because to me, it's not something I check in on at 9:00 AM and then leave at 5:00 PM. It's not a nine-to-five job. This is my life. This is my people's life. And so I understand better, as a result, what is going wrong here and how people are being impacted by it on the ground, right? And so there's this recognition that if we artificially constrain ourselves to a subset of society, we're also going to constrain the perspectives that would be on the table: the input, the ideas, and the innovative potential solutions. And I think it's a cruel decision to artificially shut out a bunch of people in this way. And so it makes sense to me that the people who have to live these problems are also the ones for whom there's a lot at stake, and who maybe actually have, you know, a very rich perspective on what the way forward would be.
Angela Glover Blackwell: (18:25)
Now you are talking about the disconnect between lived experience and the people who are researching and understanding and thinking about artificial intelligence and algorithms. But there's a different disconnect, and that's the one between the people who introduce and deploy algorithms and those who regulate: the disconnect between policy makers and the field. You have been working on that too, with the California Policy Lab, right?
Rediet Abebe: (18:52)
I'm working with Berkeley folks on, you know, what would it look like for us to train, let's say, graduate students in computer science and AI who are also doing policy-driven work? What would it look like to link them up with the California Policy Lab, right? And to have that be part of their training, not just something that they do after they finish their graduate degrees. And so there's a lot of work to be done here about how to create those links, because we have those things with industry, but we don't necessarily have them with policy. And we need to do that.
Angela Glover Blackwell: (19:26)
I sometimes wonder, even though marginalized communities really can be hurt through the use of algorithms, do you find that there are uses that can actually fight inequality and inequity?
Rediet Abebe: (19:40)
Yes. Yes. And that is where I come in, right? So I think that, yes, algorithms are harming people, and we should stop that. But what would it look like if the field of algorithms was really being led by marginalized communities? How would we use it? Would we build the same things that have been built, or would we build something different? And that, I think, is where I feel an immense amount of opportunity and, honestly, even optimism.
Rediet Abebe: (20:13)
There are a lot of decisions that we've made about criminal justice, independent of algorithms, that are very dehumanizing. What's happening is that you take this problem and then you basically put it on steroids when you add a sort of automated system to it. And so I think people are coming to this collective recognition that there were these issues of discrimination and injustice and inequality that pre-date any automated system, and we've made them worse. And I think we should fix that. But also, actually, we don't totally understand exactly how inequality and discrimination always work. And so there is an opportunity here for algorithms to be used to come up with better measurements, right? To understand the extent of some form of discrimination or inequality, to try to come up with interventions that can help move the needle in the right direction, and things like that. And so that's something that I'm really, really excited about: this positive use of algorithms for equity and for justice.
Angela Glover Blackwell: (21:04)
How does mechanism design for social good go about its work? And describe an area in which you're seeing it applied.
Rediet Abebe: (21:12)
In terms of how we go about our work: what we do is, we understand that equity- and justice-driven work requires really, truly understanding what the problems are. And you can't do that in, like, a computer science bubble; that just isn't going to happen. There is a working group specifically on inequality, and we are working closely with policy makers and with organizations like Benefits Data Trust, which is a nonprofit organization working to make public benefits accessible to a wider range of the population, right? Because we know that there's a lot of barriers that people face. We're working with the state of California to understand what holistic allocations look like. Because right now, a lot of times what we do is we say, oh, someone is experiencing housing instability, let's try to help them there. Someone is experiencing, let's say, food insecurity.
Rediet Abebe: (22:01)
Let's help them there. Their education, let's help them there. You know, we think of it in this very siloed way, but the reality is that people experience multiple forms of disadvantage all at once. So the solutions that we're bringing to the forefront should also be in recognition of that, and we don't always do that. And so we're working with the California Policy Lab to understand what a holistic allocation would even look like in this space: ensuring that the research questions that we ask are driven by policy and practice, but then also making sure that the research that we do ends up feeding back into policy and practice.
Angela Glover Blackwell: (22:37)
I am enjoying talking to you, because you're bringing so much imagination to an area that is well known among the people who are doing it. But I tell you, this is a black box for most of society, and it's impacting their lives in the most profound ways. And the horse is well out of the barn. Is there still time for us to change course and fix the mechanisms that are currently exacerbating inequality and inequity?
Rediet Abebe: (23:03)
My grandfather fought in the Ethiopian-Italian war. He died when he was quite old, actually, so I got to meet him and talk to him several times. And it was really interesting to me to see how he approached his contribution there. You know, it was really difficult for him, but he was like, look, we're fighting for something that we may not necessarily get to live and experience, right? And I think the struggles here are kind of similar, right? Yes, we're pushing against the use of algorithms in this space, but we're pushing against decades and centuries of discrimination and injustice. I don't expect that in my lifetime, you know, things will happen and I'll feel like that battle is won. But I think if we're all just making sure that we're moving things in a positive direction, things will keep getting better. And so I try to maintain that perspective.
Angela Glover Blackwell: (23:58)
You are certainly right. And as someone who has been in this fight for decades now, I must say this moment feels more ripe for change than any I have seen. This has been fascinating. Rediet, thank you so much for talking with us.
Rediet Abebe: (24:14)
Thank you, Angela.
Angela Glover Blackwell: (24:18)
Rediet Abebe is an assistant professor of computer science at the University of California, Berkeley and co-founder of Black in AI.
Angela Glover Blackwell: (24:25)
Technology is almost never inherently good or bad. It's all about how it's used, and toward what end. As Terrence Wilkerson's story painfully illustrates, right now algorithms, machine learning, and artificial intelligence cause enormous harm to people society already marginalizes and discriminates against, people of color especially. Terrence sums up the problem perfectly: How are you going to see a person you can't even see? Those words lay down a challenge that Rediet Abebe is taking on. How can we use advanced technology, with its ability to recognize patterns, not as a tool to further marginalize and oppress people, but as an instrument that helps create broad wellbeing and belonging?
Angela Glover Blackwell: (25:16)
This is the opportune moment to push that question. The federal government, the private sector, and civil society have all made commitments to racial equity, the essential element for belonging. They must work together to rethink the design and applications of these technologies. Certainly, leaders have a responsibility to stop the harms these tools cause, but that's only a starting point. Let's harness the power of our best technology to repair the harms of centuries of structural racism, expand opportunity, and build an equitable society in which all can participate, prosper, and reach their full potential. We and our partners at Unfinished invite you to reflect and respond to this question: "What is essential about you that you would hope algorithms could reflect?" Submit your response at radicalimagination.us or on social media using #RadicalImagination and #ThisIsUnfinished.
Angela Glover Blackwell: (26:32)
Radical Imagination was produced by Futuro Studios for PolicyLink. The Futuro Studios team includes Marlon Bishop, Andres Caballero, Ruxandra Guidi, Stephanie Lebow, Jess Alvarenga, and Julia Caruso. The PolicyLink team includes Glenda Johnson, Rachel Gichinga, Ferchil Ramos, Eugene Chan, Fran Smith, Jacob Goolkasian, and Vanice Dunn. Radical Imagination is supported by Omidyar Network, the David and Lucile Packard Foundation, Pivotal Ventures, a Melinda Gates company, and Unfinished. Our theme music is composed by Taka Yasuzawa and Alex Sugiura. And I'm your host, Angela Glover Blackwell. Join us again next time. And in the meantime, you can find us online at radicalimagination.us. Remember to subscribe and share.
Angela Glover Blackwell: (27:24)
Next time on Radical Imagination.
Upcoming Episode: (27:38)
Vaccine makers under pressure for the second day in a row after the White House says it backs waiving intellectual property rights.
Angela Glover Blackwell: (27:46)
The patent industry and the Covid vaccine. That's next time on Radical Imagination.