On Thursday evening in Mills Hall, Dr. Ruha Benjamin, professor of African American studies at Princeton University, delved into the dangers lurking behind technological innovation amid the rise of artificial intelligence (AI). Benjamin explored the problems with technology that deepens discrimination while appearing neutral and even benevolent.
Benjamin began her talk with a line from Black feminist writer Toni Cade Bambara: “Not all speed is movement.”
“I can apply that [line] to so many areas of my life, whether we’re thinking about all the things we’re racing towards, in terms of career, in school, but here I want us to apply it to this idea of technological speed, the race to market the things that we have, to skip over in order to beat our competitors, to get our thing out first, and to become more critical of this idea of moving fast and breaking things,” Benjamin said.
Benjamin cited the recent example of the UK government’s cancellation of GCSE exams during the COVID-19 pandemic. Without the primary testing system for college admissions in the UK, admissions offices instead relied on algorithmic methods to predict students’ scores. As a result, schools with predominantly lower-income students of color received lower predicted scores than wealthier institutions did.
“This refers to the way [schools’] postal codes were used to predict scores, which is a reminder for us that technology is not creating the problems, it’s reflecting and amplifying, often hiding, pre-existing forms of inequality and hierarchy,” Benjamin said.
With this context, Benjamin characterized the current debate over technology’s implementation, in which some believe technological advancement will destroy humanity while others believe it will bring greater progress for the human race.
“There are two stories that we often tell each other about the relationship between technology and society: The first story is what we might call the techno-dystopian narrative, this idea that technology is going to slay us. It’s going to take away our human agencies, take the jobs, The Terminator, The Matrix … [The second story] is the techno-utopian narrative. The idea that technology is going to save us, make everything more efficient, more fair,” Benjamin said.
However, to Benjamin, the very premises of these narratives expose an inherent problem with this binary debate.
“While these sound like opposing narratives, they have different endings … they actually share an underlying logic we might call a techno-deterministic logic,” said Benjamin. “This [is the] idea that technology is determining us, but the humans behind the screen are missing from both scripts—the values, the assumptions, the ideologies, the desires that shaped our digital invisible world.”
Benjamin illustrated the discriminatory effects of technology with the example of a public park bench covered with spikes that retract only when individuals pay. The bench symbolizes how only the wealthy benefit from some technological advancements.
“If we think with the [public] bench, thinking about this idea of discriminatory design and asking ourselves what are the spikes that are built … into our digital structures, we could apply it to generative AI, to many other things, looking for the forms of harm and exclusion that are encoded in the structure,” said Benjamin.
Benjamin argued that critical thinking about the impacts of technology needs to go deeper. She extended the analogy of the spiked bench to explain the dangers of technology that take less conspicuous forms.
“The more challenging task is to be able to see the insidious spikes that aren’t in our face, that are still embedded in the structure,” said Benjamin. “For other benches, the surface looks very welcoming, inclusive, and diverse. And yet we see that the underlying structure hasn’t changed much. Our institutions are very good at this, using buzzwords, platitudes, statements, to say one thing, but do something very different.”
Professor of Digital and Computational Studies (DCS) Eric Chown said Benjamin’s talk highlighted the purpose of the department at Bowdoin.
“[Benjamin] is interested in the exact kinds of issues we’re interested in digital and computational studies: How does technology impact issues of equality, justice, and racism? In many cases, technology is presented as the solution,” said Chown. “[But] algorithms are often just as biased or more biased than people are. And now you have this problem of things that are being presented as fair as being biased. So, Dr. Benjamin is one of the leaders in taking a critical look at how technology often amplifies racism, how it often amplifies inequality.”
Daniel Wang ’26, a prospective DCS major who attended the talk, was similarly appreciative of Benjamin’s perspective.
“The example of spikes was a very interesting way to conceptualize the risks of AI especially,” Wang said. “Dr. Benjamin had a very good way of illustrating these problems through concrete examples.”