Princeton scholar provides tools to analyze social impacts of technological advancement

KINGSTON, R.I. – Feb. 28, 2024 – There are two popular storylines that envision technology’s unfettered growth and its effects on society.

There’s the Hollywood version, the “techno-dystopian” narrative popular in movies like “The Terminator,” where technology destroys or enslaves humans, said Princeton University scholar Ruha Benjamin during a recent lecture at the University of Rhode Island. And there’s the “techno-utopian” vision, in which technology saves us, making our lives easier, safer and more equitable.

“While these sound like opposing narratives, they actually share an underlying logic—this assumption that technology is in the driver’s seat, propelled by a will of its own,” said Benjamin, professor of African American studies at Princeton and founding director of the Ida B. Wells Just Data Lab, whose research includes exploring the relationship between innovation and inequality. “[The] humans behind the screen are missing from both of those scripts.”

Though unseen, these people are central. Despite being a “small sliver of humanity,” they design and deploy the technology – from algorithms to AI to big data – that “shapes our shared future.” Behind the screen, their values, assumptions, interests and ideologies shape the technology, often perpetuating pre-existing social inequities, Benjamin said.

In her 40-minute lecture before about 400 people in the Higgins Welcome Center and on social media, Benjamin urged the audience to critically evaluate how technological innovations impact society and to help broaden who gets a say in technology’s creation.

“I really want people who don’t identify as a ‘techie’ to know that they have a say,” she emphasized at the end of the event. “You need to have a say as a citizen, as a person living in this world in which these tools are being deployed.… I’m trying to open up the conversation to include people from different disciplines, life experiences and levels of technical expertise. There are other kinds of expertise and experiences that you have that are vital to us thinking collectively about the way [the future] should be democratically shaped.”

Benjamin said it is important to consider the drivers of the technological tools, the context around them, and the racial, labor and environmental impacts when evaluating the benefits of new technologies.

As an example, Benjamin cited a 2021 ad that heralded the new design of Amazon’s Alexa, showing a Black woman imagining a “more flawless vessel” for the virtual assistant – the actor Michael B. Jordan. The ad may seem to disrupt the “prototypical whiteness of technology” or promote inclusive design. But Benjamin juxtaposed the flashy marketing campaign with the fight of Amazon workers to unionize and gain fair working conditions.

“If we asked this Black woman [in a news photo of a protest] what her idea of inclusive technology is, would it be sexy Black Alexa?” said Benjamin. “Or would it be changes to the algorithms that are shaping her work, intensifying the pace at which she has to box these items such that, as she says here, they’re breaking us?”

“Moving beyond the buzzwords, looking behind the scene of what’s actually happening is an important starting point for any analysis,” Benjamin said. “And so as a provocation, I want to suggest that when it comes to so-called deep learning and AI, computational depth without social and historical depth is, in my view, superficial learning because it narrows our focus on only the technical dimensions of these technologies and the context gets lost, for better or worse.”

Turning to technology as a quick fix for social inequities, such as racial bias, often fails to solve the underlying problem, Benjamin said, citing a health-care algorithm and other tools that only perpetuate implicit bias.

Benjamin answers audience questions after her Feb. 22 lecture with Ammina Kothari, director of the Harrington School of Communication and Media. (Dylan Ruggieri/Harrington School Student)

“These quick tech fixes that don’t get to the roots of the social inequities are often hiding or papering over these realities in ways that we should question,” she said.

In her book “Race After Technology: Abolitionist Tools for the New Jim Code,” Benjamin evokes the Jim Crow era in discussing how today’s technology can reproduce that era’s discrimination. “In some ways, it’s more dangerous. At least in my grandma’s generation when she would have walked into a hospital and saw a big whites-only sign, [she would go] to the Negro wing,” Benjamin said. “I can walk through the front door, assuming I’m getting equal treatment, but there may very well be a health-care algorithm making decisions about my care, creating the same pattern of resource allocation, and I don’t even know about it.”

In urging the audience to continue applying context when analyzing and framing the conversation around technology and society, Benjamin used the example of a public bench with spikes to illustrate the harmful effects that may initially lie hidden in a design.

“For most of us, we don’t build the bench from scratch,” she said. “You’re going to start a new job or an internship and you’re going to start to notice the spikes – this form of harm or exclusion. The question becomes what is your responsibility? Do you just put your head down? Or do you look around and see if anyone else notices the spikes and start to do something about them?”

Benjamin’s lecture was sponsored by the Harrington School of Communication and Media, Center for Computational Research, College of Arts and Sciences, Office of the Provost, College of Health Sciences, College of Pharmacy, College of Engineering, College of Business, and Department of Political Science.