What makes technology tick? What fuels all the apps and algorithms of our lives?
Data is what lets computers build neural networks—computing systems inspired by the human brain. It’s the fodder from which machines learn, find patterns, and build artificial intelligence.
But here’s the catch: data isn’t neutral. It’s littered with human biases, says Assistant Professor of Computer Science Hannah Wolfe, whose work in robotics and interactive art brings awareness to these biases and takes steps to reduce them.
“We need to be really careful about the data that we use to train machine-learning algorithms,” Wolfe said. “Because they will be as biased as the data that we use as input.”
Take the task of generating text, for example. Ask a computer to generate a sentence and its response depends on the data used in its algorithms. Train a neural network on news sites, Wolfe gives as an illustration, “and the text that the neural network will generate has all the same biases that the writers on those news sites have.”
Wolfe used just such a neural network in her most recent interactive artwork, the award-winning Cacophonic Choir, ideated in the wake of the #MeToo movement. The AI-driven sound installation consists of nine translucent spheres that emit computer-generated narratives from sexual assault survivors. From far away, the narratives are fragmented and distorted. Get closer to a sphere and they become clear.
Wolfe began with a neural network previously trained on Reddit, a social-news aggregation website. But to make the narratives in Cacophonic Choir authentic and specific, she needed different data. She found it on the When You’re Ready project website, where sexual assault survivors share their real-life stories.
Wolfe took a sampling of the stories and designed a machine-learning algorithm to retrain the original network bit by bit until the text that was generated became less random and more like survivors’ stories.
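The retraining idea can be illustrated with a toy model. Wolfe’s installation used a real neural network; the sketch below stands in for it with simple bigram counts (which word tends to follow which), and the two tiny “corpora” are invented for illustration. “Pretraining” fills the counts from one source; “fine-tuning” adds weighted counts from a second source, pulling the generated text toward the new voice—the same bit-by-bit shift the article describes.

```python
from collections import defaultdict, Counter

class BigramModel:
    """Toy stand-in for a text-generating neural network:
    tracks how often each word follows another."""
    def __init__(self):
        self.counts = defaultdict(Counter)

    def train(self, corpus, weight=1):
        """Add (word, next-word) counts; `weight` lets fine-tuning
        data count more heavily than the pretraining data."""
        words = corpus.split()
        for a, b in zip(words, words[1:]):
            self.counts[a][b] += weight

    def next_word(self, word):
        """Return the most likely follower of `word`, or None."""
        followers = self.counts.get(word)
        if not followers:
            return None
        return followers.most_common(1)[0][0]

model = BigramModel()
# "Pretrain" on a generic corpus (invented example text).
model.train("the market rallied and the market closed higher")
# "Fine-tune" on new data with extra weight, shifting the model's voice.
model.train("the survivor spoke and the survivor was heard", weight=5)
print(model.next_word("the"))  # fine-tuning shifted "the" -> "survivor"
```

Before fine-tuning, the model continues “the” with “market”; afterward, the heavily weighted new data wins out—a miniature version of retraining a network until its output sounds less like its original training set and more like the survivors’ stories.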
The resulting neural network was more inclusive, reflecting the language of survivors, most of whom are women. And it makes Cacophonic Choir a powerful example of Wolfe’s work at the intersection of technology, art, and gender.
It’s work that’s garnering widespread attention.
In August Cacophonic Choir won Best in Show–Art Gallery at SIGGRAPH 2020, the world’s leading computer graphics and interactive techniques conference. Media attention from the tech and business worlds followed, as did an invitation to participate in the IEEE VIS2020 Forum. A paper on the project by Hannah Wolfe and her collaborators, Şölen Kıratlı and Alex Bundy, was published in Leonardo, an international journal on the use of technology and science in music and the arts.
The juncture of arts and computer science is of increasing interest among students at Colby, said Wolfe, who relishes the opportunity to work with and mentor them. Students like Rayna Hata ’23, a computer science major with minors in math and Japanese who was Wolfe’s research assistant last summer.
Hata was the only student on the team that created a web version of Cacophonic Choir. “Hannah said, ‘Here’s a brand-new project, and you have to create a virtual world all by yourself. Ready, set, go!’” Hata recalled.
Using the gaming platform Unity, Hata created the 3D virtual room that the spheres occupy. Next, she wrote code to modify the visual qualities of the spheres based on their distance from the user.
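The distance-based behavior—narratives fragmented from afar, clear up close—can be sketched in a few lines. The actual web version was built in Unity; the Python below is only illustrative, and the distance thresholds, units, and word-scrambling distortion are all assumptions, not the project’s real parameters.

```python
import random

def clarity(distance, near=1.0, far=8.0):
    """Map the user's distance to a sphere onto a 0-1 clarity value:
    1.0 right next to it, fading to 0.0 beyond `far`.
    (Thresholds and units are hypothetical.)"""
    if distance <= near:
        return 1.0
    if distance >= far:
        return 0.0
    return (far - distance) / (far - near)

def distort(text, clarity_value, seed=0):
    """Scramble a fraction of words proportional to (1 - clarity),
    a stand-in for the installation's actual audio/visual distortion."""
    rng = random.Random(seed)
    words = text.split()
    n_garble = round((1 - clarity_value) * len(words))
    for i in rng.sample(range(len(words)), n_garble):
        words[i] = "".join(rng.sample(words[i], len(words[i])))
    return " ".join(words)

# Up close (high clarity), the narrative passes through untouched.
print(distort("they become clear as you approach", clarity(1.0)))
```

The same clarity value could just as easily drive a sphere’s opacity or audio filtering—the design point is a single distance-to-clarity mapping that every distortion effect reads from.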
The opportunity to work on a project like this after just one year at Colby was thrilling, said Hata—and unexpected. She never imagined getting involved in a social issue through computer science. “The ability to help others speak out against sexual abuse and sexual violence as a computer scientist was just amazing.”
Cacophonic Choir and other works by Hannah Wolfe make bold statements about gender representation in technology. In general, she’s “trying to focus on marginalized voices and marginalized people and having technology integrate their stories, their lives, and their experiences.”
Only then will technology represent not just some of us, but all of us.