MIT scholar says she is working to create ‘feminist AI’
Scholars raised concerns about how artificial intelligence may define the word “woman” or be used to perpetuate “colonialism” during a “Gender and AI: Promise and Perils” conference Friday at Harvard University.
The Harvard Radcliffe Institute organized the two-day event to “explore present and future challenges and opportunities posed by the intersections of gender and artificial intelligence (AI).”
To ground the discussion of those problems, computer science Professor Fernanda Viégas began with a general overview of how AI works.
In traditional programming, computer scientists have to tell the computer every rule in order for it to function properly, the Harvard professor said. With AI, however, they give the computer huge amounts of data and then allow it to figure out its own rules, she said.
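The distinction is easy to see in miniature. The following sketch (a hypothetical illustration, not code shown at the conference; the spam-filter example is invented for this article) contrasts a hand-written rule with a scikit-learn model that infers its own rule from labeled examples:

```python
# Hypothetical illustration of the contrast Viégas described.

# Traditional programming: every rule is written by hand.
def is_spam_rule_based(message: str) -> bool:
    banned_phrases = {"win money", "free prize", "act now"}
    return any(phrase in message.lower() for phrase in banned_phrases)

# Machine learning: the computer is given labeled data
# and figures out its own rules.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = ["win money now", "free prize inside",
            "lunch at noon?", "meeting moved to 3 pm"]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)  # the "rules" now come from the data

print(is_spam_rule_based("Claim your free prize"))     # True
print(model.predict(["claim your free prize today"]))  # [1]
```

In the second approach, whatever patterns happen to be in the examples become the model's rules, which is what makes the quality of the data so consequential.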
This is both a “blessing and a curse,” because computers can sometimes figure out patterns that humans miss, Viégas said. However, AI often uses “biased” data to identify patterns, she said.
For example, how an AI system arrives at a definition of the word “woman” depends on the data it has access to, she said.
“Autocomplete, as simple as it sounds, is incredibly powerful,” Viégas said. “Everything rests on the data we feed it. So if we feed it very biased data, we are going to get very biased systems to come out.”
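Her autocomplete point can be reproduced in a few lines. The toy sketch below (a hypothetical example constructed for this article, not one presented at the conference) builds a simple next-word predictor from a deliberately skewed four-sentence corpus and faithfully echoes its bias:

```python
from collections import Counter, defaultdict

# Hypothetical, deliberately skewed training text: "doctor" is always
# paired with "he" and "nurse" with "she".
corpus = ("the doctor said he would help . the doctor said he was busy . "
          "the nurse said she would help . the nurse said she was busy .").split()

# Count which word most often follows each pair of words.
next_word = defaultdict(Counter)
for a, b, c in zip(corpus, corpus[1:], corpus[2:]):
    next_word[(a, b)][c] += 1

def autocomplete(a: str, b: str) -> str:
    """Suggest the most frequent continuation seen in the training data."""
    return next_word[(a, b)].most_common(1)[0][0]

print(autocomplete("doctor", "said"))  # -> "he"
print(autocomplete("nurse", "said"))   # -> "she"
```

The model has no view of its own; it simply returns the most common continuation in its training data, which is precisely the dynamic Viégas described.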
In a panel discussion that followed, Massachusetts Institute of Technology Professor Catherine D’Ignazio said data input is a “huge” and “unsolved” problem with AI. She teaches urban science and planning and serves as director of the Data + Feminism Lab at MIT.
Machine learning and AI are “status quo machines,” meaning they replicate the gender biases that exist in society today, she said.
What’s more, “a small number of very large corporations … have basically gone out and sort of stolen everybody’s data, stuffed it into these giant models, and are currently competing in a kind of AI arms race for compute power,” D’Ignazio said.
Among the many impacts, she said, is that financially driven capitalist companies tend to develop technology only for certain groups of people, leaving other groups marginalized.
“The current way that technology development is proceeding is extremely non-democratic,” D’Ignazio said.
“I think it’s really no accident that the same logics that have produced the AI that we are experiencing are the same logics that are currently in operation in the U.S. that are tending toward authoritarianism and fascism and eugenics,” she said. “We need to note these connections and we need to think about how do we … reorganize the entire process of development to be more democratic.”
In her lab at MIT, she said scholars are working to create “feminist AI from the ground up, which really means reconsidering the whole organization of the process.”
Another panelist, Margaret Mitchell, said the problem of underrepresented groups in AI is linked to “colonialism.” Mitchell is the chief ethics scientist and researcher at Hugging Face, which runs a collaborative AI platform for models, datasets, and applications.
“The issue of why can’t we have lots of people at the table making their own decisions for AI for everyone is, in part, because these communities are often already subject to colonialism dating back hundreds of years where global north and white people are dominating people in the global south,” she said.
Representing everyone well in AI technology is “tricky” because these groups are still “grappling with hundreds of years of oppression,” Mitchell said.
A lot could be done to create better accessibility for people with disabilities and the elderly through AI, but these people tend to be an “afterthought” for large corporations, she said.
But some computer scientists are taking up the task. Recently, one group used open-source code to create a program to develop more accessible sidewalks in Seattle for people with disabilities, she said.
D’Ignazio gave another example, saying AI could be used to help tenants secure housing. Tech companies have developed a number of new programs to help landlords review credit histories, screen potential tenants, and manage properties, but she said resources for renters are lacking.
Gender plays a part in this, too, because more women tend to be renters and single women of color tend to be most harmed by evictions, she said.
Tech development tends to start with the “median human … which is always a white man, middle-aged body, cisgender,” D’Ignazio said.
However, D’Ignazio said she does have hope for the future.
“With all the attacks on DEI, the whitewashing of U.S. history, the antisemitic comments and Nazi salutes at the inauguration, all of this crazy stuff, it’s designed to spread fear, and it’s designed to silence us, and it’s designed to make us shift what we are doing to other directions for various reasons,” including funding, she said.
Yet the Trump administration and other powerful forces have not completely taken over, she said, despite the recent cuts to DEI-linked research funding.
“For example, I just got [a National Science Foundation] project funded just this week, and it has so many of the censored words in it. I was like, ‘Oh, this is cool, because this means the takeover is not complete.’
“One of my messages here would be: Stay the course,” D’Ignazio said. “There are so many of us out here who do care about equality, who care about democracy, and I include lots of men and lots of white men in that statement.”
IMAGE CAPTION AND CREDIT: Harvard computer science Professor Fernanda Viégas speaks during a ‘Gender and AI’ conference. Harvard Radcliffe Institute