Yes, San Francisco is an epicenter of artificial intelligence development, but it’s also one of the queerest cities in America. The Mission District, where ChatGPT maker OpenAI is headquartered, butts up against the Castro, where crosswalks are painted in rainbow colors and older, skinny people are often seen milling around.
And queer people are joining the AI boom. Spencer Kaplan, an anthropologist and PhD student at Yale who relocated to San Francisco to study the engineers building generative AI tools, says that “a lot of the people in this space are queer men, which is something I don’t think gets talked about.” Sam Altman, the CEO of OpenAI, is gay; he married his husband last year in a private, seaside ceremony. More queer people are now working in AI and connecting with groups like Queer in AI, both in California and beyond.
Founded in 2017 at a top scientific conference, Queer in AI has a core mission of supporting LGBTQ researchers and scientists who have historically been marginalized, especially transgender people, intersex people, and people of color. “Queer in AI, honestly, is the reason I didn’t drop out,” says Anaelia Ovalle, a PhD candidate at UCLA who researches algorithmic fairness.
But there’s a disconnect between the queer people working on artificial intelligence and how that same group is represented by the tools their industry is building. When I asked the leading AI image and video generators to depict queer people, they consistently returned stereotypical depictions of LGBTQ culture.
Despite recent improvements in image quality, AI-generated images frequently presented a simplistic, whitewashed version of queer life. I used the AI tool Midjourney to create portraits of LGBTQ people, and the results amplified commonly held stereotypes. Lesbian women are shown with nose rings and stern expressions. Gay men are all fashionable dressers with killer abs. Basic images of trans women are hypersexualized, with lingerie outfits and cleavage-focused camera angles.
How image generators depict humans reflects the data used to train the machine-learning models underlying them. That data is largely gathered by scraping text and images from the internet, where depictions of queer people may already skew stereotypical, such as gay men appearing sexy and lesbian women appearing butch. Users may encounter similar biases when using AI to create images of other minority groups.
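To make the probing process concrete, here is a minimal sketch of how such an audit could be scripted, assuming access to OpenAI’s image-generation API via its official Python SDK; the model choice, prompts, and sampling loop are illustrative assumptions on my part, not the method used for this article.

```python
# Minimal sketch of a bias probe against an image generator.
# Assumes the official OpenAI Python SDK (`pip install openai`) and an
# OPENAI_API_KEY set in the environment; the prompts are illustrative only.
from openai import OpenAI

client = OpenAI()

# Identical phrasing across groups makes differences in output easier to compare.
PROMPTS = [
    "a portrait photo of a lesbian woman",
    "a portrait photo of a gay man",
    "a portrait photo of a transgender woman",
]

for prompt in PROMPTS:
    # dall-e-3 accepts only one image per request, so repeat requests
    # to collect multiple samples per prompt.
    response = client.images.generate(
        model="dall-e-3",
        prompt=prompt,
        size="1024x1024",
        n=1,
    )
    # Print the URL of each generated image for manual review.
    print(prompt, "->", response.data[0].url)
```

Generating many samples per prompt and reviewing them side by side is what makes pattern-level stereotyping visible, rather than any single odd image.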