Artificial intelligence image generators have developed a reputation for bias. For instance, Bloomberg reported that when prompted to generate images of different professions, the AI model Stable Diffusion mainly created White men, particularly in higher-paying occupations.
Valeria Stepanova, a professor of chemistry at Viterbo University, wanted to know if this observation applied to her scientific discipline. “I’ve always been interested in how diversity is represented in my field,” Stepanova says.
So she and her colleagues tested four AI image generators—Adobe Firefly, DALL-E 2, Craiyon, and DreamStudio—providing each with carefully engineered prompts pertaining to chemistry-specific occupations. The researchers then classified the gender and race of 200 AI-generated chemists, categorizing each as male or female and as African American, American Indian, Asian and Pacific Islander, Hispanic, or White. They also scanned every image for visible signs of disability.
Stepanova and her team found that, together, the AI models generated three times as many images of male chemists as female chemists. Most of the images showcased White chemists, and none depicted anyone with visible disabilities (J. Chem. Educ. 2024, DOI: 10.1021/acs.jchemed.4c00249).
“What you’re getting is this really skewed perception” of a chemist, Stepanova says. “You have to use a really specific prompt if you want to represent a certain category of people. Otherwise, they just disappear.”
The researchers also found that some AI models generated higher percentages of women when the word “assistant” was added to the prompt. For example, women appeared in over 80% of images when DreamStudio was asked to display “chemistry laboratory assistants.”
Seeing such stereotypical images of chemists could impact the ability of women, people with disabilities, and marginalized racial and ethnic groups to picture themselves in chemistry, Stepanova says. “As they see more images out there that don’t look like them—thousands of images, millions of images—how do you find any sense of belonging?”