Computer science Professor Colleen Lewis is something of a social activist superhero whose archenemy is implicit bias in her field. Though she and others like her have made progress, stereotypes of the male computer “nerd” or its obnoxious younger brother, the “brogrammer,” are still prevalent. Lewis challenges her students to recognize and combat these stereotypes through increased awareness.
“One of the things I think people should learn is what a microaggression is,” says Lewis, referring to a concept that is sometimes also described as “unintended discrimination.” Microaggressions are the use of common social behaviors and expressions that, even when used without conscious malicious intent, can have the same impact as intentional discrimination. Lewis sees understanding this concept as key to her success as a professor and to her students’ success in their careers. “We each have our own sphere of influence, right?” she asks. “So you’re going to go out into the world and have a sphere of influence, and how are you going to make positive change in that space? It’s my responsibility to understand how my implicit bias shapes students’ experience.”
I’m excited to empower Mudders to change the culture of CS and to fight for social justice.
– Colleen Lewis, assistant professor of computer science
One way Lewis integrates social justice topics into her computer science curriculum is by incorporating a Bechdel test. Named for American cartoonist Alison Bechdel, the test originated as the idea of one of Bechdel’s comic strip characters, who declares that she only sees a movie if it meets the following requirements: It has to have at least two female characters, the two characters must talk to each other, and the characters’ conversations must be about something other than men. Students working on graph problems use a movie script to build a graph that describes who talks to whom. “They are then able to run statistics on the graph, and one of the things they can test is the Bechdel test,” Lewis explains. “This is a way that they’re still just learning the exact same graph-algorithm content, but it’s embedding these broader ideas.”
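The exercise described above could be sketched roughly as follows. This is a hypothetical illustration, not Lewis’s actual assignment: the function names, the `(speaker, addressee, topic)` script format, and the simplified “topic” check are all assumptions made for the example.

```python
# Hypothetical sketch of the graph exercise: build a conversation graph
# from a movie script, then run a simplified Bechdel test on it.
# The data format and names are assumptions, not the actual assignment.
from collections import defaultdict

def build_conversation_graph(script_lines):
    """Build an undirected graph mapping each character to the set of
    characters they converse with. `script_lines` is a list of
    (speaker, addressee, topic) tuples parsed from a script."""
    graph = defaultdict(set)
    for speaker, addressee, _topic in script_lines:
        graph[speaker].add(addressee)
        graph[addressee].add(speaker)
    return graph

def passes_bechdel(script_lines, female_characters):
    """Return True if at least two female characters talk to each
    other about something other than men (a simplified check)."""
    for speaker, addressee, topic in script_lines:
        if (speaker in female_characters
                and addressee in female_characters
                and topic != "men"):
            return True
    return False

# Tiny illustrative script: (speaker, addressee, topic)
script = [
    ("Alice", "Bob", "weather"),
    ("Alice", "Carol", "work"),
]
graph = build_conversation_graph(script)
print(passes_bechdel(script, {"Alice", "Carol"}))
```

In practice, students could compute other statistics on the same graph (degree counts, connected components), which is how the exercise keeps the standard graph-algorithm content while embedding the broader idea.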
Another way Lewis teaches students to understand implicit bias is in the software engineering class, where students work in teams and evaluate each other. “We talk really explicitly about the ways in which our evaluations of people are not shaped by pure metrics,” Lewis says. “They’re shaped by a perception of the contributions this type of person might make. We’re surrounded by stereotypes, and students need to understand how those stereotypes can shape their evaluations, even though those evaluations might feel unbiased.”
If all this sounds more like behavioral science than computer science, perhaps that’s because considering topics like racial equity and feminism has traditionally been the job of humanities professors, while STEM professors focused on technical content. Lewis believes there is as much opportunity to address these topics through science as there is through art or literature. By understanding the mechanics of implicit bias and showing her students how to do the same, she has contributed to Harvey Mudd’s intentional process of altering the culture of computer science to remove cultural and structural barriers that discourage participation. “I’m excited to empower Mudders to change the culture of CS and to fight for social justice,” she says.