
Decoding Discrimination

As we rely more and more on digital material and virtual interactions, how can we help our students (and ourselves) be more critical of the ways technology replicates racism and other forms of oppression? How can we empower our students to intervene?

"...technology is never neutral..."
Edmond Y. Chang, Drew University Department of English

Quick Google image searches reveal the bias of programmers. Results for terms such as "marriage," "smiling girl," "businessman," or "journalist" expose how coders' norms are built into the algorithms we rely on every day. That same bias runs through nearly all technology, as programmers borrow code from one another. So when black skin isn't picked up by an automatic soap dispenser's optics, or when Siri and Alexa, our digital assistants, sound like compliant white women, it isn't terribly surprising. (Meet the voice behind Siri: Susan Bennett.)

"I argue that tech fixes often hide, speed up, and even deepen discrimination, while appearing to be neutral or benevolent when compared to the racism of a previous era."
Ruha Benjamin, Princeton University Department of African American Studies

Got 5 minutes?

Foster critical thinking about the bias built into the sources we rely on, from Google's algorithms to institutions that confer elite status, such as the New York Times, the Pulitzer and Nobel Prizes, or the Metropolitan Museum of Art. In class, ask students to run a Google image search for "mathematician," "historian," "writer," or "geneticist." What does Google think your field looks like? Which museums return results for search terms such as LGBTQ or gay? Which technologies accept accent marks or non-Western characters? How does tech include or exclude races, genders, sexual orientations, and languages? Even a five-minute nod to how our materials are filtered through the human bias built into digital algorithms can help students become more critical of what lies behind the black boxes of code and institutions they rely on every day. (And the next time we're looking to add an image to our course pages or slideshows, we can search more deliberately for diverse representations as well.)


Got a class period?

Investigate Wikipedia's claims of neutral language by reading an entry about something your students have just studied. How does the semblance of objectivity reveal bias or obfuscate important points? In entries on topics such as "eugenics" or "Jamestown," students can consider passive language, the omission or inclusion of certain details, or the selection of sources referenced to understand how subjective all human writing is. Even the choice of images can be illuminating.

For example, of the 22 images featured in the entry on "writer," only 4 focus on women: Anne Frank's signature (under "diarist"), a photo of an unidentified Japanese woman writing hiragana (under "letter writer"), the signature of Mariana Alcoforado, rumored author of Letters of a Portuguese Nun (under "pen names"), and Michelangelo's masculine Cumaean Sibyl (under "fictional writers"). The others spotlight men, mostly heavy hitters of the Western canon: Shakespeare, Blake, Luther, Flaubert, even the anti-Semitic Wagner as a librettist.


Got a whole unit to spare?

Dig deeper and empower intervention. Students can do both by investigating projects that work to undo the harm done by biased algorithms and practices. Mapping Inequality (offering a glimpse into the federal government's redlining practices of the 1930s) and WikiProject Women Scientists (an initiative to encourage more coverage of female scientists on Wikipedia) can provide models for how the scholarship we do in class can work against bias. Students can then intervene in algorithmic bias themselves through assignments such as curating collections, displays, or installations of new "search results" to counter Google's; thoughtfully and collaboratively editing Wikipedia (there is no age requirement) to represent the underrepresented; creating a class-annotated bibliography of social media, tech tools, and websites that accept accented characters or non-Western languages; or crafting online exhibits to counter misleading narratives in your field or current unit of study.


Got an idea or resource to share?

Add it in the comments below, please!


Interested in more?

In less than 10 minutes, you can see biased algorithms at work.


Or, if you've got more time to invest, read Ruha Benjamin's Race After Technology or watch her 2016 ISTE keynote address below. She urges us to "incubate a better world in the minds and hearts of our students" by "experimenting with technologies of love, of reciprocity, and of justice."


