Data can be incredibly useful. It’s helped us track COVID, create vaccines, and gauge our safety as vaccination rates rise. It can help us choose sustainable foods, predict extreme weather, or analyze our own spending habits. If we use dating apps, it can even help us find a life partner.
But data can also be dangerous. It can be collected without our knowledge or consent. It can be interpreted without our input or context. It can be used to sort us into “types” that then determine what internet search results we see, what social and professional networks we are part of, and whether we qualify for loans or are seen as potential criminals.
As a school, we have wisely used quantitative and qualitative data to re-evaluate APs, rethink academic tracking, and craft homework policies in all three divisions. We have actively sought data in the form of course evaluations and class surveys, giving students a say in the data they provide.
And whether we are comfortable with the terms "data" and "surveillance" or not, each one of us relies on forms of both in our daily work with students. We keep track of attendance, we average grades, and we routinely evaluate body language to figure out if a student is in distress or if our lessons are falling flat.
The past 13 months have invited a steep rise in our use of technology that gathers data and can be used for surveillance. This summer, we’ll adopt a new Learning Management System that offers even more of both. And right now, as we near the year’s end, we are relying on data and forms of surveillance to make judgements that make it into student reports and records, which then serve as their own data sets.
So how do we teachers distinguish productive, student-serving use from harmful abuse?
1. Recognize how much "surveillance" and data collection we teachers already do
In person, we scan for dress-code compliance, phone activity, and behaviors we read through our own cultural lenses as engagement, respect, and commitment. In Zoom, we scan backgrounds, choices to mute or hide video, display names, chat participation, and behaviors we read as engagement, respect, and commitment. In our myriad other technologies such as Academic Manager, Google Classroom, and email, we note timestamps, response time, grammar choices, and email signatures. And our data gathering often comes with quick judgements, whether we notice them or act on them or not.
"We are encouraged to see data as descriptive, not just indicative. And when that happens, a surfeit of data erects a barrier between students, teachers, and administrators. —Sean Michael Morris, UC Denver's School of Education and Human Development
2. Ask who has become visible, hypervisible, or invisible
We want every child to feel seen, valued, and supported. But are there students we over-watch? Kids whose assignments we check first, assuming they won’t be turned in? Are there others we forget about—a lone virtual student or one who doesn’t participate much but turns in thorough work? What dimensions of visibility have we lost in the absence of live sports, robotics, debates, and artistic performances? What dimensions of visibility have we gained by entering students’ homes or family lives? Who bears the paradoxical burden of being both hypervisible and invisible—those who are constantly surveilled and never fully known as a result?
"So naturalized are the Google maps and bar charts generated from spreadsheets that they pass as unquestioned representations of ‘what is.’ This...needs to be subjected to a radical critique to return the humanistic tenets of constructed-ness and interpretation to the fore." —Johanna Drucker, Department of Information Studies, UCLA
3. Use data to generate questions, not answers
Has a student missed the last five drop-box assignments? Is the timestamp on her last email 2:38 a.m.? Such data may prompt us to think that she isn’t committed to academics. But what if the student has taken a job to supplement family income? What if she has given her laptop to younger siblings learning online? What if she is masking deep depression in the wake of political upheaval, the pandemic, and violence against people of color? If we approach the student with genuine inquiry, care, and options, she may choose to share (or not), giving us the opportunity to use our data well and to avoid misinterpretation in our narrative comments.
We can also use student data to ask questions about ourselves. A high percentage of "video-off" Zoomers or collectively low scores on an assessment may prompt us to think about what we can do to engage learners, alter screen-heavy schedules, build trust, or provide greater support.
“What do 'free will' and 'autonomy' mean in a world in which algorithms are tracking, predicting, and persuading us at every turn?” —Ruha Benjamin, African American Studies Professor at Princeton University
4. Recognize that data can provide a (limited) window to the past, not a prediction of the future
The rings in a tree’s trunk can tell us a lot about the tree’s age and the climate, fires, and insect infestations of its past, but they cannot say what next year will bring. Similarly, transcripts, attendance records, and this term’s grades tell only a little about what students have done, not what they will or could do once the pandemic is over or our anti-bias work is more effective or their siblings head back to school or their parents stop fighting.
Got 5 minutes? Before writing reports next month, reflect on our data gathering and surveillance. What assumptions are we making about attendance, assignment timestamps, responsiveness, and body language? In the next few weeks, how can we better understand our students, and therefore better evaluate them and make suggestions for their growth by the year’s end?
Got a whole class or class period? Investigate the “types” you and your students have been assigned by apps like Netflix or Spotify or search engines like Google. What have algorithms guessed about you based on your viewing, listening, or searching habits? What Google searches yield vastly different results based on those types? In other words, how does data gathered about us alter our access to the digital world?
Or, consider adapting this New York Times Learning Network math lesson to help students consider how data can be misinterpreted. Then ask students to share what data we collect at Chapin that they feel may be misinterpreted. Or, share this article about requirements to have Zoom cameras on, inviting students to weigh in both on the article itself and on other forms of inadvertent surveillance at Chapin or in their lives.
Got a whole unit? Investigate the role of data and surveillance in your field. How has it been used in history, literature, science, the arts, physical education, or language instruction? What is assumed about it, what limits are or aren't acknowledged, and what are some of the cautionary tales? (Eugenics, McCarthyism, and accent reduction programs spring to mind.) What do tools like Grammarly suggest about what it means to speak powerfully, and which powerful speakers and writers prove the reverse? What is on the minds of experts in your field today when it comes to data and surveillance? How can our students intervene in their fields when it comes to our hunger for quick data analysis and visualization?
Want more?
In under 9 minutes, you can revisit MIT poet-programmer Joy Buolamwini's TED Talk (or watch the 90-minute documentary, Coded Bias, on PBS). Her suggestion to form more diverse teams to fight surveillance bias may lead us to seek diverse collegial teams to review student work or behavior.
In under 20 minutes you can read “The Surveilled Student” to learn about higher ed’s grappling with some of these same issues.
In under 20 minutes you can also play “spot the data (mis)assumption” by watching this TED Talk that purports to use data to fix global education. What terms does the speaker, Andreas Schleicher, use that may be problematic? How do his data visualizations increase or distort a viewer’s understanding of global education statistics? Where do we fall prey to the same generalizations at Chapin?