A personal reflection on big data ethics in education
By Dave Goodrich, Instructional Designer
I’ve been wrestling with questions related to big data ethics recently:
- How is student data being used in education that could potentially impact students in negative ways?
- In what ways are student data being used insensitively, negligently, or without awareness of unintended consequences?
- In what ways could predictive learning analytics reinforce “stereotype threat” (Steele & Aronson, 1995) or the “Pygmalion Effect” (Rosenthal, 1973), both of which can serve as self-fulfilling prophecies?
This brief post has a few purposes. First, I want to take a moment to reflect out loud. Second, I want to share some of these questions I’ve been wrestling with and invite you to engage in conversation. Lastly, I have some thoughts I’d like to share regarding how we could potentially move forward on this important topic.
Allow me to take a moment to begin by reflecting.
Big Data? Big Questions for Educational Institutions
We were lucky enough to have Dr. Chris Gilliard (@hypervisible) come to the Hub a few weeks ago and raise some of his own concerns on similar questions. He came here as a guest speaker for a faculty learning community on the topic of learning analytics. I first learned about Chris’ work on these topics when listening to a featured lightning talk he gave, “Redlining to Digital Redlining,” at the OLC Innovate conference in New Orleans a few years back. It was one of the most challenging and thought-provoking talks I experienced that week. Gilliard continues to help me think through these things in person, in his writing on his blog, in Educause, and elsewhere.
There are many conversations like these happening and a growing body of scholarly literature on the topic. It is worth paying attention to the lessons being learned by a vast community of thoughtful educators who share these concerns. Take, for instance, the student experiences at Purdue reported on in 2014 by Anya Kamenetz on NPR. There are also a plethora of recommended principles and practices, such as the ones shared in “Ten Simple Rules for Responsible Big Data Research” (Zook, et al. 2017).
I’m just not so sure that simple is the most accurate word choice.
Personal Experiences and Questions
Anyone who knows me or has interacted with me is aware of my open approach to sharing learning and growth as a professional and as a person who is far from perfect. If you haven’t, allow me to share some insights from my own practice. To the joy of some in my social network, and to the chagrin of others, I haven’t been shy about, at times, baring my soul. This is in part because of a fundamental belief about learning: it happens best, for me, when it happens out in the open. I realize that there are people whom this frightens, or who take a far more careful approach to what they do and do not share. Over the years, I’ve been somewhat perplexed by this posture. I’ve debated back and forth about the ills of “big brother” or corporate greed coming back to haunt us one day. Sometimes, I’ve ironically had these debates over Gmail exchanges.
At times, I’ve taken the approach of brushing fears off as paranoia. I’d ask: if you don’t have anything to hide, then why should you be overly cautious about what data you share with the world? I’ve also argued that the unintended use of our data is inevitable, as in the recent Experian scandal. Why not just share more data, as Hasan Elahi did after being accidentally added to an FBI watchlist?
But in 2014, my viewpoints were challenged in part by watching a TED talk by Glenn Greenwald about why privacy matters. I still wrestle with these questions, especially in light of the devastating loss of Aaron Swartz and the controversial actions of Edward Snowden in recent years. In an ideal world, I would wish the following basic principles were upheld by all institutions.
My Recommended Guiding Principles for Data Governance
- It should be clear and transparent what is being done with someone’s data.
- People should ultimately have the final say on what data is shared, whom it is shared with, and what can be done with it.
If you agree with these principles, how will we realize a future where these postures toward data are upheld? If you disagree with these recommendations, how would you modify them? How do you go about being thoughtful in your use of data, either your own or the data of others you might have access to? Finally, some institutions are emerging as leaders in these practices, growing to recognize the importance of respecting people’s rights to privacy. Which institutions do you see doing a good job of responsible data stewardship?
Rosenthal, R. (1973). The Pygmalion Effect lives. Psychology Today.
Steele, C. M., & Aronson, J. (1995). Stereotype threat and the intellectual test performance of African Americans. Journal of Personality and Social Psychology, 69(5), 797-811.
Zook, M., Barocas, S., Crawford, K., Keller, E., Gangadharan, S. P., Goodman, A., … & Nelson, A. (2017). Ten simple rules for responsible big data research. PLoS Computational Biology, 13(3).