Ethical Discourse: Guiding the Future of Learning Analytics


Learning analytics holds increasing potential for student agency and autonomy, highlighting a need for ethical discourse at all levels of higher education institutions. Topics central to this dialogue include student awareness of analytics, the future of algorithms and learning analytics, and the redefinition of failure.


James E. Willis III is an educational assessment specialist and Matthew D. Pistilli is a research scientist at Purdue University.

In a previous article published by EDUCAUSE Review Online, "Ethics, Big Data, and Analytics: A Model for Application," we, along with John P. Campbell, argued for a Potter Box framework to address pressing ethical concerns in learning analytics.1 As a follow-up to the central concerns raised in that article, we delivered an interactive presentation, "Ethics and Analytics: The Limits of Knowledge and a Horizon of Opportunity,"2 at the EDUCAUSE 2013 Annual Conference. That presentation was guided by four major questions:

  1. What projects are under way in learning analytics today? What potential ethical pitfalls surround these developments? Why are they potentially harmful? Are these practices always wrong, or only wrong in certain contexts?
  2. What is the role of "knowing" a predictive analytic? Once something is known, what are the ethical ramifications of action or inaction? What are the roles of student autonomy, information confidentiality, and predictive modeling in the ethical development of new systems, software, and analytics?
  3. How might we affect the development of future analytics systems by having ethical discussions? What are the possible inventions and innovations that could come from these discussions?
  4. What are the frontiers of ethical discussions and technology? Are there new frameworks? How can today's discussions affect the future?

What emerged was a series of concerns that helped shape our thesis: that ethical dialogue is needed to prepare for the future of learning analytics,3 particularly as its computing power becomes more sophisticated and ubiquitous. Here, we discuss session participants' ideas, which are helping shape the future of learning analytics research and innovation. Among the questions raised in our conference session were: What do the ethical models look like? How are these models deployed rapidly, at the speed of technology? How are they refined over time?

We distilled the group discussions into a series of topics: student awareness (or lack of awareness) of analytics; the future of algorithmic science; the future of learning analytics as defined by business practices and by student and faculty access to data; and a redefinition of failure.

The arguments put forward here often take the form of rhetorical questions; we present them this way to frame how ethical questioning might guide future developments.

Student Awareness of Analytics

Today's college students are digitally connected and often own multiple Internet-ready devices.4 From identifiable Internet histories to semantic analysis to databases full of demographic information, today's students also stand at the intersection of powerful regression analyses that can predict success with startling accuracy5 and the tutelage of administrators under political pressure to secure high retention and graduation rates amid declining budgets. Are students even aware of what is known about them, whether "known" means disparate pieces of data or powerful analytic systems poised to offer them help? Better yet, should students have input regarding what data is stored and how it is used? As a means to an end, is it unethical for administrators to do whatever possible to help ensure student success, even if it means stretching the meaning of privacy?

Broadly speaking, student information is still kept private; at the granular level, it is kept confidential. The Family Educational Rights and Privacy Act (FERPA) allows for the use of data on a need-to-know basis and defines which parties are privy to the information and for what purposes. Perhaps the tension here is that students pay the institution for an education, and the institution in return provides that education. Is it unethical for an institution not to readily offer support when it can identify students who might benefit from various resources?

Disclosures of what data are stored and how they are used have become troublesome in technology generally, not just in higher education. The legal jargon employed to protect software firms and data brokers is nearly incomprehensible at best and potentially litigious at worst. A question raised during the EDUCAUSE 2013 presentation, and in many discussions since, is the extent to which students should be notified of what is going on with their data, in a way that echoes the legal statements requiring compliance. Furthermore, some institutions remain unprepared to manage data or analytics systems. Perhaps the fear is that administrative bodies will bury their disclosures to avoid explaining to students the fate of their data.

The ethical dimensions of selecting data points, whether about students in aggregate or about specific students, should come under the scrutiny of disclosure, due diligence, and good faith. That is, students should be made aware, in plain language, of how their data is being used, how it is being protected, and what the possible outcomes might be.6 It also means that, despite the appealing possibility of original research in learning analytics and the pressures on budget-conscious administrators, the question of what should be done, as opposed to what can be done, ought to serve as a standing caution at every stage of development.

The Future of Algorithms

One of the effects of computing so-called big data is the rapid growth in the scale and complexity of statistical modeling, including the regression analysis often employed in learning analytics.7 Will future algorithms, refined with increasingly sophisticated data, lead to a far more powerful way of predicting student success? What approaches, unforeseen at this point, might replace regression models? The emergence of new data patterns might well create entirely new methods. The question of what this data really tells us will remain. Will the data affect student motivation in any quantifiable way? Can statistical models lead to better outcomes with student motivation? What measures can be developed to assess the changes effected through the use of predictive algorithms?
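
To ground these questions, consider a minimal sketch, in Python, of the kind of regression model commonly used to predict student success. The feature names, data, and student record below are invented for illustration; this is not any institution's actual model.

    # A toy logistic regression predicting student success.
    # Features and data are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical features: [LMS logins per week, running quiz average]
    X = np.array([[1, 55], [2, 60], [4, 70], [8, 85], [10, 90], [12, 95]])
    y = np.array([0, 0, 0, 1, 1, 1])  # 1 = completed the course successfully

    model = LogisticRegression()
    model.fit(X, y)

    # Predicted probability of success for a new, hypothetical student
    prob = model.predict_proba([[5, 72]])[0][1]
    print(f"Predicted probability of success: {prob:.2f}")

Everything such a model "knows" about a student reduces to a probability; the ethical questions above concern what, if anything, ought to be done with that number.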

A discussion perpetually arising at the intersection of student motivation and learning analytics is that of the self-fulfilling prophecy.8 That is, when students receive feedback indicating that they need to improve their performance to achieve a satisfactory outcome, they might conclude that the effort is already fruitless and give up on the course rather than act constructively to better their situation. Instead of being a constructive tool, feedback becomes a prophet of failure. Will future, highly sophisticated and well-validated algorithms only intensify this inversion of the intended outcome? Administrators might wrongly assume that any and all feedback is beneficial to the student; if this assumption is, indeed, false, then who decides what feedback is valid and how often it should be delivered?
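
The judgments embedded in such feedback are easy to overlook because they hide inside code. As a purely hypothetical sketch, not Purdue's Course Signals or any other specific product, consider how a predicted risk score might be mapped to a traffic-light indicator; the cutoff values are arbitrary assumptions, and choosing them, along with how often the signal is shown, is exactly the kind of decision the questions above raise.

    def feedback_signal(risk_score: float) -> str:
        """Map a model's predicted risk (0.0 to 1.0) to a traffic-light signal.

        The cutoffs 0.3 and 0.7 are arbitrary; setting them is an ethical
        decision as much as a statistical one.
        """
        if risk_score < 0.3:
            return "green"   # on track; no intervention
        elif risk_score < 0.7:
            return "yellow"  # gentle nudge toward campus resources
        else:
            return "red"     # strong alert; risks the self-fulfilling prophecy

    print(feedback_signal(0.82))  # prints "red"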

Predictive analytics have a dark side, too. If we know that students truly stand at the threshold of failure, are we ethically responsible for preventing it? As measurements of lifestyle choices, such as how often students go to the gym, seek out tutoring, or play video games, become more pervasive and quantifiable, the question of failure prediction might sharpen acutely. At what point are college students to be treated as independent agents who are, as adults, responsible for their own successes and failures? More importantly, who gets to decide where that threshold is? The question of in loco parentis looms large here because, although these questions are answerable, it is unclear who has the authority to provide guidance. In other words, who determines what constitutes a successful outcome in a student's career?

The Future of Learning Analytics

From its inception, learning analytics has taken its cues from business intelligence and analytics systems.9 The journey of a struggling student who succeeds with feedback from a faculty member, facilitated by an analytics system, could be cast in the same light as the journey from potential customer to repeat patron thanks to targeted marketing and coupons. The question here, though, is how much learning analytics should continue to take its cues from the business world. Have the two kinds of systems now diverged, or do they share potentially symbiotic developments? Is it appropriate to glean what is possible from business analytics, especially as increasing amounts of data become available for research into human motivations? If the purpose is to help struggling students, does that justify using algorithms like those typically aimed at customers?

Perhaps central to questions about using business analytics is how educators view data. Is there an inherent difference in how a business analyst and an educational analyst view data sets? It is probably fair to extrapolate a key difference: portability. Whereas business intelligence is often held in the strictest confidence, educational institutions have great potential for sharing de-identified, aggregate data to further analytics development. Even more difficult, yet potentially valuable, is the possibility of individualized information that might become portable and follow students from a young age through their advanced education; this might well help us identify individual metrics, such as motivators for success.
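
What sharing de-identified, aggregate data might look like in practice can be sketched briefly. In this hypothetical example, student identifiers are replaced with salted one-way hashes and records are aggregated by course before leaving the institution; the column names, salt, and hashing scheme are illustrative assumptions, not a complete de-identification protocol.

    # A toy illustration of de-identifying and aggregating student records
    # before sharing them. Column names and the salt are hypothetical.
    import hashlib
    import pandas as pd

    records = pd.DataFrame({
        "student_id": ["a17", "b22", "c03", "d41"],
        "course": ["BIO101", "BIO101", "CHEM110", "CHEM110"],
        "final_grade": [3.0, 2.3, 3.7, 2.0],
    })

    SALT = "institution-secret"  # assumed per-institution secret

    def pseudonymize(student_id: str) -> str:
        """Replace a student ID with a salted one-way hash."""
        return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

    records["student_id"] = records["student_id"].map(pseudonymize)

    # Aggregate before sharing so that no row is tied to a single student
    shared = records.groupby("course")["final_grade"].agg(["mean", "count"])
    print(shared)

Even then, pseudonymization and aggregation reduce, but do not eliminate, the risk of re-identification, which is why the portability question remains as much an ethical one as a technical one.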

Redefining Failure

Higher education's proliferation in the past few decades has led commentators to question a college degree's value.10 Arguably, learning analytics provides, with increasing breadth and depth, a science for measuring and implementing student success initiatives.11 The ethical question, then, becomes one of failure in relation to paternalism: what constitutes failure, a poor quiz grade or a low GPA? Who determines what failure is, a professor or an administrator? How long should a school allow students to fail before it intervenes? And, if schools intervene once failure has been identified, how much help is offered, and by whom?

There is much to be learned from failure. Few people have achieved a college degree without some modicum of failure; with a troubling grade or a lack of understanding comes additional effort to compensate for the difference. Student success has been at the forefront of learning analytics development, but it may soon emerge that some of the best data sources come from student failure. Far from hampering future development, encouraging students to learn from failure and to build resilience could provide potent motivators for analytics systems.

Conclusion

The ethical questions involved in developing and implementing learning analytics are multifaceted and difficult, and they involve numerous stakeholders in the educational process. Although learning analytics might provide a pathway to helping students efficiently, it also involves critical decisions with far-reaching consequences. The most pertinent questions focus on the topics discussed here; from these, it is possible to generate further questions that help us determine what really helps students and how we might efficiently implement interventions.

Acknowledgments

We thank the following conference contributors, whose ideas informed our work here: Jean-Paul Behrnes, Jeffrey Belliston, Shaun Boyd, Aaron Coburn, Jody Couch, Zach Heath, Katie Himmelrick-Bruce, Amy Irvin, Shiro Kashimura, Sander Latour, Hans Peter L'Orange, Karen Meelker, Jack Neill, Joyce Nijkamp, Phil O'Hara, Paul Schantz, Tyler Schlagel, Marianne Schroeder, Mike Sharkey, Randy Stiles, John Tong, and James Williamson.

Notes
  1. James E. Willis III, Matthew D. Pistilli, and John P. Campbell, "Ethics, Big Data, and Analytics: A Model for Application," EDUCAUSE Review Online, 6 May 2013.
  2. James E. Willis III and Matthew D. Pistilli, "Ethics and Analytics: The Limits of Knowledge and a Horizon of Opportunity," presentation, 2013 EDUCAUSE Annual Conference, 17 October 2013.
  3. The outcomes from this "flipped" presentation included a series of group notes that grapple with these questions. Although not everything discussed is included in this article, it is worth expanding on some of the key ideas with the express intent of continuing the dialogue on the role of ethics within learning analytics.
  4. Eszter Hargittai, "Digital Na(t)ives? Variation in Internet Skills and Uses among Members of the 'Net Generation,'" Sociological Inquiry, vol. 80, no. 1 (2010), pp. 92–113; and Sue Bennett, Karl Maton, and Lisa Kervin, "The 'Digital Natives' Debate: A Critical Review of the Evidence," British Journal of Educational Technology, vol. 39, no. 5 (2008).
  5. Alfred Essa and Hanan Ayad, "Student Success System: Risk Analytics and Data Visualization Using Ensembles of Predictive Models," Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, ACM, 2012, pp. 158–161.
  6. Fred Stutzman, Robert Capra, and Jamila Thompson, "Factors Mediating Disclosure in Social Network Sites," Computers in Human Behavior, vol. 27, no. 1 (2011), pp. 590–598.
  7. Vicente-Arturo Romero-Zaldivar, Abelardo Pardo, Daniel Burgos, and Carlos Delgado Kloos, "Monitoring Student Progress Using Virtual Appliances: A Case Study," Computers & Education, vol. 58, no. 4 (2012), pp. 1058–1067.
  8. Beth Dietz-Uhler and Janet Hurn, "Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective," Journal of Interactive Online Learning, vol. 12, no. 1 (2013); and Barton Keller Pursel, "Learning Analytics: Tread Carefully" [http://www.psu.edu/dept/site/2012/04/learning-analytics-tread-carefully.html], Schreyer Institute for Teaching Excellence: Thoughts on Teaching and Learning at Penn State, 10 April 2012.
  9. George Siemens and Phil Long, "Penetrating the Fog: Analytics in Learning and Education," EDUCAUSE Review, vol. 46, no. 5 (2011), pp. 30–32.
  10. James Rosenbaum, Kennan Cepa, and Janet Rosenbaum, "Beyond the One-Size-Fits-All College Degree," Contexts, vol. 12, no. 1 (2013), pp. 48–52; and Min Zhan and Michael Sherraden, "Assets and Liabilities, Educational Expectations, and Children's College Degree Attainment," Children and Youth Services Review, vol. 33, no. 6 (2011), pp. 846–854.
  11. Kimberly Arnold, "Signals: Applying Academic Analytics," EDUCAUSE Quarterly, vol. 33, no. 1 (2010); and Paul Baepler and Cynthia James Murdoch, "Academic Analytics and Data Mining in Higher Education," International Journal for the Scholarship of Teaching and Learning, vol. 4, no. 2 (2010).