Understanding and Managing the Risks of Analytics

Policy Matters

Randall J. Stiles is Special Advisor to the President for Business Analytics at Colorado College. Jill Tiefenthaler is President of Colorado College and Professor of Economics.

Two recently published books—Why Does College Cost So Much? and The Innovative University: Changing the DNA of Higher Education from the Inside Out—describe some key factors affecting decision-making in higher education today: (1) increases in the value of higher education to economic competitiveness and upward mobility; (2) sustained increases in the cost of higher education; (3) growing scrutiny and accountability (by accrediting agencies and the government); and (4) the transformative "disruptive innovation" forces of online learning and outcomes-based assessment.1 In response, many higher education leaders are exploring the relatively new waters of "academic analytics" and "learning analytics." These two terms correspond to decisions related to the business of the college or university and to teaching and learning, respectively. Making better, data-informed decisions, improving performance, and becoming less reliant on "gut instinct" regarding critical issues facing the institution or the quality of instruction are all worthy pursuits. However, investing institutional resources in analytics is not without risks, so a fair-minded analytical thinker should consider how best to manage those risks. Because of the close connections between the practice of risk management and matters of governance and compliance, institutions increasingly do this work within a governance/risk-management/compliance (GRC) framework.

In Analytics at Work: Smarter Decisions, Better Results, the authors make a case for analytics, but they add the following cautions:

  • There are some instances when the use of analytics doesn't apply.
  • There are times when the use of analytics is not practical.
  • There are times when decisions informed by analytics need scrutiny.
  • Ultimately, even when the use of analytics does apply, the best decisions will be made by those who "combine the science of quantitative analysis with the art of sound reasoning."2

With these cautions in mind, we will highlight five risk areas that leaders might address in an analytics initiative: (1) data and information quality, (2) data and information compliance, (3) data and information governance (including cases where third-party services are involved), (4) inappropriate or premature use of analytics, and (5) countercultural impact (pushing too hard and too fast with analytics initiatives).

Data and Information Quality Risk

"You can't be analytical without data, and you can't be really good at analytics without really good data."3 Decision-makers need data and information (meaningful patterns of data) that communicate and promote an understanding of the complex. Stephen Few and Edward Tufte provide very helpful ideas about the art and science of data visualization, including how to identify patterns and how to make meaning from data.4 Data and information quality risk can be mitigated by identifying data stewards and giving them responsibility for (1) developing an inventory of institutional data and information, (2) ensuring that there are clear definitions and quality standards for all data and information, and (3) establishing and exercising a data and information quality review and improvement process, targeting those data and information elements that matter most.

Data and Information Compliance Risk

Compliance means conforming to the requirements of an authorized and recognized external agent—usually one associated with a law (state, federal, or international) or contract. Failure to comply can lead to an adverse result such as financial penalty, additional work, or even personal liability and imprisonment for institutional officers. Even though the data and information privacy and security compliance requirements of federal laws such as the Family Educational Rights and Privacy Act (FERPA), the Gramm-Leach-Bliley Act (GLB), and the Health Insurance Portability and Accountability Act (HIPAA) are complex and sometimes confusing, investing in compliance work is likely to reduce the risks of analytics because those investments will increase data and information awareness, quality, and protection. Those in governance roles must decide how to allocate resources for compliance to achieve an acceptable level of risk.

Data and Information Governance Risk

Implicit in the comments about compliance is the notion of control—a governance issue. To ensure data and information privacy, security, quality, and auditability, data and information must be carefully controlled. Governance is primarily manifested in written policy documents. It is through risk-management principles and processes that appropriate levels of control can be realized. The increasing use of cloud services and software-as-a-service in higher education is generating new governance challenges. Ultimately, the data-owning organization cannot abdicate responsibility for data protection. Also, the data and information envisioned for use in learning analytics present new ethical questions for faculty and staff: Should students be able to opt out of having their learning analytics data collected? What is the recourse for any individual who has had his or her data misused or inappropriately shared? Who owns the data mined from a learning process?
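
If an institution's governance process does decide to offer an opt-out, that decision has to be enforced wherever learning data flow into analysis. The sketch below is a purely hypothetical Python illustration of such a control point: events for opted-out students are dropped, and fields not needed for analysis are stripped, before anything reaches an analytics tool. The consent registry, record fields, and filtering logic are assumptions for illustration, not a description of any existing system.

```python
# Hypothetical sketch: honoring student opt-out decisions before learning-analytics
# data are analyzed or shared. The consent model and record fields are illustrative.

from datetime import datetime, timezone

# Registry of students who have opted out of learning-analytics data collection
# (in practice this would live in a governed system of record, not a set literal).
OPTED_OUT: set[str] = {"900000002"}

def consented_events(events: list[dict]) -> list[dict]:
    """Drop events for opted-out students and strip fields not needed for analysis."""
    released = []
    for event in events:
        if event["student_id"] in OPTED_OUT:
            continue  # governance decision: no analytics use without consent
        released.append({
            "student_id": event["student_id"],
            "course": event["course"],
            "action": event["action"],
            "timestamp": event["timestamp"],
        })
    return released

if __name__ == "__main__":
    raw_events = [
        {"student_id": "900000001", "course": "EC201", "action": "viewed_syllabus",
         "timestamp": datetime(2012, 7, 1, tzinfo=timezone.utc), "ip_address": "10.0.0.5"},
        {"student_id": "900000002", "course": "EC201", "action": "posted_forum_reply",
         "timestamp": datetime(2012, 7, 1, tzinfo=timezone.utc), "ip_address": "10.0.0.9"},
    ]
    print(consented_events(raw_events))  # only the consenting student's event remains
```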

Inappropriate or Premature Use of Analytics Risk

Sometimes the tools and methods of analytics are not practical, and decisions informed by analytics need scrutiny. For example, analytics will not likely be helpful for decision-makers (1) when there is no time for gathering, processing, and interpreting data, (2) when there is no history or precedent related to the decision or when historical data may be misleading, (3) when the decision-makers have legitimate expert experience and intuition, and (4) when key variables can't be measured or have very high degrees of uncertainty.5

A second caution under this heading relates to the difficulty of measurement itself. For example, after more than a decade of work on assessment, we know that although measuring the quality of learning and teaching is important, it is also quite complex. Research productivity, by contrast, has more readily quantifiable metrics. Just because something can be measured easily does not mean that it is more important or should get more attention than something that is difficult to measure.

Finally, an institution may not be ready for effective analytics work. A recent Harvard Business Review article describes four problems that prevent organizations from realizing better returns on their investments in "big data" and analytics:

  • Analytics skills are concentrated in too few employees.
  • IT needs to spend more time on the "I" and less on the "T."
  • Reliable information exists, but it's hard to locate.
  • Business executives don't manage information as well as they manage talent, capital, and brand.6

Countercultural Impact Risk

In a November 2011 study of analytics work, the MIT Sloan Management Review and the IBM Institute for Business Value highlighted the importance of a "data-oriented culture: a pattern of behaviors and practices by a group of people who share a belief that having, understanding and using certain kinds of data and information plays a critical role in the success of their organization."7 Imposing analytics initiatives on an organizational culture that is not data-oriented can pose a significant risk to leaders. The authors of the Harvard Business Review article provided additional insights regarding this issue when they surveyed and evaluated 5,000 employees in 22 global companies based on the employees' decision-making style: "unquestioning empiricists," who trust analysis over judgment; "visceral decision makers," who rely exclusively on intuition; and "informed skeptics," who "effectively balance judgment and analysis, possess strong analytics skills, and listen to others' opinions but are willing to dissent." The informed skeptics are "best equipped to make good decisions," yet just 38 percent of employees and 50 percent of senior managers fell into this category.8 The implication for higher education leaders is that plans for analytics initiatives should include an assessment of the organizational decision-making style and the degree to which organizational culture is data-oriented.

Conclusion

Under the right circumstances, decision-making can be enhanced by the tools and techniques of analytics. Large data sets, analytics engines, and new data-visualization techniques have considerable potential to enhance both student learning and institutional business intelligence. Clearly, besides the risks outlined above, there is also a risk involved in saying "no" or "not now" to analytics work. In doing so, an institution can fall behind its peers and might miss the opportunity to make better decisions and get better results. Our advice is to be mindful not only of the risks of investing in analytics but also of the risks of missing the benefits that analytics has to offer higher education institutions.


Notes
  1. Robert B. Archibald and David H. Feldman, Why Does College Cost So Much? (New York: Oxford University Press, 2011); Clayton M. Christensen and Henry J. Eyring, The Innovative University: Changing the DNA of Higher Education from the Inside Out (San Francisco: Jossey-Bass, 2011).
  2. Thomas H. Davenport, Jeanne G. Harris, and Robert Morison, Analytics at Work: Smarter Decisions, Better Results (Boston: Harvard Business School Press, 2010), p. 15.
  3. Ibid., p. 23.
  4. Stephen Few, Now You See It: Simple Visualization Techniques for Quantitative Analysis (Oakland, Calif.: Analytics Press, 2009); Edward R. Tufte, The Visual Display of Quantitative Information, 2d ed. (Cheshire, Conn.: Graphics Press, 2001).
  5. Davenport, Harris, and Morison, Analytics at Work.
  6. Shvetank Shah, Andrew Horne, and Jaime Capellá, "Good Data Won't Guarantee Good Decisions," Harvard Business Review, April 2012.
  7. David Kiron, Rebecca Shockley, Nina Kruschwitz, Glenn Finch, and Michael Haydock, "Analytics: The Widening Divide," MIT Sloan Management Review, research report (November 7, 2011), p. 11.
  8. Shah, Horne, and Capellá, "Good Data Won't Guarantee Good Decisions."

EDUCAUSE Review, vol. 47, no. 4 (July/August 2012)