IT Metrics and Money: One Approach to Public Accountability

Research in Brief
The California State University system collects data to measure progress toward technology policy goals and reports regularly to the state legislature

Performance measurement can be a difficult political as well as technical challenge for educational institutions at all levels. Performance-based budgeting can raise the stakes still higher by linking resource allocation to a public "report card."

The 23-campus system of the California State University (CSU) accepted each of these accountability challenges beginning in 1999. CSU agreed to institutionalize a comprehensive data-collection process designed to measure progress toward a series of technology policy goals. Further, the state legislature would receive annual reports on those measures of success. In exchange, the legislature agreed to support the technology infrastructure buildout on each of the campuses. The agreement runs through 2008.

This research brief provides an overview of the process and methodology underlying the Measures of Success (MOS) reports. The tools and approaches described here may apply to other public institutions interested in striking a "negotiated accountability" agreement with the state government in exchange for a predictable base of technology funding.

Funding the technology infrastructure of a campus through traditional means (operational budgets) is often uneven and inadequate. Telecommunications pathways, spaces, and media can and perhaps should be treated the same as other forms of physical infrastructure, such as electrical, water, and sewer systems, and funded through capital investment.

The academic and administrative benefits derived from technology depend on a robust telecommunications infrastructure. Therefore, executive management in the CSU system determined that this infrastructure should be given priority—often above new buildings. Voter-approved bonds provided the funding to build the infrastructure.

Before approving the CSU plans to expend capital dollars on technology infrastructure, the state legislature required assurances that having this utility would produce the benefits identified in the system-wide master plan for information technology known as the Integrated Technology Strategy, or ITS. The 10-year time frame of the reporting requirement allows the CSU to show how, over time, as the infrastructure is extended to a growing number of campuses, there is commensurate improvement in ITS goal attainment.

Background

The first MOS report in November 1999 outlined the framework and metrics for success to be used throughout the 10-year period. The November 2000 MOS study presented baseline data against which progress could be measured in subsequent reports.

Data are presented in four major outcome categories:

  • Excellence in Learning and Teaching: The ITS academic initiatives seek to improve academic quality, increase student access, and contain costs.
  • Quality of the Student Experience: The goal of the student services initiative is to use IT to facilitate interactions with the university (communication, admission, registration, scheduling) for students, potential students, parents, and counselors.
  • Administrative Productivity and Quality: The administrative initiatives are to increase the accessibility and utility of major administrative information systems to students, faculty, and staff while improving the efficiency and quality of administrative services. To achieve this, the Common Management Systems (CMS) initiative aims to have all campuses and the Chancellor's Office use common PeopleSoft applications in full production mode by 2007, supported by a consolidated data center.
  • Personal Productivity: The information technology infrastructure initiatives seek to provide each campus with a baseline capability, sufficient in the quantity and quality of computing and network resources, to enhance the personal productivity of individual students, faculty, and staff.

The MOS data collection and reporting process yields information along four dimensions:

  • Extensiveness: the amount of usage of IT services
  • Effectiveness: the degree to which the ITS objectives are being met
  • Efficiency: the cost of the services provided
  • Quality: the currency and capacity of IT resources and the satisfaction of users

The academic initiatives expand student and faculty access to teaching and learning resources through collaborative acquisition, development, and distribution of technology-mediated instructional materials. Gains in efficiency made possible by the student services initiatives lower institutional costs for processing admission applications while making services to students much more convenient. The administrative initiatives contribute to containing costs over the long term by streamlining and integrating major campus support operations and automating labor-intensive processes. The infrastructure initiatives are the prerequisite for achieving all of the ITS goals. They will provide each CSU campus with a baseline telecommunications capability and personal productivity resources adequate to maintain institutional quality.

These improvements in access, quality, and affordability are significant. The resulting improvements in productivity are offset, to some extent, by the costs of training, technical support, and periodic hardware and software replacements. Any large-scale economic benefits from the use of IT can only be obtained through efficiencies in the core function of the university's instructional programs.

The four outcome categories of the ITS remain unchanged, but the initiatives to achieve them are dynamic. The academic initiatives continue to evolve in scope and influence. They are expanding the types of learning opportunities available to students, increasing access, and providing significant cost savings in many areas. The administrative initiatives represent the largest enterprise resource planning (ERP) project in American higher education, and their implementations are on schedule and on budget. The campus infrastructure buildout initiatives are realizing steady progress in achieving baseline status for the physical plant, workstation hardware and software, networking, and end-user training and support.

The ITS framework and process were intended to respond to new needs and emerging technologies. The CSU is adding new initiatives to the ITS while retiring others, although new additions will not be part of the MOS reporting process.

Methodology

The CSU has conducted a wide range of data collection efforts to support the MOS process. Both institutional surveys and individual surveys of students, faculty, and staff have been administered over the past several years, and more are scheduled through at least 2008. These include the following:

1. Certain types of campus data are mandated by law and are collected, synthesized, and published by system-wide offices. In those instances, it makes little sense to collect the data a second time from the campuses. These include official demographic and quantitative records on students, faculty, staff, space and facilities, course enrollments, administrative budgets, and so forth. Where appropriate, these official databases and reporting sources are used in the MOS studies.
2. Other aggregate statistics are collected at the program or department level to monitor and evaluate major system-wide initiatives. These, too, require a relatively modest, informal series of requests to make them available for reports of this nature.
3. An annual institutional survey was initiated to provide technology-related data for internal CSU use. The survey, coordinated by campus CIOs, addresses every important facet of the ITS—academic, administrative, and infrastructure related. For the most part, the items in the survey call for quantitative data on the amount and use of technology resources. Other items ask campus CIOs for informed judgments about the state of technology on their campuses or the roles of user groups.
Campuses were also asked questions concerning institutional policies and practices pertaining to end-user technical support and training. The increasingly distributed nature of technology resources and services on most campuses makes each of these tasks more difficult. However, there is evidence that the standardized and institutionalized nature of the annual campus technology survey is beginning to improve campuses' ability to provide accurate data over time.
4. In addition to institutional data and broad aggregate indicators, it is necessary to gather individual information about student, faculty, and staff experiences with technology. The first student survey in this series (spring 2001) and the second (spring 2003) both closely mirrored the biennial faculty surveys so that student and faculty perceptions and behavior could be compared directly. The sampling methodology is a stratified (by class level and student ethnicity) random sample of approximately 3,200 students drawn from the entire CSU student population of approximately 400,000. The surveys were conducted via telephone interviews by the Social and Behavioral Research Institute at CSU San Marcos.
5. The faculty surveys to date were administered in fall 2000 and spring 2002. Both explored faculty knowledge about, use of, and satisfaction with the full range of instructional technology resources and services on CSU campuses. Only full-time faculty are surveyed. The stratified random sample includes approximately 3,200 faculty selected according to discipline and rank.
6. Staff telephone surveys are based on a sample of 2,200 staff and administrators randomly selected by job classification according to their proportions in the campus community generally. The four stratification classes used for drawing the sample are managerial, professional, clerical/secretarial, and technical. Staff in service, craft, and maintenance occupations are excluded from the sampling design because most either have little exposure to information technology or cannot be readily contacted by phone. To date, staff surveys have been conducted in the summers of 2000 and 2002.
The telephone surveys for all three user groups typically include about 100 questions and take only 20–30 minutes to administer due to the efficiencies inherent in a computer-assisted telephone interviewing system. Only system data, weighted for campus size, are provided in the MOS reports; no individual campus data are reported.
7. Since 1990, the annual Campus Computing Project has been the largest continuing study of the role of computers and information technology in American higher education. This national study is conducted by mail each summer and fall through surveys sent to (in most cases) the CIOs on U.S. campuses. Each year, all CSU campuses participate in the study as part of the MOS data collection process. CSU contracted with the survey provider for customized data comparing the CSU findings to public four-year institutions nationally. Many of the questions from the survey focus on a wide range of policy issues related to academic computing and instructional technology. The responses, which are used to compare the CSU system to about 180 institutions in the same Carnegie classification, provide campus and system officials with a policy and comparative context within which the MOS metrics can be considered. Information about this project is available at <http://www.campuscomputing.net>.
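The stratified sampling and campus-size weighting described above can be sketched as follows. This is a minimal illustration, not the CSU's actual procedure: the stratum attributes, campus names, enrollment figures, and function names are all hypothetical assumptions.

```python
import random

def stratified_sample(population, strata_keys, total_n, seed=0):
    """Draw a random sample allocated proportionally across strata.

    population  : list of dicts, one per person
    strata_keys : attributes defining the strata (e.g., class level, ethnicity)
    total_n     : desired overall sample size
    """
    rng = random.Random(seed)
    # Group the population into strata by the chosen attributes.
    strata = {}
    for person in population:
        key = tuple(person[k] for k in strata_keys)
        strata.setdefault(key, []).append(person)
    # Allocate the sample in proportion to each stratum's share of
    # the population, then draw randomly within each stratum.
    sample = []
    for members in strata.values():
        n = round(total_n * len(members) / len(population))
        sample.extend(rng.sample(members, min(n, len(members))))
    return sample

def weighted_system_mean(campus_means, campus_sizes):
    """Combine per-campus survey means into a single system-level figure,
    weighting each campus by its enrollment (the MOS reports publish only
    such size-weighted system data, never individual campus results)."""
    total = sum(campus_sizes.values())
    return sum(campus_means[c] * campus_sizes[c] / total for c in campus_means)
```

For example, two hypothetical campuses with mean satisfaction scores of 4.0 and 3.0 and enrollments of 30,000 and 10,000 yield `weighted_system_mean({"A": 4.0, "B": 3.0}, {"A": 30000, "B": 10000})`, a system mean of 3.75 rather than the unweighted 3.5.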

Conclusion

Information technology is a major investment and strategic resource of the CSU. The MOS series documents the pervasiveness and importance of information technology in the CSU. Surveys undertaken in connection with the reports make clear that technology touches every aspect of the university's operations. The data show that in almost all of the reporting categories, technology has had a generally positive influence, sometimes dramatically so.

Although the CSU is a data-rich system in many respects, prior to the MOS the CSU did not have access to this kind of outcomes-based information about technology. The MOS informs planning for and implementation of the ITS by alerting decision makers to what is working and what is not. In that sense, it is a vehicle for organizational feedback and learning, one that has potential for nurturing an institutionalized "culture of evidence" in the policy-making process.

From the state perspective, the MOS is an example of public accountability in higher education. It is a model of negotiated accountability between a state government and the largest four-year higher education institution in the United States.

The scope and depth of the data collection effort require a significant annual investment. The data gathering and reporting activities undertaken to produce the MOS series are expensive in direct dollars and staff time. There are no shortcuts in either the design of the survey instruments or the methods used to implement the research. This attention to detail increases confidence in the validity and reliability of the findings.

Stephen L. Daigle ([email protected]) is Senior Research Associate in the division of Information Technology Services at the California State University's Office of the Chancellor in Long Beach, California.