This paper is the intellectual property of the author(s). It was presented at EDUCAUSE '99, an EDUCAUSE conference, and is part of that conference's online proceedings. See http://www.educause.edu/copyright.html for additional copyright information.


Super-Partnerships: Computational Science Curricula, High Performance Computing and the Professional Organizations

Kris Stewart
Education Center on Computational Science and Engineering, San Diego State University, San Diego, CA

Roscoe Giles
Center for Computational Science, Boston University, Boston, MA

Ilya Zaslavsky
Education Center on Computational Science and Engineering and San Diego Supercomputer Center, San Diego, CA

Since October 1997, NSF has supported two national supercomputing partnerships, led by the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign and the San Diego Supercomputer Center (SDSC) at the University of California, San Diego. The goal of this program is to create and maintain a national metacomputing environment by supporting leading-edge technology and applications research and by promoting the human, technological, and administrative infrastructure for ubiquitous computing. The two partnerships, the National Computational Science Alliance [1] and the National Partnership for Advanced Computational Infrastructure [2], unite over a hundred leading research universities and national labs around shared use of high-performance computing resources, advanced compute-intensive research, and a variety of education and outreach efforts. These efforts focus, broadly speaking, on promoting computational science curricula in K-12, undergraduate, and graduate schools and in informal learning communities, on shortening the distance between research labs and classrooms, and on developing learning tools that draw on the best modern scientific accomplishments.

With the dramatic changes in computing, the need for dynamic and flexible computational science curricula becomes ever more obvious. Computational science has emerged at the intersection of computer science, applied mathematics, and the science disciplines (see Fig. 1) as a third component of science, in addition to theoretical investigation and experimentation. Mastery of computational science tools, such as 3D visualization, computer simulation, efficient handling of large data sets, and the ability to access a variety of distributed resources and collaborate with other experts over the Internet, is now expected of university graduates, not only computer science majors. However, the existing infrastructure on university campuses (human, technological, administrative) is often inadequate for the task. In [3] we described at least ten obstacles to a wider acceptance of computational science in undergraduate education; most of them are not technology related. As several faculty surveys show, the main challenges include the lack of a reward system encouraging time-consuming curriculum innovation, low faculty and student awareness of available computational science tools, and the absence of a support network. The joint EOT-PACI [4] activities of the two partnerships have focused on these challenges. Case studies of successful projects, presented by the speakers in this panel, show that a comprehensive approach is needed to promote computational science in university curricula. Computational science curriculum workshops for undergraduate faculty (offered in Boston and San Diego), the SDSU/NPACI Faculty Fellows program, development and use of on-line "workbenches" (Biology Workbench, Sociology Workbench, etc.) that mobilize domain-specific Web resources for both research and teaching, participation in important campus-wide networking and security initiatives, experimentation with advanced technologies in the classroom: these and similar focused efforts, when propagated throughout the two partnerships, appear to have impact beyond the individual teaching experience and to scale to the national level.

The emerging infrastructure for computational science education will succeed only if it builds on the established, intertwined professional networks and societies of both the education and computer technology fields. EDUCAUSE is a prime example of an organization that can support computational science curriculum development by sharing expertise, providing an environment for disseminating new technologies and approaches, and offering a forum for vendors, educators, and researchers. Interdisciplinary exchange of ideas, methodologies, and experiences under the EDUCAUSE framework is an important component in establishing a scalable, nation-wide, comprehensive infrastructure to support computational science education.

[1] NCSA: http://www.ncsa.uiuc.edu

[2] NPACI: http://www.npaci.edu

[3] Stewart, K., and I. Zaslavsky. 1998. "Building the Infrastructure for High Performance Computing in Undergraduate Curricula: Ten Grand Challenges and the Response of the NPACI Education Center." Supercomputing '98 Proceedings. Available online at http://www.supercomp.org/sc98/TechPapers/sc98_FullAbstracts/Stewart1310/index.htm

[4] EOT-PACI: http://www.eot.org

Fig. 1. Computational science emerges at the intersection of computer science, applied mathematics, and science disciplines.


Summaries of individual presentations:

1. Building a faculty community to support curriculum development in computational science and engineering (Kris Stewart).

The Education Center on Computational Science and Engineering was established on the campus of San Diego State University in October 1997 as part of the education and outreach activities of the National Partnership for Advanced Computational Infrastructure (NPACI). From its inception, its focus has been on building a comprehensive educational infrastructure to support the incorporation of high-performance computational science tools into undergraduate education. Through the campus-wide Faculty Fellows program, collaboration with NPACI and NCSA researchers, in-house project development, and various outreach efforts, we have established an environment that encourages curriculum enhancement in the sciences and engineering with modern simulation and visualization technologies.

With a student body of over 31,000, SDSU is the largest university within the California State University (CSU) system, which is in turn the largest and most diverse undergraduate system in the nation. The diversity of students is reflected in a diverse faculty, diverse teaching techniques and methods, and an emphasis on individual learning and teaching styles. We examined SDSU faculty expectations and practices in teaching with computers through a series of questionnaire surveys and interviews. Analysis of faculty use of the Web, of classroom use of computers by students and instructors, and of high-performance computing applications in the curricula helped us develop the Ed Center's strategies for curriculum change.

During the first year we focused on broad dissemination of the available computational tools, presenting to various faculty audiences at SDSU and other campuses of the California State University system and experimenting with various computational technologies in our own classroom teaching. The main focus of the second-year activities has been individual collaboration with selected SDSU faculty and in-house computational science projects. The Faculty Fellows program, initiated by the Ed Center, has provided release time to three faculty members selected from the Colleges of Sciences, Engineering, and Arts and Letters, who work on revising their regular courses to include computational approaches. This support has allowed the faculty fellows to use a range of compute-intensive approaches in undergraduate teaching: interpretation of satellite imagery and Web-based collaborative visualization of large geological datasets; exploration of the Network of Workstations (NOW) distributed architecture implemented on a cluster of Sun workstations at the College of Engineering; and new Web-based 3D visualization strategies for geographic data in an experimental class composed of geographers and computer scientists. Presentations of the faculty fellows' projects to university administrators and various professional audiences, together with positive feedback from students, became an important testimonial to the program's initial success. The synergy developed through a series of faculty fellows meetings and discussions at the Ed Center has resulted in collaborative research projects involving faculty from different colleges.

The Ed Center has also organized a Computational Science in Undergraduate Curricula workshop for CSU faculty and has developed several in-house projects on distance learning, networking, and on-line analysis tools. These projects are also discussed in the presentation.

For more information visit our Web pages at http://www.edcenter.sdsu.edu

2. Repositories and Online Tools (Roscoe Giles)

We are at an important transition in computational science and education. As more computer resources, instruments, information repositories, and collaborators join the worldwide information network, we are moving from static data repositories to "workbenches" and "portals" which are designed to support science and education more effectively. The two NSF Partnerships for Advanced Computational Infrastructure (NPACI and the Alliance) are pioneering the development of such online tools.

This presentation summarizes some of the technologies that underlie the development of advanced online tools and shows some examples of directions being taken to support computational science education. We hope to stimulate useful discussions of requirements and design goals for new tools.

The Biology Workbench, introduced in an earlier presentation, has served as a successful prototype of several aspects of a scientific portal: availability of a wide variety of data and computational tools, persistent user state, and behind-the-scenes translation of data formats. We have worked with the Biology Workbench in education and will describe some of the additions and extensions that support collaboration and learning by novices.

We then turn to a discussion of the expected impact of newer Web technologies, such as XML, on efforts to build effective computational science repositories.
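To make the repository idea concrete, the sketch below shows how a self-describing XML metadata record for a repository entry could be parsed with standard tooling. The element names here are purely illustrative assumptions, not an actual PACI or workbench schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical metadata record for one repository entry; the schema
# (dataset/title/discipline/variable) is invented for illustration.
record = """
<dataset>
  <title>General Social Survey extract</title>
  <discipline>sociology</discipline>
  <variable name="educ" type="categorical"/>
  <variable name="net" type="categorical"/>
</dataset>
"""

root = ET.fromstring(record)

# A portal can discover what the dataset contains without knowing
# its native storage format in advance.
title = root.findtext("title")
var_names = [v.get("name") for v in root.findall("variable")]

print(title)       # prints: General Social Survey extract
print(var_names)   # prints: ['educ', 'net']
```

Because the structure travels with the data, a workbench can translate between formats or route a dataset to the right analysis tool by inspecting the record rather than hard-coding each source.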

3. Sociology Workbench, an analytical interface to distributed resources for social scientists (Ilya Zaslavsky)

Both the NPACI and NCSA programs have a strong focus on the "hard" sciences. Indeed, academic fields such as computational physics, molecular biology, computational chemistry, and climate modeling have traditionally used high-performance computing to process large volumes of data and to simulate and visualize natural phenomena. The majority of high-performance computational science tools developed, in particular, at the San Diego Supercomputer Center target the community of researchers in the sciences and engineering. In the social sciences and humanities, the use of computational science tools, especially in the classroom, has been limited. Addressing this curriculum need, we have developed a collection of on-line computational tools and resources for social scientists, code-named the "Sociology Workbench" (SWB). The SWB allows faculty and students to share and analyze social science data (questionnaire surveys, public opinion polls, and similar data) on the Web. A major advantage of our workbench is its ability to examine datasets supplied by the user, in addition to widely used public-domain data such as the General Social Survey. In essence, it is a free on-line statistical package implementing a unique data analysis methodology.

The Sociology Workbench continues the tradition of on-line "workbenches" being developed for various disciplines within the supercomputing realm, such as the Biology Workbench developed at NCSA (http://biology.ncsa.uiuc.edu/), the Scientist's Workbench from Cornell (http://www.tc.cornell.edu/SWB/swb.html), and NASA's Environment Workbench (http://satori2.lerc.nasa.gov/DOC/EWB/ewbhome.html). Rather than re-implementing statistical procedures for categorical data analysis that are already widely available on the desktop, we emphasized exploratory social data analysis, integration with other resources available on the Web, convenience of the user interface, and transparency of the analytical approaches. The core of SWB follows the methodology of the Analysis of Rules (Determinacy Analysis), a method for extracting explanatory rules from series of qualitative variables such that the rules are as accurate and complete as possible. In addition, users can build custom variables and add them to the "dictionary of variables", and can construct standard frequency tables, cross-tabulations, and pie and bar charts.
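The core idea of Determinacy Analysis can be sketched in a few lines: for each candidate rule "X = a implies Y = b" over two qualitative variables, compute its accuracy (how often the rule holds when its condition applies) and its completeness (how much of the outcome the rule accounts for). This is a minimal illustration of the method's two measures, not SWB's implementation; the survey data and variable names below are hypothetical.

```python
from collections import Counter

def determinacy(records, x_var, y_var):
    """For each rule 'x_var = a -> y_var = b' found in the data, return
    (accuracy, completeness), where
      accuracy     = P(Y = b | X = a)
      completeness = P(X = a | Y = b)."""
    joint = Counter((r[x_var], r[y_var]) for r in records)
    x_marg = Counter(r[x_var] for r in records)
    y_marg = Counter(r[y_var] for r in records)
    return {
        (a, b): (n / x_marg[a], n / y_marg[b])
        for (a, b), n in joint.items()
    }

# Hypothetical toy survey: education level vs. reported Internet use.
survey = [
    {"educ": "college", "net": "daily"},
    {"educ": "college", "net": "daily"},
    {"educ": "college", "net": "weekly"},
    {"educ": "hs",      "net": "weekly"},
    {"educ": "hs",      "net": "never"},
]

rules = determinacy(survey, "educ", "net")
acc, comp = rules[("college", "daily")]
print(f"college -> daily: accuracy={acc:.2f}, completeness={comp:.2f}")
# prints: college -> daily: accuracy=0.67, completeness=1.00
```

The method then searches for rules (possibly with compound conditions) that keep both measures as high as possible; an accurate but incomplete rule explains only a fraction of the outcome, while a complete but inaccurate one over-generalizes.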

SWB is being developed mostly by undergraduate students working as interns with the Education Center on Computational Science and Engineering on the campus of San Diego State University. It is expected to serve as a teaching tool in classes dealing with survey data analysis, as well as an engine behind portals that provide analytical access to discipline-specific survey datasets. The software has already been used for analysis of faculty surveys, for student surveys in several classes, and by a local planning agency. As a research testbed, we are using SWB to investigate interfaces to distributed data sources within a wrapper-mediator architecture. You are welcome to analyze your own surveys with SWB; it is accessible at http://edcenter.sdsu.edu.