Beyond the sheltering sky of supercomputing in isolation beckons the computing continuum

Heidelberg, 21 June 2001. The first speaker of the afternoon session at the SC 2001 supercomputing conference was Sid Karin, Ph.D., from the University of California, San Diego, and former Director of the National Partnership for Advanced Computational Infrastructure (NPACI). Dr. Karin illustrated how the digital revolution is transforming high-performance computing from an isolated island into a multi-dimensional computing continuum. Frontiers between different technologies are breaking down rapidly, and computing is turning into a universe of omnipresent and invisible interconnectivity. Today we look in amazement at phenomena such as grid computing but, frankly, we have not really seen anything yet.


The computing continuum concept involves a variety of dimensions, according to Dr. Karin, including performance, collaboration, integration, location, and function. In contrast with digital performance, the human information intake is limited to about one gigabyte per second, remains constant throughout a lifetime, and is mostly visual. Human performance can be expanded via computational collaboration in remote teams, resulting in low-end and high-end applications for commercial and daily routines (banking services, train traffic, car design, stock exchange) and academic research (tele-collaboration, digital libraries, supercomputing). Connectivity protocols such as TCP/IP can be stored in compressed form on a chip.

New ways of scientific 3D visualisation are emerging in tele-manufacturing environments. The speaker mentioned research in challenging disciplines such as astronomy and meteorology. The face of science is definitely changing when researchers are able to collect data from digital libraries, analyse information with simulation models run on the grid, visualise and share these models on-line, and publish results in a digital library. As such, a vivid interaction is born between theory, simulation, and experiment, leading to numerically and data-intensive computing and data mining in both computational and data grids.

An indispensable condition for this type of continuum is seamless access to servers for computation, application, and information purposes, as Dr. Karin described. Future research will have a greater need for specific services rather than resources, with a preference for access to persistent data stored in millions of collections instead of the limited data sets which belong to single users. Research will become a three-fold art of discovering the right collection of information, moving the data from library to application, and streaming it through multiple cache levels.
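The three-fold pattern described above can be sketched as a toy workflow. This is a minimal illustration only, with entirely hypothetical catalogue, library, and function names; it assumes a simple in-memory meta-data catalogue and a single cache level, not any actual NPACI or grid middleware interface.

```python
# Toy sketch of the three-step research pattern: discover a collection,
# move its data from the library to the application, and stream it
# through a cache level. All names here are hypothetical.

def discover(catalogue, topic):
    """Find collections in a mock meta-data catalogue matching a topic."""
    return [name for name, tags in catalogue.items() if topic in tags]

def fetch(library, collection):
    """Move a collection's records from the digital library to the app."""
    return library[collection]

def stream(records, cache):
    """Stream records through a cache level, filling it on misses."""
    for rec in records:
        if rec not in cache:
            cache[rec] = True   # cache miss: populate the cache
        yield rec

catalogue = {"protein-structures": {"biology"}, "sky-survey": {"astronomy"}}
library = {"protein-structures": ["1abc", "2xyz"]}

hits = discover(catalogue, "biology")          # step 1: discover
records = fetch(library, hits[0])              # step 2: move
cache = {}
data = list(stream(records, cache))            # step 3: stream
print(data)  # ['1abc', '2xyz']
```

In a real data grid, the catalogue lookup, transfer, and caching would each be distributed services rather than local function calls, but the division of labour is the same.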

A profound knowledge of the information management hierarchy, consisting of data models, administration domains, digital libraries, meta-data catalogues, resource brokerage models, and archival systems, would be a great help. The ideal scenario would be a steady evolution towards common digital libraries, computational grids, and persistent archives, since the same technology is required for federation in space and migration in time, as Dr. Karin put it. As for the concept of location, connectivity has to be everywhere, so it is high time to map the network's Terra Incognita, just like in the old days of Columbus. In this regard, Dr. Karin contrasted Gilder's Law with Moore's Law with respect to bandwidth and processor performance.
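The contrast between the two laws can be made concrete with a small calculation. Assuming the commonly cited formulations, that processor performance doubles roughly every 18 months (Moore) while total bandwidth triples roughly every 12 months (Gilder), bandwidth growth dramatically outpaces processing growth over a few years:

```python
# Growth factors under the commonly cited formulations of the two laws.
# The exact doubling/tripling periods are assumptions, not from the talk.

def moore_growth(years, doubling_months=18):
    """Processor performance: doubles every ~18 months."""
    return 2 ** (years * 12 / doubling_months)

def gilder_growth(years, tripling_months=12):
    """Total bandwidth: triples every ~12 months."""
    return 3 ** (years * 12 / tripling_months)

years = 6
print(moore_growth(years))   # 16.0  (2^4)
print(gilder_growth(years))  # 729.0 (3^6)
```

Under these assumptions, six years yields a 16-fold gain in processor performance but a 729-fold gain in bandwidth, which is why Gilder's Law argues for network-centric architectures like the computing continuum.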

Once all the premises are fulfilled to create a Super-Internet architecture, researchers and common users will enter a world of infinite access to information across the scientific, engineering, public policy, commercial, communicative, educational, and entertainment domains. This virtual "America On-Line" (AOL)-like system would include over 40,000 processors and more than 4 terabytes of RAM, serve over one million simultaneous users on a 7x24 basis, and surpass one billion dollars per year in operating expenses. In reality, it is a hell of a job to make scalable parallel computers as easy to use as scalar or vector computers, as Dr. Karin noted to bring the SC 2001 audience back to earth.

Meanwhile, NPACI is already deploying a host of fascinating scientific grid collaboratories. Dr. Karin proudly introduced distributed initiatives in the domains of protein folding to discover and understand protein structure space, e.g. the CHARMM-Legion project and the protein data bank; cellular micro-physiology on the grid and a bioinformatics infrastructure for large-scale analysis at UCSD; MICE, a Molecular Interactive Collaborative Environment; and the Alliance for Cellular Signalling. Insight into brain structure and function in relation to health and disease is provided by the brain mapping project, which involves all NPACI technology thrusts.

Biological scale modelling is performed in a grid collaboration between the Universities of Kansas, New Mexico, and the San Diego Supercomputer Center. A unique partnership involving 31 U.S. agencies constitutes the San Diego Regional Ecology project. Other examples are the Digital Galaxy, the Digital Sky, and the Art of Managing Art by the Art Museum Image Consortium (AMICO). In addition, meta-projects are being established to build tele-collaboration environments, such as the NPACI GridPort Architecture, which will create a Web interface for grid computing, as well as to clarify policy issues relating to problems like cryptography, trademarks and copyright, liability, convergence, and certification. Following Dr. Karin's presentation, the audience was not so sure anymore whether the sky is still the limit.


Leslie Versweyveld
