TOUCH project uses Access Grid to provide a distributed interactive virtual environment for collaborative medical education and training

Albuquerque, 17 January 2004. At the recent "Medicine Meets Virtual Reality 12" conference, held in Newport Beach, California, the TOUCH project, a multi-year programme initiated in August 2000, was presented to the audience. TOUCH is a collaboration between the Schools of Medicine at the University of Hawaii and the University of New Mexico that integrates advanced technologies into medical education and training to enhance experiential distributed learning, with the aim of improving human comprehension, retention of knowledge, and ultimately performance. The project employs medical case scenarios as virtual models to make the learning of critical concepts relevant and translatable to real-life application. TOUCH deploys and distributes these methods to remote training sites over the Next Generation Internet (NGI) Access Grid.


TOUCH stands for Telehealth Outreach for Unified Community Health. The project team developed immersive virtual reality applications and created virtual worlds to enhance learning by allowing visualization and exploration of abstract concepts or virtual patients and simulations, along with collaborative virtual group interaction.

The virtual reality visualization environment "Flatland" has been implemented to be shared among dispersed sites with multiple participants. Flatland is an open source program into which the virtual simulation can be programmed and rendered for visualization and interaction. Access Grid, a broadband Next Generation Internet videoconference multi-casting network, enables real-time distribution of the learning environments between academic medical centres and distant training sites. The project partners also introduced an experiential learning approach that uses realistic medical scenarios and associated models in the virtual environments to better understand complex concepts and translate them to real-world applications.

The Schools of Medicine in Hawaii and New Mexico, in collaboration with rural hospitals or clinics and two of their respective training sites, the Maui Community College Health Center and the Northern Navajo Medical Center, adopted high performance computing methods to enhance and deploy existing experiential learning curricula currently used in the medical education programmes.

Participating students, faculty tutors and rural preceptors can make use of virtual reality experiences, three-dimensional volumetric image manipulation, and computer-generated simulations transmitted over the Internet Access Grid, using point-to-point or multi-casting connectivity, to improve learning and understanding of a variety of concepts relevant to a clinical case study. These methods can be evaluated interactively across sites, or individually and asynchronously.

Experiential learning promotes medical student group involvement in the study of basic concepts and principles as they relate to a variety of clinical problems. Through this process, students begin to understand the relevance of important concepts to actual medical practice and problem solving. In order to enhance the experiential learning, create a greater sense of reality, and aid comprehension of relevant basic concepts, virtual reality and 3D graphical simulations are being integrated into study cases in which students can be fully immersed in order to achieve defined learning or training goals and objectives.

The immersed students or trainees wear a head-mounted display with trackers, which allows them to gain a sense of presence and interact within the virtual environment. Trainees can independently control their viewpoint and motion within the virtual world. Team members within the virtual environment are able to see each other as full human figures and interact as if they were physically present, even when separated by significant distances. TOUCH also offers third-person modes of communication, using virtual remote cameras to broadcast to remote audiences.

Participants can examine a virtual patient in order to discover pertinent signs and symptoms. The virtual reality patient is programmed to change dynamically over time and respond to the manipulations by the learner or the trainee. A joy-wand enables the students to navigate, move and handle objects. If a critical error is made, the virtual patient responds poorly or, even worse, expires.

The Flatland environment allows real-time exploration, examination, and manipulation of 3D objects and images within the virtual world. The ratio between real time and virtual time can be varied to allow slower or faster progress of events. The relative size of the participant to objects within the virtual worlds can likewise be varied, allowing visual exploration of larger environments at a population level or smaller microscopic environments at the cellular or molecular level.
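The variable ratio between real time and virtual time described above can be sketched as a simple clock that scales elapsed wall-clock time before adding it to the simulation's time line. This is an illustrative sketch only; the class and method names are invented here and are not taken from Flatland's actual code.

```python
class SimulationClock:
    """Advances virtual time from real elapsed time using a variable ratio.

    A time_scale of 2.0 makes simulated events unfold twice as fast as
    real time; 0.5 slows them down for step-by-step study.
    (Hypothetical sketch; not Flatland's actual API.)
    """

    def __init__(self, time_scale: float = 1.0):
        self.time_scale = time_scale
        self.virtual_time = 0.0  # seconds of simulated time

    def tick(self, real_dt: float) -> float:
        """Advance the clock by real_dt seconds of wall-clock time."""
        self.virtual_time += real_dt * self.time_scale
        return self.virtual_time
```

Because the scale is applied per tick rather than stored as a start offset, the instructor can change `time_scale` mid-session and already-elapsed virtual time is preserved.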

An interactive patient simulation engine allows a new dimension in experiential learning, in which the students can dynamically determine the direction of the case scenario. The simulator is composed of three components:

  1. a real-time artificial intelligence (AI) simulation engine,
  2. a 3D virtual reality environment with human avatars, and
  3. a system for human-simulation interaction.

The AI system reasons with case-specific clinical knowledge in the form of rules, extracted from the team's medical experts using knowledge engineering methods. The AI engine is coupled to the virtual environment that contains a representation of the virtual patient, which manifests the signs and symptoms of the clinical scenario and provides a unique environment for experiential learning, as explained by the project partners.
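The coupling of a rule-based AI engine to a dynamic virtual patient can be illustrated with a minimal forward-chaining loop: each rule pairs a condition on the patient's state with an action that updates it, and the engine re-fires rules after every learner intervention. The state variables and rules below are invented for this sketch and are not taken from the TOUCH knowledge base.

```python
from dataclasses import dataclass, field

@dataclass
class Patient:
    # Hypothetical state variables for a closed-head-injury scenario.
    state: dict = field(default_factory=lambda: {
        "conscious": True,
        "airway_secured": False,
        "intracranial_pressure": "rising",
    })

# Each rule: (condition over the state, action that updates the state).
RULES = [
    # Untreated rising pressure degrades the patient's condition.
    (lambda s: s["intracranial_pressure"] == "rising" and not s["airway_secured"],
     lambda s: s.update(conscious=False)),
    # A correct intervention lets the engine stabilise the patient.
    (lambda s: s["airway_secured"] and s["intracranial_pressure"] == "rising",
     lambda s: s.update(intracranial_pressure="stable")),
]

def step(patient: Patient) -> None:
    """Fire every rule whose condition currently holds (forward chaining)."""
    for condition, action in RULES:
        if condition(patient.state):
            action(patient.state)

def intervene(patient: Patient, action_name: str) -> None:
    """Apply a learner intervention, then let the engine react to it."""
    if action_name == "secure_airway":
        patient.state["airway_secured"] = True
    step(patient)
```

In a real system the rules would be authored by clinicians via knowledge engineering and the state changes would drive the avatar's visible signs and symptoms; the point of the sketch is only the loop of learner action, rule firing, and state update.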

In one of their more recent experiments, the TOUCH partners asked fourth-year medical students to manage a simulated patient with a closed head injury at individual virtual reality workstations, either locally at the University of Hawaii and the University of New Mexico or distributed between the two schools over the Internet2 Access Grid. Compared with groups of students who did not use the virtual reality simulation, pre- and post-test scores showed that most students gained knowledge from the sessions, with no significant difference between VR and non-VR users.

Yet students who experienced learning in a safe virtual reality environment (VRE) stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in both learning and remembering those principles. In any case, students need time to adapt and practise in order to improve efficiency, as stated by the TOUCH partners.

As an additional proof of concept, students from the University of Western Australia in Perth were able to interact with students at the University of New Mexico in Albuquerque within the distributed VRE scenario, bridging a distance of nearly 10,000 miles with no discernible time lag. The project team also demonstrated that the TOUCH collaborative medical training system requires relatively little bandwidth for communication, which means that low-bandwidth networks are sufficient to run the TOUCH system.

The TOUCH project was funded by grant 2 D1B TM 00003-03 from the Office for the Advancement of Telehealth, Health Resources and Services Administration, Department of Health and Human Services. More information is available at the TOUCH Web site.

Leslie Versweyveld
