Parallel pre-processing of medical data constitutes excellent cost/performance alternative

Delft 18 August 1999 Vincenzo Positano is working as a researcher at the Institute of Clinical Physiology in Pisa, which is part of the Italian Consiglio Nazionale delle Ricerche (CNR). His team is involved in a project in which a Parallel Virtual Machine (PVM) is used on a computer network to perform fast pre-processing of large medical data sets. On the occasion of ParCo 99, Positano gave an overview of the 3D and 4D image processing performance of this parallel implementation in a clinical environment.

Currently, 3D and 4D dynamic imaging in the medical diagnosis and treatment of disease is becoming a reality. In dynamic cardiac magnetic resonance (MR) imaging, data sets of 200 to 400 images are common. As a result, image pre-processing forms a fundamental task in the visualization and quantitative analysis of medical image data: it addresses the grey-scale heterogeneity of tissue classes, which reduces the accuracy of segmentation algorithms.

Therefore, pre-filtering of the images becomes necessary. In particular, the non-linear anisotropic diffusion filter provides both efficient noise reduction and sharpening of object boundaries. Because this process encourages intra-region smoothing while inhibiting inter-region smoothing, the technique is widely used in medical image analysis. A typical 3D medical image data set consists of a series of parallel slices covering the zone of interest. A 3D set can be acquired several times in order to track the evolution of physiological phenomena, such as a cardiac cycle.
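The article does not give the team's exact filter equations; a common formulation of non-linear anisotropic diffusion is the Perona-Malik scheme, sketched below in Python with NumPy. The parameter values kappa and dt are illustrative, not those used by Positano's team.

```python
import numpy as np

def perona_malik(img, n_iter=10, kappa=20.0, dt=0.2):
    """Non-linear anisotropic diffusion in the Perona-Malik form.

    The conduction coefficient g(d) = exp(-(d/kappa)^2) stays close to 1
    for small grey-level differences (smoothing inside homogeneous
    regions) and drops toward 0 across strong edges (boundaries are
    preserved). kappa and dt are illustrative defaults.
    """
    img = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        # Grey-level differences toward the four neighbours;
        # np.roll gives a periodic border, kept here for brevity.
        dn = np.roll(img, 1, axis=0) - img
        ds = np.roll(img, -1, axis=0) - img
        de = np.roll(img, 1, axis=1) - img
        dw = np.roll(img, -1, axis=1) - img
        # Explicit update: dt <= 0.25 keeps the 4-neighbour scheme stable.
        img += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return img
```

Applying the filter to a noisy slice lowers the noise variance inside tissue regions while leaving high-contrast organ boundaries largely untouched, which is exactly the property that helps the downstream segmentation step.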

The anisotropic diffusion filtering of such a data set is a very time consuming problem. To reduce the processing time, the team of Vincenzo Positano has implemented the anisotropic filtering algorithm on a heterogeneous computer network using the PVM libraries, which makes it possible to exploit the computational resources already available in a medical environment. Since each image in the data set can be filtered independently, the parallel implementation can be based on a master process that distributes images to the slave processes.

Whenever a slave process finishes its task, the master provides it with a new image. If the number of images to process is high with respect to the number of slave processes, this simple approach achieves good load balancing. To optimize the scalability of the algorithm, the processing applied to each image can be split into a number of sub-tasks, which reduces the performance losses in the starting and ending phases, where a slow slave process would otherwise delay the termination of the whole application.
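The PVM code itself is not shown in the article; as a rough sketch of the dynamic master/slave dispatch described above, a Python multiprocessing pool can stand in for the PVM master. All names and the toy per-slice task here are illustrative, not the team's actual implementation.

```python
from multiprocessing import Pool

def filter_slice(args):
    """Stand-in for the anisotropic filtering of one image slice.

    Here it merely computes a 3-point moving average over a list of
    numbers; the real slave process would run the 2D diffusion filter
    on one slice of the 3D data set.
    """
    idx, data = args
    smoothed = [
        sum(data[max(0, i - 1):i + 2]) / len(data[max(0, i - 1):i + 2])
        for i in range(len(data))
    ]
    return idx, smoothed

if __name__ == "__main__":
    # Each slice is an independent task. imap_unordered hands a new
    # slice to whichever worker finishes first, mirroring the dynamic
    # master/slave scheme: load balancing is good as long as there are
    # many more slices than workers.
    slices = [(i, [float(v) for v in range(8)]) for i in range(200)]
    with Pool(processes=4) as pool:
        results = dict(pool.imap_unordered(filter_slice, slices))
    assert len(results) == 200
```

The choice of a pull-based dispatch (workers ask for work by finishing) rather than a static block partition is what makes the scheme tolerant of the heterogeneous machines found on a hospital network.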

The same algorithm has also been implemented with the Message Passing Interface (MPI), using the MPICH libraries, to compare the performance of both implementations. The MPI-based application turned out to be slightly slower than the PVM application. The PVM-based pre-filtering algorithm thus seems to be a good cost/performance compromise, which can stimulate the use of parallel processing in a medical environment. As a test case, the team has processed dynamic sequences of 3D data volumes derived from MR cardiac images. Please consult the home page of the Institute of Clinical Physiology for more details.


Leslie Versweyveld
