Professor Rory McCloy, Department of Surgery, used the new system for the first time during an operation carried out on a woman in her 70s with a suspected tumour in her pancreas. The new system meant that Professor McCloy was able to refer to a six-foot-wide 3D visualisation of the patient's pancreas and surrounding anatomy, projected onto the wall of the operating theatre.
Until now, surgeons have relied on a series of around 20 images taken by a CAT scanner, which they study before and during the operation, trying to relate the 2D images to the 3D reality of the patient in front of them. An ultrasound scanner can also be used during surgery, but again this produces 2D images that can be difficult to interpret.
Using a novel system developed by the undersigned and research students of the University's Manchester Visualization Centre, Professor McCloy was able to study a 3D visualisation of the organ in question, generated from the patient's CAT scans, during the operation. This model gives the surgeon a much clearer idea of what will be encountered during surgery.
Professor McCloy explained: "Trying to find and remove a tumour is like trying to find and cut out an egg from the middle of a large ham pie. This new technology should enable me to identify and cut out tumours much more accurately, with as little damage as possible to the surrounding healthy tissue."
In this particular operation, the new system was tested against intra-operative ultrasound scanning, with both images projected against the wall of the operating theatre. Professor McCloy commented: "The ultrasound scan suggested that there was a tumour in the patient's pancreas whereas the 3D model created using the new system suggested that there was not. As the surgery progressed, it became apparent that the new system was correct and the patient did not have a tumour. This meant that we avoided causing any unnecessary damage."
The data sets obtained from a modern CAT scanner can easily contain over 200 high-resolution images and be over 50 megabytes in size. A data set of this size is too large for a PC graphics card to handle. Instead, the data is sent to a Visualisation Supercomputer at the University of Manchester, an SGI Onyx2 called Kilburn, where it is reconstructed into a 3D graphic using a technique called volume rendering.
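To give a flavour of what volume rendering involves, the sketch below shows one of its simplest forms, a maximum-intensity projection: each pixel of the output image is the brightest voxel along a ray through the 3D data. This is only an illustration of the general idea, not the ray-casting pipeline running on the Onyx2; the synthetic volume and function name are assumptions for the example.

```python
import numpy as np

def max_intensity_projection(volume, axis=0):
    """Collapse a 3D scalar volume into a 2D image by keeping the
    brightest voxel along each ray parallel to the chosen axis."""
    return volume.max(axis=axis)

# Synthetic stand-in for a CT volume: mostly dark, one bright blob.
volume = np.zeros((64, 64, 64))
volume[30:34, 30:34, 30:34] = 1.0

image = max_intensity_projection(volume, axis=0)
print(image.shape)  # (64, 64)
```

Real volume renderers refine this idea with transfer functions that map tissue density to colour and opacity, which is what makes a pancreas distinguishable from the tissue around it.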
The resulting visualisation can be manipulated and sliced through any plane. The surgeon then takes a laptop and high-resolution data projector into the theatre and accesses the image over a fast, broadband Internet line. The ability to interact with complex graphics applications across the Internet has only recently become possible, and this medical application is an ideal way of exploiting that capability.
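Slicing the model through an arbitrary plane amounts to resampling the 3D data on a 2D grid of points lying in that plane. The sketch below shows one way this can be done with trilinear interpolation; the function name, the synthetic volume, and the chosen plane are assumptions for illustration, not the actual system's code.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def oblique_slice(volume, origin, u, v, size=64):
    """Sample a 2D slice of `volume` on the plane through `origin`
    spanned by direction vectors u and v (trilinear interpolation)."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    s = np.arange(size) - size / 2
    # Build the (size, size, 3) grid of 3D sample points in the plane.
    grid = origin + s[:, None, None] * u + s[None, :, None] * v
    coords = grid.transpose(2, 0, 1)          # shape (3, size, size)
    return map_coordinates(volume, coords, order=1, mode='nearest')

volume = np.zeros((64, 64, 64))
volume[28:36, 28:36, 28:36] = 1.0            # a small bright cube
# A 45-degree oblique plane through the centre of the volume.
sl = oblique_slice(volume, origin=[32, 32, 32],
                   u=[1, 0, 0], v=[0, 0.7071, 0.7071])
print(sl.shape)  # (64, 64)
```

Because only the resampled 2D slice (or rendered frame) needs to travel over the network, the surgeon's laptop in theatre can stay thin while the heavy computation remains on the supercomputer.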
Professor McCloy has worked with the undersigned for the last five years. We have adapted a gaming joystick, wrapped in a sterile bag, as the surgeon's interface with the computer. For ease of use, we have also designed Op3D, a user interface that jumps from one function to the next, based on a surgeon's routine examination steps. The next step is to enable the surgeon to make virtual scalpel cuts in the model, to rehearse the procedure.