The University of Delaware team includes Kenneth Barner, professor and department chairperson; Karl Steiner, professor and associate provost for interdisciplinary research initiatives; and Rui Hu, a doctoral candidate in electrical engineering.
The lead institution on the project is Thomas Jefferson University (TJU), with the TJU team including biochemist Eric Wickstrom, who acts as principal investigator on the project, radiologist Matthew Thakur, surgeon John Kairys, medical educator Martha Ankeny, computer specialist Devakumar Devadhas, synthetic chemist Chang-Po Chen and biochemistry doctoral candidate Yuan-Yuan Jin. The initial work will focus on the pancreas, as pancreatic cancer grows rapidly and presents difficult surgical challenges.
Under the University of Delaware component of the project, Kenneth Barner and Karl Steiner will build upon their earlier research in 3D virtual surgery simulation and work with their medical colleagues at TJU to create the next generation of a haptics-based virtual surgery simulator.
While radiologic images give surgeons a visual representation of what they may encounter at the time of exploration, current imaging systems do not provide genetic information or tactile information about the tissues that will be encountered at surgery, nor do they allow physical interaction with the image.
"Haptics provides tactile, or touch, feedback to the user via a small robot that is integrated with the visual simulation on the screen", Karl Steiner explained. "As the user moves the robot, a simulated object, such as a scalpel or other surgical instrument, moves within the 3D environment, which includes simulations of various organs in the human body."
The organ simulations have been generated through a process called segmentation, in which data from anatomical CT scans and molecular or genetic PET scans are digitally processed slice by slice to extract the outlines of a patient's individual organs. Once the internal structure of the body has been segmented, the data are processed and the organs are integrated into a volumetric simulation that depicts mechanical properties, distinguishing, for example, healthy from diseased tissue in a lung or pancreas. Finally, the biochemical activity inside cancer cells is fused with the anatomical image.
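The slice-by-slice step can be illustrated with a minimal sketch. The function below is a toy stand-in for segmentation, not the project's actual pipeline: it simply thresholds each axial slice of a synthetic volume to produce a binary organ mask, and the `segment_slices` name, the threshold value, and the synthetic sphere "scan" are all illustrative assumptions.

```python
import numpy as np

def segment_slices(volume, threshold):
    """Process a scan volume slice by slice: threshold each axial
    slice to mark the voxels belonging to the organ of interest.
    (A toy stand-in for real segmentation, which would also trace
    outlines and separate neighbouring organs.)"""
    masks = []
    for axial_slice in volume:          # one 2D slice per iteration
        masks.append(axial_slice > threshold)
    return np.stack(masks)              # binary mask, same shape as input

# Synthetic stand-in for a CT volume: a bright blob in a dark field.
z, y, x = np.mgrid[0:32, 0:32, 0:32]
volume = np.exp(-((z - 16)**2 + (y - 16)**2 + (x - 16)**2) / 50.0)

mask = segment_slices(volume, threshold=0.5)
```

Stacking the per-slice masks back into a 3D array is what lets the later steps treat the segmented organ as a single volumetric object.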
The haptics interface then allows manipulation of the surgical instruments, and, as an instrument touches one of the simulated organs, the deformation of the organ is calculated and visualized. The haptics robot is supplied with the forces required to push on the organ, and tumour texture will be rendered firmer than normal, healthy pancreatic tissue.
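One common way to render such force feedback, sketched here under assumed values rather than taken from the project itself, is a Hooke's-law contact model: the force returned to the robot grows with how far the tool presses into the tissue, and a higher stiffness constant makes tumour tissue feel firmer than healthy pancreas. The stiffness numbers below are illustrative only.

```python
def contact_force(penetration_depth, stiffness):
    """Hooke's-law contact model: the haptic robot pushes back in
    proportion to how deep the tool presses into the tissue."""
    if penetration_depth <= 0:
        return 0.0                    # tool is not touching the organ
    return stiffness * penetration_depth

# Illustrative (assumed) stiffness values in N/mm: tumour tissue is
# assigned a higher stiffness so it feels firmer under the tool.
HEALTHY_PANCREAS = 0.5
TUMOUR = 2.0

depth = 3.0  # mm of tool penetration
force_healthy = contact_force(depth, HEALTHY_PANCREAS)
force_tumour = contact_force(depth, TUMOUR)
```

At the same penetration depth, the tumour returns four times the resisting force, which is exactly the firmer texture the simulator presents to the surgeon's hand.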
"This environment is the basis for our new collaboration with TJU", stated Karl Steiner, "where we will now focus on a set of data from CT and PET scans provided by TJU. We will merge these datasets to provide a scene with state-of-the-art information about the disease state of the organ."
The project is yet another link between research groups at the University of Delaware and TJU, and the research benefits from and contributes to the growing collaboration under the Delaware Health Sciences Alliance.
"The unique aspect of this project", Kenneth Barner stated, "is that it enables us to build on our prior results for deformable objects in surgery simulation by partnering with researchers at Jefferson. Our aim is not only to take surgery simulation to the next level, including the realistic interaction of multiple surgical tools and organs, but also to incorporate information from multiple imaging modalities to provide doctors with a comprehensive environment from which surgeries can be practised and planned."
"The project is also a great opportunity for our graduate students", he added, "as engineers will find it increasingly important to work with professionals from other fields as technology becomes more complex and as its applications broaden."