For their experiments, the engineers started with a rudimentary tabletop robot whose "eyes" used a novel 3D ultrasound technology developed in the Duke laboratories. An artificial intelligence programme served as the robot's "brain" by taking real-time 3D information, processing it, and giving the robot specific commands to perform.
"In a number of tasks, the computer was able to direct the robot's actions", stated Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team. "We believe that this is the first proof-of-concept for this approach. Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence programme, the technology will advance to the point where robots - without the guidance of the doctor - can someday operate on people."
The results of a series of experiments in which the robot system directed catheters inside synthetic blood vessels were published online in the journal IEEE Transactions on Ultrasonics, Ferroelectrics and Frequency Control. A second study, published in April in the journal Ultrasonic Imaging, demonstrated that the autonomous robot system could successfully perform a simulated needle biopsy.
Advances in ultrasound technology have made these latest experiments possible, according to the researchers, by generating detailed, 3D moving images in real time. The Duke laboratory has a long track record of modifying traditional 2D ultrasound - like that used to image babies in utero - into more advanced 3D scans. Since inventing the technique in 1991, the team has also shown its utility in developing specialized catheters and endoscopes for real-time imaging of blood vessels in the heart and brain.
In the latest experiment, the robot successfully performed its main task: directing a needle on the end of the robotic arm to touch the tip of another needle within a blood vessel graft. The robot's needle was guided by a tiny 3D ultrasound transducer, the "wand" that collects the 3D images, attached to a catheter commonly used in angioplasty procedures.
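The guidance task described above amounts to a simple closed loop: locate the needle tip and the target in each 3D ultrasound frame, compute the error between them, and command a small motion toward the target. The sketch below illustrates that loop under stated assumptions; it is not the Duke team's actual software, and every name, parameter, and tolerance here is hypothetical. It assumes the tip and target positions have already been extracted from the imaging data as 3D coordinates.

```python
import numpy as np

def guidance_step(tip_xyz, target_xyz, max_step_mm=1.0, tol_mm=0.5):
    """One iteration of an illustrative closed-loop guidance scheme.

    tip_xyz, target_xyz: 3D positions (mm) of the robot's needle tip and
    the target needle, as might be segmented from a real-time 3D
    ultrasound volume. Returns the displacement command for the arm,
    or None once the tip is within tolerance of the target.
    """
    error = np.asarray(target_xyz, dtype=float) - np.asarray(tip_xyz, dtype=float)
    dist = np.linalg.norm(error)
    if dist <= tol_mm:
        return None  # close enough: the needles are "touching"
    # Step toward the target, capped at max_step_mm per imaging frame.
    return error / dist * min(dist, max_step_mm)

# Drive a simulated needle tip to a fixed target, one frame at a time.
tip = np.array([0.0, 0.0, 0.0])
target = np.array([5.0, 3.0, -2.0])
while (step := guidance_step(tip, target)) is not None:
    tip += step
```

The per-frame step cap stands in for the physical limit on how far a real arm could safely move between ultrasound updates; a genuine system would also need segmentation, coordinate registration between the transducer and the arm, and safety interlocks, none of which are modeled here.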
"The robot was able to accurately direct needle probes to target needles based on the information sent by the catheter transducer", stated John Whitman, a senior engineering student in Stephen Smith's laboratory and first author on both papers. "The ability of the robot to guide a probe within a vascular graft is a first step toward further testing the system in animal models."
While the research will continue to refine the ability of robots to perform independent procedures, the new technology could also have more direct and immediate applications. "Currently, cardiologists doing catheter-based procedures use fluoroscopy, which employs radiation, to guide their actions", Stephen Smith stated. "Putting a 3D ultrasound transducer on the end of the catheter could provide clearer images to the physician and greatly reduce the need for patients to be exposed to radiation."
In the earlier experiments, the tabletop robot arm successfully touched a needle on the arm to another needle in a water bath. Then it performed a simulated biopsy of a cyst, fashioned out of a liquid-filled balloon in a medium designed to simulate tissue. "These experiments demonstrated the feasibility of autonomous robots accomplishing simulated tasks under the guidance of 3D ultrasound, and we believe that it warrants additional study", John Whitman stated.
The researchers said that adding this 3D capability to more powerful and sophisticated surgical robots already in use at many hospitals could hasten the development of autonomous robots that could perform complex procedures on humans. The research in Stephen Smith's lab is supported by the National Institutes of Health. Other Duke members of the team were Matthew Fronheiser and Nikolas Ivancevich.