Robot Surgeon Gets a Brain
Pratt engineering professor Stephen Smith works with the novel robot surgeon.
Credit: Duke University
An autonomous tabletop robot surgeon has taken its first steps toward reality at Duke University, and the technology could soon make some current medical procedures safer for patients.

Engineers started with a tabletop robot that used 3-D ultrasound for vision; a computer processed the 3-D images in real time and, acting as the robot's brain, directed its actions.
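The see-process-act cycle described above can be sketched as a simple closed control loop. The code below is a hypothetical illustration only, not Duke's actual software: the class and function names are invented, and real-time segmentation of a 3-D ultrasound volume is stubbed out with a known coordinate.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Instrument tip position within the ultrasound volume (mm).
    Illustrative stand-in for a real tracked-instrument pose."""
    x: float
    y: float
    z: float

def locate_target(volume_frame):
    """Stand-in for real-time 3-D image processing. In the real system
    this step would segment the target from the ultrasound volume;
    here the 'frame' is simply a known coordinate tuple."""
    return Pose(*volume_frame)

def control_step(tip: Pose, target: Pose, gain: float = 0.5) -> Pose:
    """Proportional controller: move the tip a fraction of the way
    toward the target each cycle -- the 'brain' deciding the action."""
    return Pose(
        tip.x + gain * (target.x - tip.x),
        tip.y + gain * (target.y - tip.y),
        tip.z + gain * (target.z - tip.z),
    )

def run_loop(tip: Pose, frames, tol: float = 0.1) -> Pose:
    """Closed loop: see (ultrasound) -> decide (controller) -> act (robot)."""
    for frame in frames:
        target = locate_target(frame)
        while max(abs(target.x - tip.x),
                  abs(target.y - tip.y),
                  abs(target.z - tip.z)) > tol:
            tip = control_step(tip, target)
    return tip

# Drive the tip from the origin toward a single imaged target.
final = run_loop(Pose(0.0, 0.0, 0.0), frames=[(10.0, 5.0, 2.0)])
```

The proportional gain and tolerance are arbitrary; the point is only the structure of the loop, in which the imaging, not a human hand, supplies the feedback.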

"In a number of tasks, the computer was able to direct the robot's actions," said Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team. "We believe that this is the first proof-of-concept for this approach. Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots – without the guidance of the doctor – can someday operate on people."

In one study, the robotic system successfully directed catheters inside synthetic blood vessels. A second study demonstrated that the robot surgeon could perform a needle biopsy, which involves the very precise insertion of a special needle to remove a tissue sample.

Science fiction writers have long imagined such machines. Philip K. Dick wrote extensively about a robotic surgical hand that could detach itself from the physician and work autonomously, but the closest fictional analogue to this tabletop robot is probably the autodoc.