Technically it ain’t brain surgery, but let’s just say you wouldn’t want to do a cochlear implant while sleepy or distracted. So it’s a good thing this surgery robot can’t be either of those things. It drills into the bone behind the ear, watching with two shining eyes. The bit passes just half a millimeter from the facial nerve, and another half a millimeter from the taste nerve, before entering the spiraling cochlea of the inner ear. Here a human deposits an electrode.
The first robot-assisted cochlear implant in a clinical trial, which researchers describe today in the journal Science Robotics, doesn’t just enhance a surgeon’s dexterity like the by-now-common da Vinci robot might. “We are interested in doing something with the robot that a surgeon is not able to do,” says study co-author Stefan Weber of the ARTORG Center for Biomedical Engineering Research. This thing gives surgeons superpowers, allowing them to “feel” through tissue by measuring how the drill bit’s force changes against bone or flesh. All that precision means, for one, removing less bone to get to the inner ear.
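The force-sensing idea can be sketched in a few lines of code. The thresholds below are illustrative assumptions, not the robot's actual calibration; the real system combines force data with preoperative imaging and nerve monitoring.

```python
# Toy sketch of force-based tissue sensing during drilling.
# Threshold values are made-up assumptions for illustration only.

def classify_tissue(drill_force_newtons: float) -> str:
    """Guess what the drill bit is passing through from its resistance."""
    if drill_force_newtons > 2.0:   # assumed: dense cortical bone
        return "bone"
    if drill_force_newtons > 0.5:   # assumed: softer cancellous bone
        return "soft bone"
    return "soft tissue"            # low resistance: pause and check

# A sudden drop in force can signal a breakthrough toward soft
# tissue, a cue to stop before the bit nears a nerve.
readings = [2.4, 2.2, 1.1, 0.3]
print([classify_tissue(f) for f in readings])
# → ['bone', 'bone', 'soft bone', 'soft tissue']
```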
This thing should also make surgeons nervous. I mean, not this robot per se, but the vanguard it’s a part of. Because while surgeons are in total control of the cochlear bot, more machines are coming that will automate much of medicine. And that future will make for a potential regulatory—and public relations—mess.
So today Science Robotics also published an editorial proposing a classification system for medical robots based on their level of automation. The scale runs from zero (no automation, like tele-operated robots) up to five (fully autonomous machines that can perform whole surgeries on their own). Level five is a long way off, but robots are spawning so many questions about the nature of medicine that the field needs to start talking about them now.
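The six-level scale can be modeled as a simple enumeration. The level names below are paraphrased labels for illustration, not official terminology from the editorial.

```python
from enum import IntEnum

class MedicalRobotAutonomy(IntEnum):
    """Sketch of the editorial's proposed 0-5 autonomy scale.
    Labels are paraphrased for illustration, not official names."""
    NO_AUTONOMY = 0           # tele-operated; the human does everything
    ROBOT_ASSISTANCE = 1      # robot guides or constrains one task
    TASK_AUTONOMY = 2         # robot performs a task the human initiates
    CONDITIONAL_AUTONOMY = 3  # robot proposes plans; human approves
    HIGH_AUTONOMY = 4         # robot makes decisions under supervision
    FULL_AUTONOMY = 5         # whole surgeries without human input

# The cochlear implant robot sits at level 1:
print(MedicalRobotAutonomy(1).name)  # → ROBOT_ASSISTANCE
```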
Today, robots and surgeons still have to hold each other’s hands: The cochlear robot needs the human to tell it when to start and stop, and the human needs the robot to avoid nerves on the way to the inner ear. Call it codependency—level 1 autonomy. “The robot provides the guidance and provides for this particular task very accurate sensing, because it’s the particular task that will really affect the entire surgery,” says Science Robotics editor Guang-Zhong Yang, a co-author of the editorial.
That kind of human-robot interaction, though, is a stepping stone to truly autonomous machines. “When you come down to level 4 and 5 autonomy, the robot here is not only a medical device but also effectively practicing medicine,” says Yang. That gets sticky, because while the FDA approves medical devices, medical associations keep an eye on doctors. So do both parties get a stake in this case? (The FDA did not return answers to questions by press time, but a spokesperson did say the regulation of such devices “will depend on the intended use, not necessarily the autonomy of the device.”)
Create autonomous robots to handle wildly complex tasks like surgery and you also run the risk of losing the human expertise behind them. If a robot takes over a certain job in the operating room, it'll be hard to compel doctors to keep training that skill, even though skilled human backups will probably remain crucial. Doctors will essentially become robot supervisors, and universities will offer very different training, leaving a certain amount of medical knowledge to live only within the robot.
Then there’s the matter of convincing the public to trust machines with their lives. Self-driving cars are one thing, but a robot doctor is something else entirely. As with robocars, the case for full autonomy is appealing: a machine doesn’t get distracted and doesn’t tire. But that doesn’t mean humans will trust it. How do you convince someone that a cold, calculating machine may one day save their life?
All big questions that roboticists are only beginning to tackle. But one thing is for sure: Even if you never need a cochlear implant, Dr. Robot will soon be seeing you.