ISAAC ASIMOV would probably have been horrified at the experiments under way in a robotics lab in Slovenia. There, a powerful robot has been hitting people over and over again in a bid to induce anything from mild to unbearable pain - in apparent defiance of the late sci-fi sage's famed first law of robotics, which states that "a robot may not injure a human being".
But the robo-battering is all in a good cause, insists Borut Povše, who has ethical approval for the work from the University of Ljubljana, where he conducted the research. He has persuaded six male colleagues to let a powerful industrial robot repeatedly strike them on the arm, to assess human-robot pain thresholds.
The aim is not to show that the first law of robotics is too constraining to be of practical use, but to help future robots adhere to it. "Even robots designed to Asimov's laws can collide with people. We are trying to make sure that when they do, the collision is not too powerful," Povše says. "We are taking the first steps to defining the limits of the speed and acceleration of robots, and the ideal size and shape of the tools they use, so they can safely interact with humans."
Povše and his colleagues borrowed a small production-line robot made by Japanese technology firm Epson and normally used for assembling systems such as coffee vending machines. They programmed the robot arm to move towards a point in mid-air already occupied by a volunteer's outstretched forearm, so the robot would push the human out of the way. Each volunteer was struck 18 times at different impact energies, with the robot arm fitted with one of two tools: one blunt and round, the other sharper.
The volunteers were then asked to judge, for each tool type, whether the collision was painless, or engendered mild, moderate, horrible or unbearable pain. Povše, who tried the system before his volunteers, says most judged the pain was in the mild to moderate range.
The team will continue their tests using an artificial human arm to model the physical effects of far more severe collisions. Ultimately, the idea is to cap the speed at which a robot moves when it senses a nearby human, to avoid hurting them. Povše presented his work at the IEEE's Systems, Man and Cybernetics conference in Istanbul, Turkey, this week.
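The principle behind such a speed cap can be illustrated with a short, hypothetical calculation that is not taken from Povše's paper: if a collision must not exceed some maximum impact energy derived from the pain tests, the kinetic-energy relation E = ½mv² gives the highest speed at which the arm may move while a person is within reach. The Python sketch below uses made-up values for the energy limit, the arm's effective mass and the sensing distance, purely to show the idea.

import math

# Illustrative values only - not figures from Povše's experiments.
MILD_PAIN_ENERGY_J = 0.5   # assumed maximum tolerable impact energy, in joules
EFFECTIVE_MASS_KG = 4.0    # assumed effective mass of the arm at the tool tip
FULL_SPEED_M_S = 1.5       # robot's normal operating speed
SAFE_DISTANCE_M = 1.0      # beyond this distance, no human is considered nearby

def speed_cap(distance_to_human_m: float) -> float:
    """Return the allowed tool speed given the distance to the nearest detected human."""
    if distance_to_human_m >= SAFE_DISTANCE_M:
        return FULL_SPEED_M_S
    # Kinetic energy E = 1/2 * m * v^2, so the speed that keeps a worst-case
    # collision at or below the pain limit is v = sqrt(2 * E_max / m_eff).
    v_limit = math.sqrt(2 * MILD_PAIN_ENERGY_J / EFFECTIVE_MASS_KG)
    return min(FULL_SPEED_M_S, v_limit)

if __name__ == "__main__":
    for d in (2.0, 0.8, 0.3):
        print(f"human at {d} m -> max speed {speed_cap(d):.2f} m/s")

With these assumed numbers the cap works out at 0.5 metres per second whenever someone is within a metre of the arm; in practice the limits would come from the measured pain thresholds and the specific robot and tool involved.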
"Determining the limits of pain during robot-human impacts this way will allow the design of robot motions that cannot exceed these limits," says Sami Haddadin of DLR, the German Aerospace Centre in Wessling, who also works on human-robot safety. Such work is crucial, he says, if robots are ever to work closely with people. Earlier this year, in a nerve-jangling demonstration, Haddadin put his own arm on the line to show how smart sensors could enable a knife-wielding kitchen robot to stop short of cutting him.
"It makes sense to study this. However, I would question using pain as an outcome measure," says Michael Liebschner, a biomechanics specialist at Baylor College of Medicine in Houston, Texas. "Pain is very subjective. Nobody cares if you have a stinging pain when a robot hits you - what you want to prevent is injury, because that's when litigation starts."