Legal problems
So where did Asimov's Three Laws of Robotics go? "They were narrative devices and were never actually intended to work in the real world," says Dr. Whitby. Quite apart from the fact that the laws require a robot to have some kind of human-like intelligence, which robots still lack, the laws themselves don't actually work very well. Asimov himself subverted them again and again in his robot stories, showing how these seemingly unquestionable rules could produce unintended consequences.
"In any case," says Dr. Inoue, "the laws really only contain common sense principles that are already applied to the design of most modern appliances, both domestic and industrial. Every toaster, lawnmower, and cell phone is designed to minimize potential damage—yet people can still electrocute themselves, lose fingers, or fall out of windows just to get a better signal. For robots, there must be strict safety standards that cover existing products. The question is whether new, specific rules for robots are needed—and if so, what they should contain."
"Ensuring robot safety will be crucial," says Colin Angle of iRobot, which sells the six-foot-tall "Roomba" robot vacuum cleaners for households. He argues, however, that his company's robots are actually safer than some popular toys. "A remote-controlled car driven by a six-year-old is far more dangerous than a Roomba." If you step on a Roomba, it won't cause you to fall over; instead, a rubber pad will hold it in place and prevent it from moving. "Existing regulations will meet the challenges. I'm not convinced that robots are different enough to deserve special treatment," Angle said.
Robot safety is also likely to come up in civil product-liability cases. "When the first carpet-sweeping robot sucks up a child, who will be to blame?" asks John Hallam, a professor at the University of Southern Denmark in Odense. If a robot is autonomous and capable of learning, can its designer be held responsible for all of its actions? Today, the answer to most such questions is yes. "But as robots become more complex, the answers will become far less clear-cut," says Hallam.
"Currently, no insurance company is prepared to insure robots," said Dr. Inoue. "However, this will have to change," he added. Last month, Japan's Ministry of Trade and Industry issued a set of safety guidelines for robots operating in homes and offices. It will require that each robot have sensors to avoid collisions with people, that all robots be made of soft and lightweight materials to minimize potential damage, and that they have buttons to turn them off in case of an emergency. "This was largely due to a large robot exhibition organized by the authorities in the summer of 2005; the authorities realized there was a danger when thousands of people not only observed robots but also got between them," said Dr. Inoue.However, Mr. Angle suggests that the idea of generalized application and widespread use of robots capable of learning is misplaced. He believes it's likely that robots will be relatively dumb machines designed for specific tasks. Rather than a robot maid, we'll see a "heterogeneous swarm of housekeeping robots," Angle says.
"Probably the most controversial area of robotics will be the creation of erotic robot toys," said Dr. Christensen. "People in five years will want to have sex with robots. Initially, these robots will be quite basic, but it's unlikely they will replace humans," he added. "People can have sex with inflatable dolls; initially, something that moves will be an improvement. To some, it might seem like harmless fun, but without some kind of regulation, it seems only a matter of time before someone starts selling robotic, gendered dolls that resemble children." This is dangerous ground. Convicted pedophiles might suggest that such robots could be used as a form of therapy if others objected to the idea that it could satisfy an extremely dangerous fantasy.
All of this raises other questions. Beyond the physical dangers, could robots also harm humans in less direct ways, by bringing out their worst traits, from war-making to pedophilia? Ron Arkin, a roboticist at the Georgia Institute of Technology in Atlanta, asks another: "If you kick a robotic dog, would you kick a real one?" Roboticists can do everything in their power to make robots safe, but they cannot reprogram the behavior of their human owners.