The government of South Korea is drawing up a code of ethics to prevent human abuse of robots - and vice versa.
The so-called Robot Ethics Charter will cover standards for robotics users and manufacturers, as well as guidelines on ethical standards to be programmed into robots, South Korea's Ministry of Commerce, Industry and Energy announced last week.
"The move anticipates the day when robots, particularly intelligent service robots, could become a part of daily life as greater technological advancements are made," the ministry said in a statement.
A five-member task force that includes futurists and a science-fiction writer began work on the charter last November.
Gianmarco Veruggio of the School of Robotics in Genoa, Italy, is recognized as a leading authority on roboethics.
"Robotics is a new science with a manifold of applications that can assist humans and solve many, many problems," he said.
"However, as in every field of science and technology, sensitive areas open up, and it is the specific responsibility of the scientists who work in this field to face this new array of social and ethical problems."
Abusing Robots
South Korea boasts one of the world's most high-tech societies.
The country's Ministry of Information and Communication is working on plans to put a robot in every South Korean household by 2020.
The new charter is part of an effort to establish ground rules for human interaction with robots in the future.
"Imagine if some people treat androids as if the machines were their wives," Park Hye-Young of the ministry's robot team told the
AFP news agency.
The main focus of the charter appears to be on dealing with social problems, such as human control over robots and humans becoming addicted to robot interaction.
The document will also deal with legal issues, such as the protection of data acquired by robots and establishing clear identification and traceability of the machines.
Technological advances have introduced new models of human-machine interface that may bring different ethical challenges, said Veruggio, the Italian scientist.
"Think of bio-robotics, of military applications of robotics, of robots in children's rooms," he said.
Laws of Robotics
The South Korean charter, which may include guidelines for the robots themselves, could be seen as a nod to Isaac Asimov's three laws of robotics.
Familiar to many science-fiction fans, the laws were first put forward by the late sci-fi author in his short story "Runaround" in 1942.
The laws state that robots may not injure humans or, through inaction, allow humans to come to harm; robots must obey human orders unless they conflict with the first law; and robots must protect themselves if this does not conflict with the other laws.
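To make that priority ordering concrete, here is a minimal, hypothetical sketch in Python of how the three laws could be checked against a proposed action. The Action fields and the permitted function are illustrative assumptions for this article only; they are not part of the South Korean charter or of any real robot-control software.

```python
# Illustrative sketch only: Asimov's three laws as a strict priority check.
from dataclasses import dataclass


@dataclass
class Action:
    description: str
    harms_human: bool = False        # would carrying it out injure a human?
    allows_human_harm: bool = False  # would it let a human come to harm through inaction?
    ordered_by_human: bool = False   # was it commanded by a human?
    endangers_self: bool = False     # would it damage the robot itself?


def permitted(action: Action) -> bool:
    """Evaluate an action against the three laws, highest-priority law first."""
    # First Law: never injure a human or, through inaction, allow harm.
    if action.harms_human or action.allows_human_harm:
        return False
    # Second Law: obey human orders unless they conflict with the First Law.
    if action.ordered_by_human:
        return True
    # Third Law: protect itself, as long as that does not conflict with the first two.
    return not action.endangers_self


if __name__ == "__main__":
    print(permitted(Action("fetch coffee", ordered_by_human=True)))   # True
    print(permitted(Action("push a person", harms_human=True,
                           ordered_by_human=True)))                   # False: First Law overrides the order
```

Even in this toy form, the hard part is hidden in the flags themselves: deciding whether an action "harms a human" is exactly the judgment robots cannot yet make, which is the point the researchers quoted below go on to raise.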
Robot researchers, however, say that Asimov's laws - and the South Korean charter - belong in the realm of science fiction and are not yet applicable to their field.
"While I applaud the Korean effort to establish a robot ethics charter, I fear it might be premature to use Asimov's laws as a starter," said Mark Tilden, the designer of RoboSapien, a toylike robot.
"From experience, the problem is that giving robots morals is like teaching an ant to yodel. We're not there yet, and as many of Asimov's stories show, the conundrums robots and humans would face would result in more tragedy than utility," said Tilden, who works for Wow Wee Toys in Hong Kong.
Hiroshi Ishiguro of Osaka University is the co-creator of Repliee Q1 and Q2, two female androids.
He says it's too soon to dictate the future of robot ethics based on Asimov's laws.
"If we have a more intelligent vehicle, who takes the responsibility when it has an accident?" he said.
"We can ask the same question of a robot. Robots do not have human-level intelligence. It is rather similar to a vehicle today."