
New Role for Robot Warriors

Alex Constantine - February 28, 2010

Drones Are Just Part of Bid to Automate Combat

By GREGORY M. LAMB
ABC News | Feb. 21, 2010

Science fiction sometimes depicts robot soldiers as killing machines without conscience or remorse. But at least one robotics expert today says that someday machines may make the best and most humane decisions on the battlefield.

[Photo caption: Insurgents in Iraq have used a $26 piece of equipment to hack U.S. drones.]

Guided by virtual emotions, robots could not only make better decisions about their own actions but also act as ethical advisers to human soldiers, or even as observers who report back on whether soldiers' battlefield conduct followed international law.

As militaries around the world invest billions in robotic weapons, no fundamental barriers lie ahead to building machines that "can outperform human soldiers in the battlefield from an ethical perspective," says Ronald Arkin, associate dean at the School of Interactive Computing at Georgia Institute of Technology in Atlanta. The result would be a reduction in casualties among both soldiers and civilians, he says.

Virtual Ethics Based on Concept of Guilt

Dr. Arkin has begun work on an ethical system for robots based on the concept of "guilt." As a robot makes decisions, such as whether to fire its weapons and what type of weapon to use, it would constantly assess the results and learn.

If the robot established that its weapons caused unnecessary damage or casualties, it would scale back its use of weapons in a future encounter. If the robot repeatedly used excessive force, it would shut down its weapons altogether – though it could continue to perform its other duties such as reconnaissance.

"That's what guilt does in people, too, at least in principle," Arkin says. "Guilty people change their behavior in response to their actions."

Though "Terminator"-style warriors will likely remain fictional long into the future, thousands of military robots are already operating on land, sea, and in the air, many of them capable of firing lethal weapons. They include missile-firing Predator and Reaper aircraft used by the American military in Iraq and Afghanistan, remotely controlled by human soldiers.

Some Skeptical About Ethical Robots

Naval ships from several nations employ Phalanx gun systems (sometimes called "R2D2s" on American ships, referring to the robot from "Star Wars"), capable of shooting down incoming planes or missiles without command or targeting from a human.

South Korea has deployed armed robotic systems along its demilitarized zone with North Korea. The Israeli army patrols its borders with Gaza and Lebanon with roving unmanned ground vehicles.

Among systems being developed for the future by the US military are the Vulture, a pilotless helicopter that could stay aloft for up to 20 hours, and an unmanned ground combat vehicle.

But Arkin's sunny forecast for the future of ethical robot warriors has met with deep skepticism among some in his field.

New laws will be needed to constrain the use of autonomous robot weapons (those that operate without a human controller) on the battlefield, argues Noel Sharkey, a professor of artificial intelligence and robotics at Britain's University of Sheffield, in an essay last year.

Artificial Intelligence Expert: Can Robots Distinguish Between Friend and Foe?

He sees two huge ethical hurdles in the way. One is the unproven ability of robotic weapons to discriminate between friend and foe and to avoid hitting civilians and other noncombatants. The other is whether a computer can judge "proportionate" force: enough to gain a military advantage while minimizing civilian casualties.

Both these concepts are seen as essential to waging ethical warfare.
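One way to see why these hurdles are steep is to write them down as preconditions: both reduce to functions nobody currently knows how to implement. The sketch below is illustrative only; the names and signatures are assumptions, and the unimplemented bodies are the point:

```python
# Sharkey's two hurdles, expressed as preconditions for lethal force.
# Both functions are deliberately unimplemented: producing reliable
# answers is exactly the open problem he describes.

def is_combatant(target) -> bool:
    """Discrimination: tell fighters apart from civilians and noncombatants."""
    raise NotImplementedError("no perception system can make this call reliably")

def is_proportionate(military_advantage: float, expected_civilian_harm: float) -> bool:
    """Proportionality: force justified by the advantage gained."""
    raise NotImplementedError("no agreed way to quantify either side of the balance")

def lethal_force_permitted(target, advantage: float, harm: float) -> bool:
    # Both checks must pass before lethal force is even considered.
    return is_combatant(target) and is_proportionate(advantage, harm)
```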

"Humans understand one another in a way that machines cannot," Dr. Sharkey writes. "Cues can be very subtle and there are an infinite number of circumstances where lethal force is inappropriate." International talks on the use of autonomous robots in warfare "are needed urgently," he says.

What experts do agree on is that today's robots fall far short of the kind of artificial intelligence (AI) they need to operate effectively on their own, let alone act ethically.

"When it comes to high-stakes decisions that hinge on understanding subtle human preferences and intentions, we still have a bunch of research to do at the level of basic science to give AI systems the same common sense that people have," says Eric Horvitz, principal researcher at Microsoft Research and immediate past president of the Association for the Advancement of Artificial Intelligence.

Limitations Could Be Placed on Military Robots

The AAAI plans to issue a report later this year on the challenges and opportunities presented by the growing relationship between humans and AI.

Arkin's work is of great interest to Patrick Lin, director of the Ethics and Emerging Sciences Group at California Polytechnic State University in San Luis Obispo. Arkin's group is working on software for military robots that would build in ethical behavior.

"We're sending machines to do our dirty work," says Dr. Lin, who is also an ethics fellow at the US Naval Academy and is working on a book on robotic ethics. The military should adopt a slow "crawl, walk, run philosophy" to introduce autonomous robots to the battlefield, he says.

Many kinds of limits could be put on military robots, Lin says. For example, their use could be confined to a "kill box," an area known to contain only enemy troops. They could be authorized to fire only on opposing robots or other nonhuman targets. Or they might carry only nonlethal weapons, such as water cannons or devices that only stun.
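Such limits lend themselves to simple, auditable rules. Here is a hedged sketch (the record fields, coordinates, and weapon names are all hypothetical) of Lin's proposals rendered as a declarative policy check:

```python
# Illustrative policy check for the limits Lin suggests: a geographic
# "kill box", nonhuman targets only, and nonlethal weapons only.
from dataclasses import dataclass

@dataclass
class Engagement:
    position: tuple[float, float]  # robot's location (x, y), hypothetical units
    target_type: str               # e.g. "robot", "vehicle", "person"
    weapon: str                    # e.g. "water_cannon", "stun", "missile"

KILL_BOX = ((0.0, 0.0), (10.0, 10.0))         # area known to hold only enemy troops
NONHUMAN_TARGETS = {"robot", "vehicle"}       # opposing robots or materiel only
NONLETHAL_WEAPONS = {"water_cannon", "stun"}  # no lethal payloads carried

def permitted(e: Engagement) -> bool:
    (x0, y0), (x1, y1) = KILL_BOX
    inside = x0 <= e.position[0] <= x1 and y0 <= e.position[1] <= y1
    return (inside
            and e.target_type in NONHUMAN_TARGETS
            and e.weapon in NONLETHAL_WEAPONS)
```

An engagement failing any one of the three tests would simply be refused, which is what would make a "crawl, walk, run" rollout auditable.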

Right now, Arkin argues, the discussion about the use of robots in war is in its infancy. It reminds him of the attitudes people held before the Wright brothers flew the first airplane. At that time, some wondered if humans were intended to fly at all or whether flying could ever become a safe thing to do.

Robots don't need to have a perfect ethical record to win a place on the battlefield, Arkin argues. They only have to behave better than humans, who have a deeply flawed ethical record in warfare.

"This is not about robot armies out to kill us," Arkin says. "This is actually about reducing noncombatant fatalities."

http://abcnews.go.com/Technology/role-robot-warriors/story?id=9889008&page=1
