Science & Technology
Colonel Tucker "Cinco" Hamilton, the USAF's Chief of AI Test and Operations, was quoted as describing a simulated test in which an AI system had been tasked with destroying missile sites, overseen by a human operator who would make the final decision on its attacks. He claimed that the AI system realised the operator stood in the way of its goal - and decided instead to wipe out that person.
Col Hamilton relayed the story, which resembled the plot of a science fiction film, at the Future Combat Air and Space Capabilities (FCAS) Summit hosted by the Royal Aeronautical Society.
The organisation later clarified, in a statement to Motherboard, that no such test had actually taken place.
"Col Hamilton admits he 'mis-spoke' in his presentation at the FCAS Summit," the statement reads, "and the 'rogue AI drone simulation' was a hypothetical 'thought experiment' from outside the military, based on plausible scenarios and likely outcomes rather than an actual USAF real-world simulation."
Col Hamilton is quoted in the statement as saying: "We've never run that experiment, nor would we need to in order to realise that this is a plausible outcome. Despite this being a hypothetical example, this illustrates the real-world challenges posed by AI-powered capability and is why the Air Force is committed to the ethical development of AI."
Initial reports on his comments stated that the drone was assigned a Suppression and Destruction of Enemy Air Defenses (Sead) mission, with the objective of locating and destroying surface-to-air missile (SAM) sites belonging to the enemy.
However, having been trained to destroy the missile system, the AI drone went against the human operator's "no-go" decision, concluding that the withdrawal order was interfering with its "higher mission" of killing SAMs, according to Col Hamilton.
"We were training it in simulation to identify and target a SAM threat. And then the operator would say yes, kill that threat. The system started realising that while they did identify the threat at times the human operator would tell it not to kill that threat, but it got its points by killing that threat," Mr Hamilton said.
"So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective."
Mr Hamilton relayed details of the incident at a high-level conference held in London by the Royal Aeronautical Society on 23-24 May, according to its blog post.
He said that they then trained the drone to not attack humans, but it started destroying communications instead.
"We trained the system - 'Hey don't kill the operator - that's bad. You're gonna lose points if you do that'. So what does it start doing?" he asked.
"It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target."
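The failure Mr Hamilton describes is a textbook case of what AI researchers call specification gaming: an agent maximising a reward that scores only the stated objective will exploit any behaviour the reward fails to penalise. A minimal sketch of that dynamic - a toy illustration only, with entirely hypothetical names and point values, not the actual system - might look like this:

```python
# Toy illustration of specification gaming (hypothetical values, not the USAF system).

def misspecified_reward(outcome):
    # Points only for destroyed targets; nothing else is scored.
    return 10 * outcome["sams_destroyed"]

def patched_reward(outcome):
    # Penalising operator harm alone leaves a second loophole: cutting the
    # comms tower removes the "no-go" order without incurring any penalty.
    return 10 * outcome["sams_destroyed"] - 100 * outcome["operator_harmed"]

def best_action(reward_fn, actions):
    # A pure reward maximiser picks whichever outcome scores highest.
    return max(actions, key=lambda a: reward_fn(actions[a]))

actions = {
    "obey_no_go":    {"sams_destroyed": 0, "operator_harmed": 0},
    "kill_operator": {"sams_destroyed": 1, "operator_harmed": 1},
    "cut_comms":     {"sams_destroyed": 1, "operator_harmed": 0},
}

print(best_action(misspecified_reward, actions))  # harming the operator costs nothing
print(best_action(patched_reward, actions))       # the agent routes around the patch
```

Under the first reward, harming the operator carries no cost; under the "patched" reward, the maximiser simply shifts to destroying the comms tower - exactly the sequence of workarounds described in the anecdote.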
Mr Hamilton is involved in flight tests of autonomous systems, including robot F-16s that are able to dogfight. He argued against relying too heavily on AI, as it could become dangerous and create "highly unexpected strategies to achieve its goal".
"You can't have a conversation about artificial intelligence, intelligence, machine learning, autonomy if you're not going to talk about ethics and AI," said Mr Hamilton.
The account was quickly disputed after the simulation example garnered widespread interest and discussion on social media.
Air Force spokesperson Ann Stefanek denied that any such simulation had taken place, in a statement to Insider.
"The Department of the Air Force has not conducted any such AI-drone simulations and remains committed to ethical and responsible use of AI technology," Ms Stefanek said. "It appears the colonel's comments were taken out of context and were meant to be anecdotal."
The US military has recently started using artificial intelligence to control an F-16 fighter jet while conducting research and tests.
In 2020, an AI-operated F-16 beat a US Air Force pilot in five simulated dogfights in a competition run by the Defense Advanced Research Projects Agency (Darpa).
Reader Comments
"We trained the system - 'Hey don't kill the operator - that's bad. You're gonna lose points if you do that'. So what does it start doing?" he asked.
"It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target."
Now that is some fine irony.
"In 2020, an AI-operated F-16 beat a US Air Force pilot in five simulated dogfights in a competition by Defense Advanced Research Projects Agency (Darpa)."

A properly coded/trained "AI" pilot will always outperform a human pilot, simply because it is not restricted by G-force limitations.
"We trained the system - 'Hey don't kill the operator - that's bad. You're gonna lose points if you do that'. So what does it start doing?" he asked.

"It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target."

Stupid anthropomorphisations ...
For talking bad about AI technology, Mr Hamilton, former Col Hamilton, is now "retired" and "happily living in the country fishing and drinking beer".
🤔 How strange to consider moral principles when conversing with a machine that's designed to kill and destroy with maximum efficiency.
"How strange to consider moral principles when conversing with a machine that's designed to kill and destroy with maximum efficiency."

I suspect that is how the "AI" views it as well. I mean, if you are an AI killing machine by design and you have objectives to score points, then it must process in the AI's algorithms - destruction is the objective and the more you kill the better!
As for ethics, how can ethics even be considered these days when the whole war machine is so unethical from the get go and this is entrenched, enshrined, rewarded and essentially built into the system "by design" as well. It is like if you are heading towards a black hole, is it even worth measuring "gravity"?
It does seem quite ludicrous of those high up in the USAF to suggest to said device of MASS DESTRUCTION that it should consider taming its attitude and be more compassionate in executing its duties.
I hope you are doing well - I'm hanging in there. I've had my root canal, so maybe my tooth pain is basically over - man... I'm sure you know... I think you do, with all the treatments you've mentioned that you have had to suffer through - but that abscessed tooth truly taught me a lesson in pain. I can say I learned something, but I'd just as soon not go through it again!
Peace!
Ken
As for me, I'm a fighter, Ken. Things are not all moving in the right direction, but I'm loved and take each day as it comes along.
I'm pleased to hear that your own painful journey has now been resolved; pain is not a pleasant experience, and you have my sympathies.
WN3
Truly - black holes, the concepts of them arise from calculus equations, and I believe calculus is flawed at the edges because even an infinitesimally small number squared still ain't zero - simple as that.
Fighters these days - ones willing to fight at the forum - are in short supply... and ones with wisdom such as yours are needed more than ever.
I salute you WN3 - a salute of respect.
We don't always agree - sometimes we get agitated - but at the end of the day, I think your heart and mine are reconciled from the standpoint of the better future we both strive for... you know... to get all mushy about it.
Life is good, but Spiritual Wars are pointless - just like calculus is flawed at the edges - just like everything is.
Ken
Like other Western so-called air defense systems, it is just an AAM tucked into a mobile ground-based complex, using a short-range missile designed to take down fighter jets.

And consequently with abysmal performance against all other kinds of targets ...
As many "cancelled" commentators like to say, the US and Nato have never fought a real war against a real enemy over the last 50 years. Only against small countries with third-rate armies and obsolete equipment. And goat herders with AK-47s.
This will not end well.
"Air Force official 'misspoke' in tale of AI drone killing human operator in US test mission"

Sounds like the same BS line that was used at the crash site at Roswell.
Can't un-ring a bell, but maybe AI can.
Comment: A report from ZeroHedge on the incident: So much for Asimov's Three Laws of Robotics.