Stephen Hawking
Author, physicist and cosmologist Stephen Hawking co-wrote a dire warning to the human race about the danger of unintended consequences from our current fascination with artificial intelligence (AI).

In an open letter co-written with three other scientists and published in The Independent, Hawking contended that dismissing "the notion of highly intelligent machines as science fiction" could be "our worst mistake in history."

Consumers use AI each day in the form of the iPhone's personal assistant Siri, Google Now and other programs.

Breakthroughs such as self-driving cars herald a coming generation of products and services as computers become ever more adept at solving problems quickly.

"The potential benefits are huge," wrote Hawking. "Everything that civilization has to offer is a product of human intelligence; we cannot predict what we might achieve when this intelligence is magnified by the tools that AI may provide, but the eradication of war, disease, and poverty would be high on anyone's list. Success in creating AI would be the biggest event in human history."

However, the letter warned, these developments do not come without risks and dangers. Already, defense firms are exploring the use of fully autonomous weapons that are sent into the field to track and kill specific targets.

"Looking further ahead, there are no fundamental limits to what can be achieved: there is no physical law precluding particles from being organized in ways that perform even more advanced computations than the arrangements of particles in human brains," said Hawking, meaning that our creations could learn to think, program and modify themselves faster and faster, making possible an "explosive transition" in which "machines with superhuman intelligence could repeatedly improve their design even further, triggering what Vernor Vinge called a 'singularity' and Johnny Depp's movie character calls 'transcendence.'"

Whoever owns such machines would have the potential for "outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand."

"Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues," warned Hawking. "All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks."