‘Kalashnikovs of tomorrow’: Hawking, Musk, Wozniak warn of global AI arms race

Weaponized artificial intelligence would be a huge mistake, tech gurus and scientific geniuses warn in a newly released open letter. Professor Stephen Hawking, Apple co-founder Steve Wozniak and Tesla CEO Elon Musk are among its 1,000 signatories.

The luminaries believe that for the last 20 years we have been dangerously preoccupied with pushing toward autonomous AI that can make its own decisions.

“Artificial Intelligence (AI) technology has reached a point where the deployment of [autonomous] systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms,” the letter reads.

The danger lies in the combination of capabilities the machines have been gaining over the years, including integration, learning, processing power and mastery of a number of component tasks, as outlined in a previous letter published by the Future of Life Institute.

The researchers acknowledge that sending robots into war could produce the positive result of reducing casualties. The flipside, they write, is that it “[lowers] the threshold for going to battle.”

The authors have no doubt that AI weapons development by any nation will result in a “virtually inevitable global arms race,” with autonomous weapons becoming “the Kalashnikovs of tomorrow.”

“Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce,” they write.

Nothing will then stop such weapons from reaching the black market, the researchers argue. “Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group,” they continue.

The message is not that AI is bad in and of itself, but that there is a danger of using it in a way diametrically opposed to how it should be used: for good, for healing and for alleviating human suffering.

The group speaks out against a select few who would “tarnish” a field that, like biology or chemistry, is not focused on using its knowledge to make weapons. Any such misuse could provoke a lasting public backlash against the field, denying it the chance to deliver real benefits in the future, the visionaries write.

Full article at Russia Today; full letter at www.futureoflife.org

 
