Do we really understand the long-term consequences of states' new focus on autonomous weapons and the use of drones?
The US drone that was shot down last month after being sent into Iranian territory is now being exploited politically by the Americans, as they otherwise propel the world's military-industrial frenzy. At the time of writing, the United States has made the world a more dangerous place. It has now succeeded in provoking Iran into launching uranium enrichment for nuclear weapons. As it stands, the globe can today be wiped out in 7–8 minutes. (See also our two articles about Doomsday.)
But this is not the point of my title, "Out of control". In a public mental climate where people must constantly feel offended and indignant – feelings the mass media do their psychological best to stoke – one is forever looking for enemies. And with enough enemies or "terrorists" around the next corner, it becomes legitimate to spend an incredible 16 billion kroner a year on keeping the military industry going. In such a climate, new weapon systems such as unmanned drones are also being developed.
The days of hostile spies swimming ashore from submarines at night, for example, will soon be over. Drones are taking over. The US Navy has developed Heterogeneous Collaborative Unmanned Systems (HCUS), in which one or more encapsulated payloads are released from submarines. Or shoals of underwater drones are sent out to lie waiting for orders – ready to acoustically identify enemy warships and submarines.
As The Economist noted on June 22, drones – perhaps the next ones over Iran – will be able to drop a number of solar-powered sensors, disguised as rocks, which from different positions send back camera and microphone data, or intercept radio traffic and signals in the area.
It is also expected that drones can be built to resupply themselves with fuel and weapons – and then repeat their missions. Swarms of drones should likewise be able to cooperate and adapt to changing situations – in the battle between machine and human.
We already know how unmanned armed drones (such as BlackWing) can annihilate vehicles, local weapons arsenals or designated individuals from the air. Unlike living spies, who can be captured on hostile territory, drones are cheap "consumables" whose sender is difficult to trace.
Prohibition. Researchers, businesses and politicians have recently signed a campaign against the development of lethal autonomous weapons – so-called killer robots. Toby Walsh (professor at the University of New South Wales), who visited OsloMet in June, told us that 4,500 active researchers in artificial intelligence and robotics have signed – along with almost 30,000 others. And OsloMet's vice-rector Morten Irgens said that here in Norway academics have already signed, from university directors to researchers in the field. The hope is that just as chemical weapons, landmines and cluster munitions were banned, a ban on lethal autonomous weapons could be next.
As UN Secretary-General António Guterres has pointed out, artificial intelligence could trigger a new arms race – out of control: "Machines with the power and discretion to take human lives are politically unacceptable, morally reprehensible and should be banned by international law." At the UN, 28 countries, including Austria, Brazil and Chile, have taken the lead in promoting a ban – which, typically, has been blocked by the United States, Israel and Russia, among others.
At the same time, some believe that machines fighting efficiently could be better than humans driven by aggression and fear. And what if we created algorithms bound by the Geneva Convention on Inhumane Weapons and other laws, so that such warfare could spare civilian lives? More control? Indeed, why not let "killer robots" wage war on one another? Or perhaps let enemies play out their wars on the chessboard?
But, as the seminar concluded, that is not how the world's leaders wage war. Still, Walsh hopes that Oslo can lead the way once more – as when we launched the campaign against cluster munitions in 2007.
Military analyst Cecilie Hellestveit, also at OsloMet, doubted whether a ban is as workable here as it was for cluster munitions. She is a member of the Ethics Council of the Norwegian Petroleum Fund, which assesses the Fund's investments – and which of its holdings in the military-industrial complex Norway should withdraw from.
No, artificial intelligence cannot easily be banned, so further assessments are needed. When I asked Walsh whether he really believed that states with military power interests would listen to such a ban from below, he pointed to the Google employees who protested, with civil society as a role model. They stopped Google's Pentagon project. And if you look, you will also find that Google has introduced "ethical rules" for artificial intelligence: in addition to serving socially beneficial purposes and respecting privacy, Google will not work on weapons, "technologies that cause harm or violate international law and human rights", or "surveillance technology that violates international norms".
But what if the military's new weapons one day come back at us – from non-state actors? Autonomous weapons programmed to carry out irreversible actions will eventually end up in the hands of terrorists or militant groups, out of control. What comes from governments and the military will also become a means of "striking back".
Twenty-five years ago, the IRA attacked Heathrow airport with bombs on three separate days. No one died, although the airport was closed for a few hours. Last month, Extinction Rebellion (XR) threatened the same airport, announcing that it would send in cheap 1,000-kroner drones to halt air traffic – at an airport where 1,300 planes take off daily, carrying 220,000 passengers. The action was a protest against the construction of a third runway. The day before the planned action date in June, they called it off – but warn that it may come again later. Dangerous? XR stated that the drones would only fly a few meters above the ground within the five-kilometer zone…
Can an airport really protect itself from such civil disobedience and intended sabotage, which could also cost innocent human lives? Today, however, there are over 200 anti-drone systems for detecting and tracking such craft. They use advanced radio jamming, electronic drone hijacking or net capture – and even attack birds such as hawks and eagles.
And other civil actions? Drones can be equipped for more innocent purposes such as spraying graffiti or fighting fires, but also with mounted handguns. In Venezuela, someone tried to kill the president with one. And according to The Economist (June 15), an activist landed a drone carrying radioactive material on the Japanese prime minister's property – and it went undiscovered for a couple of weeks.
Fully self-directed autonomous weapons will take time to develop – so we still have some time before they spread. If we care about a ban.